WorldWideScience

Sample records for chemical analysis automation

  1. Development of a robotics system for automated chemical analysis of sediments, sludges, and soils

    International Nuclear Information System (INIS)

    McGrail, B.P.; Dodson, M.G.; Skorpik, J.R.; Strachan, D.M.; Barich, J.J.

    1989-01-01

    Adaptation and use of a high-reliability robot to conduct a standard laboratory procedure for soil chemical analysis are reported. Results from a blind comparative test were used to obtain a quantitative measure of the improvement in precision achievable with the automated test method. Results from the automated chemical analysis procedure were compared with values obtained from an EPA-certified laboratory and with results from a more extensive interlaboratory round robin conducted by the EPA. For several elements, up to a fivefold improvement in precision was obtained with the automated test method.
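
    As a rough illustration of how such a precision comparison can be quantified, here is a minimal sketch (with hypothetical replicate results, not data from the paper) expressing the fold improvement as the ratio of relative standard deviations:

```python
# Sketch: quantifying a fold-improvement in precision between a manual and
# an automated method from replicate measurements. Numbers are illustrative.
import statistics

def rsd(values):
    """Relative standard deviation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate results (mg/kg) for one element:
manual = [41.2, 47.8, 39.5, 52.1, 44.0]
automated = [44.9, 45.6, 44.1, 45.2, 44.7]

print(f"manual RSD = {rsd(manual):.1f}%, automated RSD = {rsd(automated):.1f}%")
print(f"fold improvement in precision: {rsd(manual) / rsd(automated):.1f}x")
```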

  2. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals, or quantum dots (QDs), are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding important new fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by means of continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to exploit particular features of the nanocrystals: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, which opens the way to renewable chemosensors and to the use of more drastic, even stability-impairing, reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  3. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of polychlorinated biphenyls (PCB) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, column clean-up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME is described along with the developmental test plan. Performance data obtained during developmental testing are also discussed.

  4. Chemical composition dispersion in bi-metallic nanoparticles: semi-automated analysis using HAADF-STEM

    International Nuclear Information System (INIS)

    Epicier, T.; Sato, K.; Tournus, F.; Konno, T.

    2012-01-01

    We present a method using high-angle annular dark field scanning transmission electron microscopy (HAADF-STEM) to determine the chemical composition of bi-metallic nanoparticles. This method, which can be applied in a semi-automated way, allows large-scale analysis with a statistical number of particles (several hundred) in a short time. Once a calibration curve has been obtained, e.g., using energy-dispersive X-ray spectroscopy (EDX) measurements on a few particles, the HAADF integrated intensity of each particle can be directly related to its chemical composition. After a theoretical description, this approach is applied to the case of iron–palladium nanoparticles (expected to be nearly stoichiometric) with a mean size of 8.3 nm. An accurate chemical composition histogram is obtained: the Fe content is determined to be 49.0 at.% with a dispersion of 10.4%. HAADF-STEM analysis represents a powerful alternative to tedious single-particle EDX measurements for determining the compositional dispersion in alloy nanoparticles.
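
    A minimal sketch of the calibration idea described above, with illustrative numbers rather than the authors' data: the integrated HAADF intensity of each particle, normalized by its volume, is fitted against the EDX-measured composition of a few reference particles, and the fitted curve is then applied to the whole population:

```python
# Sketch of HAADF intensity -> composition calibration. All values are
# hypothetical; the real method is described in the paper above.
import numpy as np

# EDX-calibrated reference particles: diameter (nm), integrated HAADF
# intensity (a.u.), and Fe content (at.%).
ref_d = np.array([7.9, 8.2, 8.5, 8.8])
ref_I = np.array([2.07e5, 2.37e5, 2.69e5, 3.07e5])
ref_fe = np.array([52.0, 50.1, 48.5, 46.0])

# Normalize intensity by particle volume (~d^3) so that composition is the
# remaining variable, then fit a linear calibration curve.
coeffs = np.polyfit(ref_I / ref_d**3, ref_fe, 1)

# Apply the calibration to the full (semi-automatically measured) population.
d = np.array([8.1, 8.4, 7.6, 9.0])              # hypothetical diameters
I = np.array([2.30e5, 2.55e5, 1.95e5, 3.20e5])  # hypothetical intensities
fe_content = np.polyval(coeffs, I / d**3)
print(f"Fe content: mean {fe_content.mean():.1f} at.%, "
      f"std {fe_content.std(ddof=1):.1f} at.%")
```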

  5. Advantages of an Automated Chemical Processor for XAFS Analysis of Novel Materials

    International Nuclear Information System (INIS)

    Calvin, S.; Carpenter, E. E.

    2007-01-01

    For synthetic chemists and materials scientists, XAFS analysis is typically used after promising samples have already been identified. This is not an inherent limitation of the technique, which can be applied in a crude fashion quite rapidly, but is rather an artifact of the separation of laboratory from synchrotron and of the nature of the allocation of beam time. One way around these difficulties is to use automated chemical processors, preferably with identical machines at the home institution and at the beamline. This allows identical samples to be synthesized on demand in both locations, so that the characterization resources of the home institution's laboratory can be applied immediately to samples synthesized at the beamline. In addition, the processor can be run in a combinatorial mode, so that the XAFS of many possible synthesis results can be examined in a short time. Finally, spectra possessing features of interest can be identified by software, and those syntheses singled out for further study.

  6. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner.

  7. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  8. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. Demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and delivers the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, the management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and to the system status is presented to the analyst via graphical icons.

  9. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations are given on the selection of activation analysis techniques, especially those employing short-lived isotopes. The possibilities of increasing data-channel throughput by using modern computers to automate the analysis and data-processing procedures are shown.

  10. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions...... for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...... of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of “surface-level” monodic voices, longer...

  11. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-01-01

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its task is complete. A standard set of SLM commands and events has been established to support these transport and processing operations. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot, and the Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and the required material-port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a VMEbus rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  12. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that the system can detect the misbehaving parties who caused that failure. Accountability is an intuitively stronger property than verifiability as the latter only rests on the possibility of detecting the failure of a goal. A plethora of accountability and verifiability definitions have been proposed...... in the literature. Those definitions are either very specific to the protocols in question, hence not applicable in other scenarios, or too general and widely applicable but requiring complicated and hard to follow manual proofs. In this paper, we advance formal definitions of verifiability and accountability...... that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...

  13. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  14. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2005-01-01

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  15. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day

  16. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers belong to the electric power system equipment whose reliability influences, to a great extent, the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to clear a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil and air-blast circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. This, however, demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, failures, testing and repair, as well as advanced software and a specialized automated information system (AIS). A new AIS, with the AISV logo, was developed at the "Reliability of Power Equipment" department of AzRDSI of Energy. The main features of AISV are: to provide database security and accuracy; to carry out systematic control of breaker conformity with operating conditions; to estimate individual reliability values and the characteristics of their change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for their realization.

  17. Judson_Mansouri_Automated_Chemical_Curation_QSAREnvRes_Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — Here we describe the development of an automated KNIME workflow to curate and correct errors in the structure and identity of chemicals using the publicly...

  18. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible....

  19. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  20. Automated analysis of gastric emptying

    International Nuclear Information System (INIS)

    Abutaleb, A.; Frey, D.; Spicer, K.; Spivey, M.; Buckles, D.

    1986-01-01

    The authors devised a novel method to automate the analysis of nuclear gastric emptying studies. Many previous methods have been used to measure gastric emptying, but they are cumbersome and require continuing operator intervention. Two specific problems are patient movement between images and changes in the location of the radioactive material within the stomach. Their method can be used with either dual- or single-phase studies. For dual-phase studies the authors use In-111-labeled water and Tc-99m sulfur colloid (SC)-labeled scrambled eggs. For single-phase studies either the liquid- or the solid-phase material is used.
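
    For illustration, a minimal sketch (with hypothetical counts and a simple monoexponential model, not the authors' algorithm) of the kind of computation such an automated analysis performs: decay-correct the stomach-ROI counts, normalize to time zero, and estimate the emptying half-time:

```python
# Sketch of automated gastric-emptying quantification. Data are illustrative.
import math

TC99M_HALF_LIFE_H = 6.01  # physical half-life of Tc-99m (solid-phase tracer)

times_h = [0.0, 0.5, 1.0, 1.5, 2.0]           # imaging times after the meal
roi_counts = [10000, 8200, 6600, 5500, 4400]  # hypothetical stomach-ROI counts

# Correct each measurement for physical decay back to time zero.
corrected = [c * math.exp(math.log(2) * t / TC99M_HALF_LIFE_H)
             for c, t in zip(roi_counts, times_h)]
retention = [100.0 * c / corrected[0] for c in corrected]

# Emptying half-time from a log-linear least-squares fit (monoexponential).
xs = times_h[1:]
ys = [math.log(r) for r in retention[1:]]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print("retention (%):", [round(r, 1) for r in retention])
print(f"estimated emptying half-time: {math.log(2) / -slope:.2f} h")
```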

  1. Chemical Security Analysis Center

    Data.gov (United States)

    Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...

  2. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies...

  3. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    Directory of Open Access Journals (Sweden)

    Ronan Bureau

    2013-02-01

    Full Text Available This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.
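
    A minimal sketch of the Emerging Pattern idea highlighted above, with illustrative fragments and counts: a fragment is flagged as a candidate structural alert when its frequency in the toxic dataset exceeds its frequency in the non-toxic dataset by more than a chosen growth-rate threshold:

```python
# Sketch of emerging-pattern detection for structural alerts.
# Fragment names and counts are illustrative, not from the review.

def growth_rate(freq_toxic, freq_nontoxic):
    if freq_nontoxic == 0:
        return float("inf")  # fragment absent from the background dataset
    return freq_toxic / freq_nontoxic

# fragment -> (occurrences in 100 toxic compounds, in 100 non-toxic compounds)
fragment_counts = {
    "nitroaromatic": (24, 3),
    "epoxide": (15, 5),
    "carboxylic_acid": (30, 28),
}

THRESHOLD = 2.0  # minimum growth rate to call a fragment a structural alert
for fragment, (n_tox, n_nontox) in fragment_counts.items():
    gr = growth_rate(n_tox / 100, n_nontox / 100)
    flag = "ALERT" if gr >= THRESHOLD else "not discriminating"
    print(f"{fragment}: growth rate = {gr:.1f} -> {flag}")
```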

  4. AUTOMATED DETECTION OF STRUCTURAL ALERTS (CHEMICAL FRAGMENTS) IN (ECO)TOXICOLOGY

    Directory of Open Access Journals (Sweden)

    Alban Lepailleur

    2013-02-01

    Full Text Available This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.

  5. Automated analysis of instructional text

    Energy Technology Data Exchange (ETDEWEB)

    Norton, L.M.

    1983-05-01

    The development of a capability for automated processing of natural-language text is a long-range goal of artificial intelligence. This paper discusses an investigation into the issues involved in the comprehension of descriptive, as opposed to illustrative, textual material. The comprehension process is viewed as the conversion of knowledge from one representation into another. The proposed target representation consists of statements of the Prolog language, which can be interpreted both declaratively and procedurally, much like production rules. A computer program has been written to model in detail some ideas about this process. The program successfully analyzes several heavily edited paragraphs adapted from an elementary textbook on programming, automatically synthesizing as a result of the analysis a working Prolog program which, when executed, can parse and interpret LET commands in the BASIC language. The paper discusses the motivations and philosophy of the project, the many kinds of prerequisite knowledge which are necessary, and the structure of the text analysis program. A sentence-by-sentence account of the analysis of the sample text is presented, describing the syntactic and semantic processing involved. The paper closes with a discussion of lessons learned from the project, possible alternative approaches, and possible extensions for future work. The entire project is presented as illustrative of the nature and complexity of the text analysis process, rather than as providing definitive or optimal solutions to any aspects of the task. 12 references.

  6. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background: To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results: This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface (and the algorithm parameters can be readily changed) to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion: The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links...

  7. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  8. Evaluation of an automated karyotyping system for chromosome aberration analysis

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1987-01-01

    Chromosome aberration analysis is a promising complement to conventional radiation dosimetry, particularly in the complex radiation fields encountered in the space environment. The capabilities of a recently developed automated karyotyping system were evaluated both to determine current capabilities and limitations and to suggest areas where future development should be emphasized. Cells exposed to radiomimetic chemicals and to photon and particulate radiation were evaluated by manual inspection and by automated karyotyping. It was demonstrated that the evaluated programs were appropriate for image digitization, storage, and transmission. However, automated and semi-automated scoring techniques must be advanced significantly if in-flight chromosome aberration analysis is to become practical. A degree of artificial intelligence may be necessary to realize this goal.

  9. Automated metabolic gas analysis systems: a review.

    Science.gov (United States)

    Macfarlane, D J

    2001-01-01

    The use of automated metabolic gas analysis systems or metabolic measurement carts (MMC) in exercise studies is common throughout the industrialised world. They have become essential tools for diagnosing many hospital patients, especially those with cardiorespiratory disease. Moreover, the measurement of maximal oxygen uptake (VO2max) is routine for many athletes in fitness laboratories and has become a de facto standard in spite of its limitations. The development of metabolic carts has also facilitated the noninvasive determination of the lactate threshold and cardiac output, respiratory gas exchange kinetics, as well as studies of outdoor activities via small portable systems that often use telemetry. Although the fundamental principles behind the measurement of oxygen uptake (VO2) and carbon dioxide production (VCO2) have not changed, the techniques used have, and indeed, some have almost turned through a full circle. Early scientists often employed a manual Douglas bag method together with separate chemical analyses, but the need for faster and more efficient techniques fuelled the development of semi- and fully automated systems by private and commercial institutions. Yet, recently some scientists are returning to the traditional Douglas bag or Tissot-spirometer methods, or are using less complex automated systems, not only to save capital costs but also to have greater control over the measurement process. Over the last 40 years, a considerable number of automated systems have been developed, with over a dozen commercial manufacturers producing in excess of 20 different automated systems. The validity and reliability of all these different systems is not well known, with relatively few independent studies having been published in this area. For comparative studies to be possible and to facilitate greater consistency of measurements in test-retest or longitudinal studies of individuals, further knowledge about the performance characteristics of these...
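
    The core gas-exchange arithmetic behind any such system, whether Douglas bag or automated cart, is the Haldane transformation; a minimal sketch with illustrative values follows (not tied to any particular commercial system):

```python
# Sketch of the Haldane transformation: nitrogen is neither consumed nor
# produced, so inspired volume can be derived from expired volume.
# Input values are illustrative.

FI_O2, FI_CO2 = 0.2093, 0.0004  # ambient inspired gas fractions
FI_N2 = 1.0 - FI_O2 - FI_CO2

def gas_exchange(ve_stpd, fe_o2, fe_co2):
    """Return (VO2, VCO2, RER) from expired ventilation (L/min, STPD)
    and mixed expired O2/CO2 fractions."""
    fe_n2 = 1.0 - fe_o2 - fe_co2
    vi = ve_stpd * fe_n2 / FI_N2        # Haldane transformation
    vo2 = vi * FI_O2 - ve_stpd * fe_o2
    vco2 = ve_stpd * fe_co2 - vi * FI_CO2
    return vo2, vco2, vco2 / vo2

# Hypothetical near-maximal exercise measurement:
vo2, vco2, rer = gas_exchange(ve_stpd=100.0, fe_o2=0.1650, fe_co2=0.0450)
print(f"VO2 = {vo2:.2f} L/min, VCO2 = {vco2:.2f} L/min, RER = {rer:.2f}")
```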

  10. Development of an automated on-line pepsin digestion-liquid chromatography-tandem mass spectrometry configuration for the rapid analysis of protein adducts of chemical warfare agents

    NARCIS (Netherlands)

    Carol-Visser, J.; van der Schans, M.; Fidder, A.; Huist, A.G.; van Baar, B.L.M.; Irth, H.; Noort, D.

    2008-01-01

    Rapid monitoring and retrospective verification are key issues in protection against and non-proliferation of chemical warfare agents (CWA). Such monitoring and verification are adequately accomplished by the analysis of persistent protein adducts of these agents. Liquid chromatography-mass

  11. Chemical analysis as production guide

    International Nuclear Information System (INIS)

    Bouzigues, H.; Fontaine, A.; Patigny, P.

    1975-01-01

    All of the data used to pilot chemical processing plants are based on the results of analysis. The first part of this article describes a system of analysers adapted to the needs of the Pierrelatte plant, with management of the signals collected by the factory computer. Part two shows the influence of analytical development on the establishment of material balance sheets for the Marcoule spent fuel processing plant. Part three stresses the contribution of the automation of analytical test processes at the La Hague spent fuel processing plant. In all three cases, progress in analytical methods greatly improves the safety, reliability and response time of the various operations [fr]

  12. Manual or automated measuring of antipsychotics' chemical oxygen demand.

    Science.gov (United States)

    Pereira, Sarah A P; Costa, Susana P F; Cunha, Edite; Passos, Marieta L C; Araújo, André R S T; Saraiva, M Lúcia M F S

    2018-05-15

    Antipsychotic (AP) drugs are accumulating in terrestrial and aqueous resources as a result of their current consumption, so methods for assessing the contamination load of these drugs are needed. The chemical oxygen demand (COD) is a key parameter used for monitoring water quality by assessing the effect of polluting agents on the oxygen level. The present work therefore assesses the COD levels of several typical and atypical antipsychotic drugs in order to obtain structure-activity relationships. The titrimetric method was implemented with potassium dichromate as oxidant and a 2 h digestion step, followed by titration of the remaining unreduced dichromate. An automated sequential injection analysis (SIA) method was then used to overcome some drawbacks of the titrimetric method. The results showed a relationship between the chemical structures of antipsychotic drugs and their COD values: the presence of aromatic rings and oxidizable groups gives higher COD values. Good agreement was obtained between the results of the reference batch procedure and the SIA system, and the APs clustered into two groups, with a ratio between the two methodologies of 2 or 4 for lower or higher COD values, respectively. The SIA methodology can operate as a screening method at any stage of a synthetic process, and is also more environmentally friendly and cost-effective. The studies presented open promising perspectives for improving the effectiveness of pharmaceutical removal from waste effluents by assessing COD values.
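
    A minimal sketch of the classical back-titration arithmetic underlying the reference method, with hypothetical titration volumes: the unreduced dichromate is titrated with ferrous ammonium sulfate (FAS), and the difference from a blank gives the COD:

```python
# Sketch of the dichromate COD back-titration calculation.
# Titration volumes and normality below are illustrative.

def cod_mg_per_l(v_blank_ml, v_sample_ml, fas_normality, sample_volume_ml):
    """COD (mg O2/L) from FAS titration volumes of blank and sample.
    8000 = equivalent weight of oxygen (8 g/eq) x 1000 mL/L."""
    return ((v_blank_ml - v_sample_ml) * fas_normality * 8000.0
            / sample_volume_ml)

# Hypothetical titration of an antipsychotic solution after 2 h digestion:
print(f"COD = {cod_mg_per_l(10.0, 6.4, 0.25, 20.0):.0f} mg O2/L")
```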

  13. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.; Beugelsdijk, T.J.

    1992-01-01

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, analytical measurement, and data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs.

  14. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open-source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export the analysis. The automated, and key, component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies...), or to those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about the data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines have been removed, are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
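
    A minimal sketch of the kind of database-matching step described above, with an illustrative toy catalog (not actual Pickett .cat data or the SPECdata code): each experimental feature is assigned to the nearest catalogued transition within a frequency tolerance, and unassigned lines are kept as candidates for new species:

```python
# Sketch of catalog-based line assignment for broadband rotational spectra.
# Catalog entries and peak frequencies are illustrative.

def assign_lines(peaks_mhz, catalog, tol_mhz=0.05):
    """Return {peak: (species, catalog frequency)} for matched lines."""
    assignments = {}
    for peak in peaks_mhz:
        best = min(catalog, key=lambda entry: abs(entry[1] - peak))
        if abs(best[1] - peak) <= tol_mhz:
            assignments[peak] = best
    return assignments

catalog = [("OCS", 12162.97), ("HC3N", 9098.33), ("SO2", 12256.64)]
peaks = [9098.35, 12162.95, 10471.20]  # hypothetical experimental features

matched = assign_lines(peaks, catalog)
for peak, (species, freq) in matched.items():
    print(f"{peak:.2f} MHz -> {species} ({freq:.2f} MHz)")
unassigned = [p for p in peaks if p not in matched]
print("unassigned (candidate new species):", unassigned)
```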

  15. Automated analysis of autoradiographic imagery

    International Nuclear Information System (INIS)

    Bisignani, W.T.; Greenhouse, S.C.

    1975-01-01

    A research programme is described which has as its objective the automated characterization of neurological tissue regions from autoradiographs by utilizing hybrid-resolution image processing techniques. An experimental system is discussed which includes raw imagery, scanning and digitizing equipment, feature-extraction algorithms, and regional characterization techniques. The parameters extracted by these algorithms are presented, as well as the regional characteristics which are obtained by operating on the parameters with statistical sampling techniques. An approach is presented for validating the techniques, and initial experimental results are obtained from an analysis of an autoradiograph of a region of the hypothalamus. An extension of these automated techniques to other biomedical research areas is discussed, as well as the implications of applying automated techniques to biomedical research problems. (author)

  16. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which combined with automation, underpin the emerging concept of the "smart grid". This book is supported by theoretical concepts with real-world applications and MATLAB exercises.

  17. Automated analysis of slitless spectra. II. Quasars

    International Nuclear Information System (INIS)

    Edwards, G.; Beauchemin, M.; Borra, F.

    1988-01-01

    Automated software has been developed to process slitless spectra. The software, described in a previous paper, automatically separates stars from extended objects and quasars from stars. This paper describes the quasar search techniques and discusses the results. The performance of the software is compared and calibrated with a plate taken in a region of SA 57 that has been extensively surveyed by others using a variety of techniques; the automated software performs very well. It is found that an eye search of the same plate is less complete than the automated search: surveys that rely on eye searches suffer from incompleteness beginning at least a magnitude brighter than the plate limit. It is shown how the complete automated analysis of a plate and computer simulations are used to calibrate and understand the characteristics of the present data. 20 references.

  18. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems-biology computational modules.

  19. Automated analysis in generic groups

    Science.gov (United States)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings (symmetric or asymmetric (leveled) k-linear groups) and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions, and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of what is the minimal number of pairing computations needed to verify structure-preserving signatures. We build an...

  20. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini-column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expanding the radiochemical applications of the FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements.
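
    A minimal sketch of the transient-signal quantification described above, using a synthetic detector trace rather than real data: the baseline-corrected count-rate peak is integrated as the separated 90Sr zone passes the flow cell, and either peak area or peak height can then be calibrated against standards of known activity:

```python
# Sketch of peak-area / peak-height quantification of a transient
# flow-through scintillation signal. The trace below is synthetic.
import numpy as np

t = np.linspace(0, 120, 121)                        # seconds after elution
baseline = 2.0                                      # background rate (cps)
signal = 50.0 * np.exp(-0.5 * ((t - 60) / 8) ** 2)  # synthetic 90Sr peak
rate = baseline + signal

net = rate - baseline          # baseline correction
peak_area = np.trapz(net, t)   # total counts under the peak
peak_height = net.max()

print(f"peak area   = {peak_area:.0f} counts")
print(f"peak height = {peak_height:.1f} cps")
```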

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    such networking systems are modelled in the process calculus LySa. On top of this programming-language-based formalism an analysis is developed which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  2. Automated analysis of damages for radiation in plastics surfaces

    International Nuclear Information System (INIS)

    Andrade, C.; Camacho M, E.; Tavera, L.; Balcazar, M.

    1990-02-01

    Damage induced by radiation is analyzed in a polymer characterized by the optical properties of its polished surfaces and by greater uniformity and chemical resistance than acrylic; the material withstands temperatures up to 150 degrees centigrade and weighs approximately half as much as glass. An objective of this work is the development of a method that automatically analyzes radiation-induced surface damage in plastic materials by means of an image analyzer. (Author)

  3. Automated Technology for Verificiation and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis, held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure...

  4. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs and should not be considered a complete resource on PrHA methods. Likewise, to determine whether a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  5. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  6. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  7. Service activities of chemical analysis division

    International Nuclear Information System (INIS)

    Eom, Tae Yoon; Suh, Moo Yul; Park, Kyoung Kyun; Jung, Ki Suk; Joe, Kih Soo; Jee, Kwang Yong; Jung, Woo Sik; Sohn, Se Chul; Yeo, In Heong; Han, Sun Ho

    1988-12-01

    Progress of the Division during 1988 is described, covering service activities for various R and D projects carried out in the Institute, for the fuel fabrication and conversion plant, and for the post-irradiation examination facility. Relevant analytical methodologies developed for the chemical analysis of irradiated fuel, safeguards chemical analysis, and pool-water monitoring are included, such as chromatographic separation of lanthanides, polarographic determination of dissolved oxygen in water, and automation of the potentiometric titration of uranium. Some of the revised laboratory manuals are also included in this progress report. (Author)

  8. Planning representation for automated exploratory data analysis

    Science.gov (United States)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  9. Analysis And Control System For Automated Welding

    Science.gov (United States)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  10. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    Final report AFRL-AFOSR-JP-TR-2017-0018, Keio University (Keiji Takeda); report type: Final; dates covered: 26 May 2015 to 25 Nov 2016. [Cover-sheet form residue removed.] The project improved the binary analysis components of an automated framework designed to analyze malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program.

  11. Automated Loads Analysis System (ATLAS)

    Science.gov (United States)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for the payload math models on a specific Shuttle Transport System (STS) flight using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling of payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  12. Automated spectral and timing analysis of AGNs

    Science.gov (United States)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user automate XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light-curves, and their power-spectra, and it proves useful for assessing the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal emission features in the 2-8 keV range.
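
    The core numerical step of such a pipeline, a power-spectrum density computed per energy band from a light curve, can be sketched as follows. This is not the authors' script; it assumes an evenly sampled count series and uses a plain periodogram for illustration.

        # Minimal sketch (not the published pipeline): periodogram of an evenly
        # sampled X-ray light curve; repeating this per energy band yields an
        # energy-resolved power-spectrum density.
        import numpy as np

        def power_spectrum(counts, dt):
            """Return frequencies and periodogram of a 1-D count series."""
            counts = np.asarray(counts, dtype=float)
            rate = counts - counts.mean()          # remove the DC component
            spec = np.fft.rfft(rate)
            freq = np.fft.rfftfreq(rate.size, d=dt)
            power = (np.abs(spec) ** 2) / rate.size
            return freq[1:], power[1:]             # drop the zero-frequency bin

        # Example: Poisson noise plus a weak 0.01 Hz modulation, 10 s bins
        rng = np.random.default_rng(0)
        t = np.arange(4096) * 10.0
        lc = rng.poisson(100 + 10 * np.sin(2 * np.pi * 0.01 * t))
        f, p = power_spectrum(lc, dt=10.0)
        print(f[np.argmax(p)])                     # peaks near 0.01 Hz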

  13. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  14. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed, using an ES-1022 computer and the problem-oriented software "The description information search system". The main aspects and sources used to form the system's information fund and the characteristics of its information retrieval language are reported, and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search.

  15. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  16. Automation of route identification and optimisation based on data-mining and chemical intuition.

    Science.gov (United States)

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.

  17. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  18. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
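
    A toy sketch of the "identification tree" idea follows: rules whose predicates match elements of a data flow diagram and report a threat plus a suggested mitigation. AutSEC's actual rule format and tree structures are not reproduced here; the element attributes, rules, and names below are all invented for illustration.

        # Toy identification/mitigation rules over data flow diagram elements,
        # in the spirit of (but not copied from) AutSEC.
        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class Element:                       # one node of a data flow diagram
            name: str
            kind: str                        # e.g. "data_flow", "data_store"
            encrypted: bool = False
            crosses_trust_boundary: bool = False

        @dataclass
        class Rule:
            threat: str
            mitigation: str
            matches: Callable[[Element], bool]

        RULES: List[Rule] = [
            Rule("eavesdropping on plaintext flow",
                 "encrypt the channel (e.g. TLS)",
                 lambda e: (e.kind == "data_flow"
                            and e.crosses_trust_boundary and not e.encrypted)),
            Rule("tampering with unprotected store",
                 "add integrity checks / access control",
                 lambda e: e.kind == "data_store" and not e.encrypted),
        ]

        def analyze(diagram: List[Element]):
            return [(e.name, r.threat, r.mitigation)
                    for e in diagram for r in RULES if r.matches(e)]

        diagram = [Element("credentials", "data_flow", crosses_trust_boundary=True),
                   Element("session db", "data_store")]
        for finding in analyze(diagram):
            print(finding)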

  19. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be unfeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogenous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to

  20. Automated reasoning applications to design analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Given the necessary relationships and definitions of design functions and components, validation of system incarnation (the physical product of design) and sneak function analysis can be achieved via automated reasoners. The relationships and definitions must define the design specification and incarnation functionally. For the design specification, the hierarchical functional representation is based on physics and engineering principles and bounded by design objectives and constraints. The relationships and definitions of the design incarnation are manifested as element functional definitions, state relationship to functions, functional relationship to direction, element connectivity, and functional hierarchical configuration

  1. Automated analysis for nitrate by hydrazine reduction

    Energy Technology Data Exchange (ETDEWEB)

    Kamphake, L J; Hannah, S A; Cohen, J M

    1967-01-01

    An automated procedure for the simultaneous determinations of nitrate and nitrite in water is presented. Nitrite initially present in the sample is determined by a conventional diazotization-coupling reaction. Nitrate in another portion of sample is quantitatively reduced with hydrazine sulfate to nitrite which is then determined by the same diazotization-coupling reaction. Subtracting the nitrite initially present in the sample from that after reduction yields nitrite equivalent to nitrate initially in the sample. The rate of analysis is 20 samples/hr. Applicable range of the described method is 0.05-10 mg/l nitrite or nitrate nitrogen; however, increased sensitivity can be obtained by suitable modifications.
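
    The differential calculation described above (nitrate equals nitrite measured after hydrazine reduction minus nitrite initially present) can be written out as a minimal sketch; the function and variable names are illustrative, not from the original method write-up.

        # Nitrate-N by difference, with the stated 0.05-10 mg/L applicable range.
        def nitrate_n(nitrite_initial_mg_l: float,
                      nitrite_after_reduction_mg_l: float) -> float:
            """Nitrate-N (mg/L) = nitrite after hydrazine reduction
            minus nitrite initially present in the sample."""
            value = nitrite_after_reduction_mg_l - nitrite_initial_mg_l
            if not (0.05 <= value <= 10.0):
                print("warning: outside the 0.05-10 mg/L range of the method")
            return value

        print(nitrate_n(0.12, 3.47))   # ≈ 3.35 mg/L nitrate nitrogen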

  2. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  3. Automation of 3D cell culture using chemically defined hydrogels.

    Science.gov (United States)

    Rimann, Markus; Angres, Brigitte; Patocchi-Tenzer, Isabel; Braum, Susanne; Graf-Hausner, Ursula

    2014-04-01

    Drug development relies on high-throughput screening involving cell-based assays. Most of the assays are still based on cells grown in monolayer rather than in three-dimensional (3D) formats, although cells behave more in vivo-like in 3D. To exemplify the adoption of 3D techniques in drug development, this project investigated the automation of a hydrogel-based 3D cell culture system using a liquid-handling robot. The hydrogel technology used offers high flexibility of gel design due to a modular composition of a polymer network and bioactive components. The cell inert degradation of the gel at the end of the culture period guaranteed the harmless isolation of live cells for further downstream processing. Human colon carcinoma cells HCT-116 were encapsulated and grown in these dextran-based hydrogels, thereby forming 3D multicellular spheroids. Viability and DNA content of the cells were shown to be similar in automated and manually produced hydrogels. Furthermore, cell treatment with toxic Taxol concentrations (100 nM) had the same effect on HCT-116 cell viability in manually and automated hydrogel preparations. Finally, a fully automated dose-response curve with the reference compound Taxol showed the potential of this hydrogel-based 3D cell culture system in advanced drug development.

  4. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

    This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  5. Chemical analysis quality assurance at the ICPP

    International Nuclear Information System (INIS)

    Hand, R.L.

    1990-01-01

    This document discusses the chemical analysis quality assurance program at the ICPP, which involves records management, analytical methods quality control, analysis procedures, and training and qualification. Since 1979, the major portion of the quality assurance program has been implemented on a central analytical computer system. The features provided by the system are: storage, retrieval, and search capabilities over all general request and sample analysis information; automatic method selection for all process streams; automation of all method calculations; automatic assignment of bias and precision estimates at all analysis levels; with-method-use requalification; untrained or unqualified analyst method lockout; statistical testing of all process stream results for replicate agreement; automatic testing of process results against pre-established operating, safety, or failure limits at varying confidence levels; and automatic transfer and reporting of all analysis data, plus all statistical testing, to the Production Department

  6. An open framework for automated chemical hazard assessment based on GreenScreen for Safer Chemicals: A proof of concept.

    Science.gov (United States)

    Wehage, Kristopher; Chenhansa, Panan; Schoenung, Julie M

    2017-01-01

    GreenScreen® for Safer Chemicals is a framework for comparative chemical hazard assessment. It is the first transparent, open and publicly accessible framework of its kind, allowing manufacturers and governmental agencies to make informed decisions about the chemicals and substances used in consumer products and buildings. In the GreenScreen® benchmarking process, chemical hazards are assessed and classified based on 18 hazard endpoints from up to 30 different sources. The result is a simple numerical benchmark score and accompanying assessment report that allows users to flag chemicals of concern and identify safer alternatives. Although the screening process is straightforward, aggregating and sorting hazard data is tedious, time-consuming, and prone to human error. In light of these challenges, the present work demonstrates the usage of automation to cull chemical hazard data from publicly available internet resources, assign metadata, and perform a GreenScreen® hazard assessment using the GreenScreen® "List Translator." The automated technique, written as a module in the Python programming language, generates GreenScreen® List Translation data for over 3000 chemicals in approximately 30 s. Discussion of the potential benefits and limitations of automated techniques is provided. By embedding the library into a web-based graphical user interface, the extensibility of the library is demonstrated. The accompanying source code is made available to the hazard assessment community. Integr Environ Assess Manag 2017;13:167-176. © 2016 SETAC.
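
    The "list translation" step can be sketched as a simple lookup from authoritative-list membership to a screening label. This is a hedged illustration only: the list names, CAS memberships, and label mappings below are invented, not real GreenScreen® data, and the published module is not reproduced.

        # Toy list-translation lookup (all data hypothetical).
        HAZARD_LISTS = {
            "example_carcinogen_list": {"50-00-0"},
            "example_pbt_list": {"87-68-3"},
        }
        LIST_TO_SCORE = {"example_carcinogen_list": "LT-1",
                         "example_pbt_list": "LT-1"}

        def list_translate(cas: str) -> str:
            scores = {LIST_TO_SCORE[name]
                      for name, members in HAZARD_LISTS.items() if cas in members}
            if "LT-1" in scores:
                return "LT-1"                 # flagged as a likely Benchmark-1
            return scores.pop() if scores else "LT-UNK"

        print(list_translate("50-00-0"))      # LT-1
        print(list_translate("64-17-5"))      # LT-UNK (not on any list here)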

  7. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    Le Go, Roland

    1972-01-01

    The methods developed with a view to automatic processing of the different stages of chromosome analysis are described in this three-part study. Part 1 covers the automated selection of metaphase spreads, which applies a decision process to reject all non-pertinent images and keep the good ones. This was achieved by writing a simulation program that made it possible to establish the proper selection algorithms and to design a kit of electronic logic units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter, which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200,000 grey values, encoded on a 16-, 32- or 64-grey-level scale, and is processed by a pattern recognition program that isolates the chromosomes and identifies their characteristic features (arm tips, centromere areas) in order to obtain measurements equivalent to the lengths of the four arms. Part 3 describes a program for automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies (1) a list of the pairs, (2) a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and (3) another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the truly abnormal or non-standard images left unpaired by the program, which are of special interest to the biologist. (author)

  8. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more

  9. Chemical analysis report 2014

    International Nuclear Information System (INIS)

    Elbouzidi, Saliha; Elyahyaoui, Adil; Ghassan, Acil; Marah, Hamid

    2014-01-01

    This report highlights the results of chemical analyses of major elements, trace elements, and heavy metals carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. These analyses cover 120 samples. The report presents the analytical techniques used (parameters and methods), a legend, and the results tables.

  10. Chemical analysis report 2015

    International Nuclear Information System (INIS)

    2015-01-01

    This report highlights the results of chemical analyses of fluorides, bromides, lithium, and boron carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. These analyses cover 120 samples. The report presents the analytical techniques used (parameters and methods), a legend, and the results tables.

  11. Automated Image Analysis Corrosion Working Group Update: February 1, 2018

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-01

    These are slides for the automated image analysis corrosion working group update. The overall goals were to automate the detection and quantification of features in images (faster, more accurate), to establish how to do this (obtain data, analyze data), and to focus on Laser Scanning Confocal Microscope (LCM) data (laser intensity, laser height/depth, optical RGB, optical plus laser RGB).

  12. Automated fault tree analysis: the GRAFTER system

    International Nuclear Information System (INIS)

    Sancaktar, S.; Sharp, D.R.

    1985-01-01

    An inherent part of probabilistic risk assessment (PRA) is the construction and analysis of detailed fault trees. For this purpose, a fault tree computer graphics code named GRAFTER has been developed. The code system centers around the GRAFTER code. This code is used interactively to construct, store, update and print fault trees of small or large sizes. The SIMON code is used to provide data for the basic event probabilities. ENCODE is used to process the GRAFTER files to prepare input for the WAMCUT code. WAMCUT is used to quantify the top event probability and to identify the cutsets. This code system has been extensively used in various PRA projects. It has resulted in reduced manpower costs, increased QA capability, ease of documentation and it has simplified sensitivity analyses. Because of its automated nature, it is also suitable for LIVING PRA Studies which require updating and modifications during the lifetime of the plant. Brief descriptions and capabilities of the GRAFTER, SIMON and ENCODE codes are provided; an application of the GRAFTER system is outlined; and conclusions and comments on the code system are given
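
    The quantification step such a pipeline feeds (here, the role played by WAMCUT) can be illustrated with a minimal recursive evaluation of AND/OR gates over independent basic events. This is a sketch of the general technique, not the GRAFTER/WAMCUT code; the event probabilities are invented.

        # Top-event probability for a small fault tree, assuming independent
        # basic events (exact here because the two branches share no events).
        def prob(node, basic):
            kind = node[0]
            if kind == "event":
                return basic[node[1]]
            child_p = [prob(c, basic) for c in node[1]]
            if kind == "and":                    # all children must fail
                p = 1.0
                for q in child_p:
                    p *= q
                return p
            if kind == "or":                     # at least one child fails
                p = 1.0
                for q in child_p:
                    p *= (1.0 - q)
                return 1.0 - p
            raise ValueError(kind)

        basic = {"pump_a": 1e-3, "pump_b": 1e-3, "power": 1e-4}
        top = ("or", [("and", [("event", "pump_a"), ("event", "pump_b")]),
                      ("event", "power")])
        print(prob(top, basic))                  # ~1.01e-4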

  13. Automated and connected vehicle implications and analysis.

    Science.gov (United States)

    2017-05-01

    Automated and connected vehicles (ACV) and, in particular, autonomous vehicles have captured the interest of the public, industry, and transportation authorities. ACVs can significantly reduce accidents, fuel consumption, pollution, and the costs o...

  14. System analysis of automated speed enforcement implementation.

    Science.gov (United States)

    2016-04-01

    Speeding is a major factor in a large proportion of traffic crashes, injuries, and fatalities in the United States. Automated Speed Enforcement (ASE) is one of many approaches shown to be effective in reducing speeding violations and crashes. However...

  15. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In this age and time when celerity is expected out of all the sectors of the country, the speed of execution of various processes and hence efficiency, becomes a prominent factor. To facilitate the speeding demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  16. Chemical substructure analysis in toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Beauchamp, R.O. Jr. [Center for Information on Toxicology and Environment, Raleigh, NC (United States)

    1990-12-31

    A preliminary examination of chemical-substructure analysis (CSA) demonstrates the effective use of the Chemical Abstracts compound connectivity file in conjunction with the bibliographic file for relating chemical structures to biological activity. The importance of considering the role of metabolic intermediates under a variety of conditions is illustrated, suggesting structures that should be examined that may exhibit potential activity. This CSA technique, which utilizes existing large files accessible with online personal computers, is recommended for use as another tool in examining chemicals in drugs. 2 refs., 4 figs.

  17. Chemical Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...

  18. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    Goujon de Beauvivier, M.; Perez, J.-J.

    1979-01-01

    The application of microprocessors to the programming and computation of sequential chemical analysis of solutions is examined. Safety, performance, and reliability are compared with those of other methods. An example is given of uranium titration by spectrophotometry

  19. Priority setting for existing chemicals : automated data selection routine

    NARCIS (Netherlands)

    Haelst, A.G. van; Hansen, B.G.

    2000-01-01

    One of the four steps within Council Regulation 793/93/EEC on the evaluation and control of existing chemicals is the priority setting step. The priority setting step is concerned with selecting high-priority substances from a large number of substances, initially starting with 2,474

  20. Trace Chemical Analysis Methodology

    Science.gov (United States)

    1980-04-01

    [Extraction residue: list-of-figures entries (Modified DR/2 spectrophotometer face; colorimetric oil analysis field test kit; pictorial step-by-step figures) plus body fragments on computer-assisted pattern recognition and spectrophotometer zero/light adjustment; no coherent abstract is recoverable.]

  1. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
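
    The modal calculation itself reduces to normalizing classified pixel or point counts per phase to volume percentages. The sketch below shows that step only; the phase names and counts are invented, and the classification of each pixel by EDS spectrum is assumed to have happened upstream.

        # Modal (volume percent) abundances from per-phase classified counts.
        from collections import Counter

        phase_counts = Counter({"orthopyroxene": 91250, "olivine": 4300,
                                "chromite": 980, "troilite": 470})

        total = sum(phase_counts.values())
        modes = {phase: 100.0 * n / total for phase, n in phase_counts.items()}
        for phase, pct in sorted(modes.items(), key=lambda kv: -kv[1]):
            print(f"{phase:14s} {pct:6.2f} vol%")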

  2. The standard laboratory module approach to automation of the chemical laboratory

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.H.

    1993-01-01

    The technology and practice of environmental laboratory automation has not advanced as rapidly or completely as one might expect. Confined to autosamplers and limited robotic systems, our ability to apply production concepts to environmental analytical chemistry is not great. With the impending remediation of hazardous waste sites in the US, only the application of production chemistry techniques will even begin to provide those responsible with the knowledge necessary to accomplish the cleanup expeditiously and safely. Tightening regulatory requirements have already mandated staggering increases in sampling and characterization needs, with the future only guaranteeing greater demands. The Contaminant Analysis Automation Program has been initiated by the US government to address these current and future characterization needs by applying a new robotic paradigm to analytical chemistry. By using standardized modular instruments, named Standard Laboratory Modules, flexible automation systems can rapidly be configured to apply production techniques to the nation's environmental problems on-site.

  3. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  4. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet
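
    The clustering idea behind sorting inclusions by composition can be sketched with a generic k-means step. This is not the ASCAT software; the compositions are synthetic, and scikit-learn is an assumed dependency chosen for brevity.

        # Group synthetic inclusion compositions (wt% Al2O3, CaO, MnS) into
        # clusters representing inclusion types.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        alumina = rng.normal([90, 5, 5], 3, size=(40, 3))
        calcium_aluminate = rng.normal([55, 40, 5], 3, size=(40, 3))
        X = np.vstack([alumina, calcium_aluminate])

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        for k in range(2):
            mean = X[labels == k].mean(axis=0).round(1)
            print(f"cluster {k}: mean composition {mean}")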

  5. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  6. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Background: In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results: The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion: Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.
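
    The front-line extraction can be sketched in a few lines: given a binary mask of cell-covered pixels (the product of the texture-preprocessing stage), take the furthest covered row in each column as the leading edge. This is a generic illustration under that assumption, not the published implementation.

        # Leading-edge extraction from a boolean coverage mask; cells are assumed
        # to grow downward from row 0.
        import numpy as np

        def leading_edge(mask: np.ndarray) -> np.ndarray:
            """Per column, return the last covered row index (-1 if none)."""
            covered = mask.any(axis=0)
            last_row = mask.shape[0] - 1 - np.argmax(mask[::-1], axis=0)
            return np.where(covered, last_row, -1)

        # toy 6x5 mask: cells reached row 2 or 3 depending on the column
        mask = np.zeros((6, 5), dtype=bool)
        mask[:3, :2] = True
        mask[:4, 2:] = True
        print(leading_edge(mask))          # [2 2 3 3 3]
        print(leading_edge(mask).mean())   # mean front position in pixels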

  7. Automation of the Analysis of Moessbauer Spectra

    International Nuclear Information System (INIS)

    Souza, Paulo A. de Jr.; Garg, R.; Garg, V. K.

    1998-01-01

    In the present report we propose the automation of least-squares fitting of Moessbauer spectra, the identification of the substance, its crystal structure, and access to the references, with the help of a genetic algorithm, fuzzy logic, and an artificial neural network associated with a databank of Moessbauer parameters and references. This system could be useful for specialists and non-specialists, in industry as well as in research laboratories

  8. Granulometric profiling of aeolian dust deposits by automated image analysis

    Science.gov (United States)

    Varga, György; Újvári, Gábor; Kovács, János; Jakab, Gergely; Kiss, Klaudia; Szalai, Zoltán

    2016-04-01

    Determination of granulometric parameters is of growing interest in the Earth sciences. Particle size data of sedimentary deposits provide insights into the physicochemical environment of transport, accumulation and post-depositional alterations of sedimentary particles, and are important proxies applied in paleoclimatic reconstructions. It is especially true for aeolian dust deposits with a fairly narrow grain size range as a consequence of the extremely selective nature of wind sediment transport. Therefore, various aspects of aeolian sedimentation (wind strength, distance to source(s), possible secondary source regions and modes of sedimentation and transport) can be reconstructed only from precise grain size data. As terrestrial wind-blown deposits are among the most important archives of past environmental changes, proper explanation of the proxy data is a mandatory issue. Automated imaging provides a unique technique to gather direct information on granulometric characteristics of sedimentary particles. Granulometric data obtained from automatic image analysis of Malvern Morphologi G3-ID is a rarely applied new technique for particle size and shape analyses in sedimentary geology. Size and shape data of several hundred thousand (or even million) individual particles were automatically recorded in this study from 15 loess and paleosoil samples from the captured high-resolution images. Several size (e.g. circle-equivalent diameter, major axis, length, width, area) and shape parameters (e.g. elongation, circularity, convexity) were calculated by the instrument software. At the same time, the mean light intensity after transmission through each particle is automatically collected by the system as a proxy of optical properties of the material. Intensity values are dependent on chemical composition and/or thickness of the particles. The results of the automated imaging were compared to particle size data determined by three different laser diffraction instruments
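
    The size and shape descriptors named above follow from simple geometry on each particle's measured area, perimeter, and bounding dimensions. The definitions below are the commonly used ones; the instrument vendor's exact formulas may differ slightly, so treat this as an illustrative sketch.

        # Common particle descriptors from area A, perimeter P, width and length.
        import math

        def ce_diameter(area):             # circle-equivalent diameter
            return 2.0 * math.sqrt(area / math.pi)

        def circularity(area, perimeter):  # 1.0 for a perfect circle
            return math.sqrt(4.0 * math.pi * area) / perimeter

        def elongation(width, length):     # 0 for an equant particle
            return 1.0 - width / length

        # a 20 x 40 um rectangular particle
        A, P = 20 * 40, 2 * (20 + 40)
        print(ce_diameter(A))              # ~31.9 um
        print(circularity(A, P))           # ~0.84
        print(elongation(20, 40))          # 0.5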

  9. Lab-on-a-Disc Platform for Automated Chemical Cell Lysis.

    Science.gov (United States)

    Seo, Moo-Jung; Yoo, Jae-Chern

    2018-02-26

    Chemical cell lysis is an interesting topic in research on Lab-on-a-Disc (LOD) platforms on account of its perfect compatibility with the centrifugal spin column format. However, standard procedures followed in chemical cell lysis require sophisticated non-contact temperature control as well as the use of pressure resistant valves. These requirements pose a significant challenge thereby making the automation of chemical cell lysis on an LOD extremely difficult to achieve. In this study, an LOD capable of performing fully automated chemical cell lysis is proposed, where a combination of chemical and thermal methods has been used. It comprises a sample inlet, phase change material sheet (PCMS)-based temperature sensor, heating chamber, and pressure resistant valves. The PCMS melts and solidifies at a certain temperature and thus is capable of indicating whether the heating chamber has reached a specific temperature. Compared to conventional cell lysis systems, the proposed system offers advantages of reduced manual labor and a compact structure that can be readily integrated onto an LOD. Experiments using Salmonella typhimurium strains were conducted to confirm the performance of the proposed cell lysis system. The experimental results demonstrate that the proposed system has great potential in realizing chemical cell lysis on an LOD whilst achieving higher throughput in terms of purity and yield of DNA thereby providing a good alternative to conventional cell lysis systems.

  10. Lab-on-a-Disc Platform for Automated Chemical Cell Lysis

    Directory of Open Access Journals (Sweden)

    Moo-Jung Seo

    2018-02-01

    Chemical cell lysis is an interesting topic in research on Lab-on-a-Disc (LOD) platforms on account of its perfect compatibility with the centrifugal spin column format. However, standard procedures followed in chemical cell lysis require sophisticated non-contact temperature control as well as the use of pressure resistant valves. These requirements pose a significant challenge thereby making the automation of chemical cell lysis on an LOD extremely difficult to achieve. In this study, an LOD capable of performing fully automated chemical cell lysis is proposed, where a combination of chemical and thermal methods has been used. It comprises a sample inlet, phase change material sheet (PCMS)-based temperature sensor, heating chamber, and pressure resistant valves. The PCMS melts and solidifies at a certain temperature and thus is capable of indicating whether the heating chamber has reached a specific temperature. Compared to conventional cell lysis systems, the proposed system offers advantages of reduced manual labor and a compact structure that can be readily integrated onto an LOD. Experiments using Salmonella typhimurium strains were conducted to confirm the performance of the proposed cell lysis system. The experimental results demonstrate that the proposed system has great potential in realizing chemical cell lysis on an LOD whilst achieving higher throughput in terms of purity and yield of DNA thereby providing a good alternative to conventional cell lysis systems.

  11. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
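
    The "computer calculus" idea, propagating derivatives alongside values through otherwise unmodified model code, can be illustrated with forward-mode dual numbers. GRESS itself works by precompiling FORTRAN; the Python sketch below shows only the underlying rule set (sum and product rules), not the GRESS system.

        # Forward-mode automatic differentiation with dual numbers.
        class Dual:
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv
            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)
            __radd__ = __add__
            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,              # product rule
                            self.value * other.deriv + self.deriv * other.value)
            __rmul__ = __mul__

        def model(x):                      # unchanged "model code"
            return 3 * x * x + 2 * x + 1

        x = Dual(2.0, 1.0)                 # seed dx/dx = 1
        y = model(x)
        print(y.value, y.deriv)            # 17.0 and dy/dx = 6x + 2 = 14.0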

  13. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.

  14. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  15. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  16. Robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming.

    Science.gov (United States)

    Baran, Richard; Northen, Trent R

    2013-10-15

    Untargeted metabolite profiling using liquid chromatography and mass spectrometry coupled via electrospray ionization is a powerful tool for the discovery of novel natural products, metabolic capabilities, and biomarkers. However, the elucidation of the identities of uncharacterized metabolites from spectral features remains challenging. A critical step in the metabolite identification workflow is the assignment of redundant spectral features (adducts, fragments, multimers) and calculation of the underlying chemical formula. Inspection of the data by experts using computational tools solving partial problems (e.g., chemical formula calculation for individual ions) can be performed to disambiguate alternative solutions and provide reliable results. However, manual curation is tedious and not readily scalable or standardized. Here we describe an automated procedure for the robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming optimization (RAMSI). Chemical rules among related ions are expressed as linear constraints and both the spectra interpretation and chemical formula calculation are performed in a single optimization step. This approach is unbiased in that it does not require predefined sets of neutral losses and positive and negative polarity spectra can be combined in a single optimization. The procedure was evaluated with 30 experimental mass spectra and was found to effectively identify the protonated or deprotonated molecule ([M + H](+) or [M - H](-)) while being robust to the presence of background ions. RAMSI provides a much-needed standardized tool for interpreting ions for subsequent identification in untargeted metabolomics workflows.
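
    The formula-calculation step can be sketched as a small mixed integer linear program. This is not RAMSI's actual constraint set (which also encodes adduct, fragment, and multimer relationships across multiple ions in one optimization); it only shows how an absolute mass error and a simple valence rule become linear constraints. PuLP is an assumed dependency, and the element bounds and tolerance are illustrative.

        # Find integer CHNO counts whose monoisotopic mass matches an observed
        # neutral mass, minimizing |error| via two nonnegative slack variables.
        import pulp

        MASS = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915}
        target = 180.06339      # observed neutral monoisotopic mass (glucose)
        tol = 0.005             # allowed mass error, Da

        prob = pulp.LpProblem("formula", pulp.LpMinimize)
        n = {e: pulp.LpVariable(e, lowBound=0, upBound=60, cat="Integer")
             for e in MASS}
        over = pulp.LpVariable("over", lowBound=0)    # linearize |mass error|
        under = pulp.LpVariable("under", lowBound=0)

        prob += over + under                          # objective: absolute error
        prob += (pulp.lpSum(MASS[e] * n[e] for e in MASS) - target
                 == over - under)
        prob += over + under <= tol
        prob += n["H"] <= 2 * n["C"] + n["N"] + 2     # rings+double bonds >= 0
        prob += n["C"] >= 1

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print({e: int(v.value()) for e, v in n.items()})  # expects C6 H12 O6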

  17. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered the research of automated analysis methods, for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; then, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  18. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla

    2016-01-01

    PURPOSE: The aim of this study was to develop an automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic slides stained with Van Gieson (VG). PATIENTS AND METHODS: A training set consisting of ten biopsies diagnosed as CC, CCi, and normal colon mucosa was used to develop the automated image analysis (VG app) to match the assessment by a pathologist. The study set consisted of biopsies from 75 patients

  19. Automation of reactor neutron activation analysis

    International Nuclear Information System (INIS)

    Pavlov, S.S.; Dmitriev, A.Yu.; Frontasyeva, M.V.

    2013-01-01

    The present status of the development of a software package designed for automation of NAA at the IBR-2 reactor of FLNP, JINR, Dubna, is reported. Following decisions adopted at the CRP Meeting in Delft, August 27-31, 2012, the missing tool - a sample changer - will be installed for NAA in compliance with the peculiar features of the radioanalytical laboratory REGATA at the IBR-2 reactor. The details of the design are presented. The software for operation with the sample changer consists of two parts. The first part is a user interface and the second one is a program to control the sample changer. The second part will be developed after installing the tool.

  20. Automated sensitivity analysis using the GRESS language

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.; Wright, R.Q.

    1986-04-01

    An automated procedure for performing large-scale sensitivity studies based on the use of computer calculus is presented. The procedure is embodied in a FORTRAN precompiler called GRESS, which automatically processes computer models and adds derivative-taking capabilities to the normal calculated results. In this report, the GRESS code is described, tested against analytic and numerical test problems, and then applied to a major geohydrological modeling problem. The SWENT nuclear waste repository modeling code is used as the basis for these studies. Results for all problems are discussed in detail. Conclusions are drawn as to the applicability of GRESS in the problems at hand and for more general large-scale modeling sensitivity studies

  1. Chemical analysis of geological samples

    International Nuclear Information System (INIS)

    Malhotra, R.K.

    1997-01-01

    Most of the analytical methodology used in geochemical exploration has been based on molecular absorption, atomic absorption, ICP-AES, ICP-MS, etc. Detection limit and precision are factors in the choice of methodology in the search for metallic ores and are related to the accuracy of the data. A brief outline of the various chemical analysis techniques is given, explaining the basics of the measurement principles and instrumentation.

  2. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    This paper discusses the following: 1. The relationship of analysis to design. 2. New methods of analysis. 3. Improved finite elements. 4. Effect of minicomputer on structural analysis methods. 5. The use of system of microprocessors for nonlinear structural analysis. 6. The role of interacting graphics systems in future analysis and design. The discussion focusses on the impact of new inexpensive computer hardware on design and analysis methods. (Auth.)

  3. ClassyFire: automated chemical classification with a comprehensive, computable taxonomy.

    Science.gov (United States)

    Djoumbou Feunang, Yannick; Eisner, Roman; Knox, Craig; Chepelev, Leonid; Hastings, Janna; Owen, Gareth; Fahy, Eoin; Steinbeck, Christoph; Subramanian, Shankar; Bolton, Evan; Greiner, Russell; Wishart, David S

    2016-01-01

    Scientists have long been driven by the desire to describe, organize, classify, and compare objects using taxonomies and/or ontologies. In contrast to biology, geology, and many other scientific disciplines, the world of chemistry still lacks a standardized chemical ontology or taxonomy. Several attempts at chemical classification have been made, but they have mostly been limited to either manual or semi-automated proof-of-principle applications. This is regrettable, as comprehensive chemical classification and description tools could not only improve our understanding of chemistry but also improve the linkage between chemistry and many other fields. For instance, the chemical classification of a compound could help predict its metabolic fate in humans, its druggability, or potential hazards associated with it, among others. However, the sheer number (tens of millions of compounds) and complexity of chemical structures are such that any manual classification effort would prove to be near impossible. We have developed a comprehensive, flexible, and computable, purely structure-based chemical taxonomy (ChemOnt), along with a computer program (ClassyFire) that uses only chemical structures and structural features to automatically assign all known chemical compounds to a taxonomy consisting of >4800 different categories. This new chemical taxonomy consists of up to 11 different levels (Kingdom, SuperClass, Class, SubClass, etc.) with each of the categories defined by unambiguous, computable structural rules. Furthermore, each category is named using a consensus-based nomenclature and described (in English) based on the characteristic common structural properties of the compounds it contains. The ClassyFire webserver is freely accessible at http://classyfire.wishartlab.com/. Moreover, a Ruby API version is available at https://bitbucket.org/wishartlab/classyfire_api, which provides programmatic access to the ClassyFire server and database. ClassyFire has been used to

  4. Automated hazard analysis of digital control systems

    International Nuclear Information System (INIS)

    Garrett, Chris J.; Apostolakis, George E.

    2002-01-01

    Digital instrumentation and control (I and C) systems can provide important benefits in many safety-critical applications, but they can also introduce potential new failure modes that can affect safety. Unlike electro-mechanical systems, whose failure modes are fairly well understood and which can often be built to fail in a particular way, software errors are very unpredictable. There is virtually no nontrivial software that will function as expected under all conditions. Consequently, there is a great deal of concern about whether there is a sufficient basis on which to resolve questions about safety. In this paper, an approach for validating the safety requirements of digital I and C systems is developed which uses the Dynamic Flowgraph Methodology to conduct automated hazard analyses. The prime implicants of these analyses can be used to identify unknown system hazards, prioritize the disposition of known system hazards, and guide lower-level design decisions to either eliminate or mitigate known hazards. In a case study involving a space-based reactor control system, the method succeeded in identifying an unknown failure mechanism

  5. Rapid chemical analysis of allanite

    International Nuclear Information System (INIS)

    Nishiyama, Goro; Hayashi, Hiroshi

    1981-01-01

    Rapid chemical analysis of allanite was studied by atomic absorption spectrophotometry. The powdered sample was fused with a 4:1 (by weight) mixture of anhydrous sodium carbonate and borax in a platinum crucible, and a sample solution was prepared. SiO2, Fe2O3, Al2O3, MnO, and rare earth metals were determined by atomic absorption spectrophotometry; CaO, MgO, and Ce2O3 by titration; ThO2 by colorimetry; and La2O3 by flame photometry. For the sample solution treated with hydrofluoric acid and sulfuric acid, Na2O and K2O were determined by atomic absorption spectrophotometry, and TiO2 and P2O5 by colorimetry. Chemical analyses of four samples were carried out and gave consistent results. (author)

  6. Capacity analysis of an automated kit transportation system

    NARCIS (Netherlands)

    Zijm, W.H.M.; Adan, I.J.B.F.; Buitenhek, R.; Houtum, van G.J.J.A.N.

    2000-01-01

    In this paper, we present a capacity analysis of an automated transportation system in a flexible assembly factory. The transportation system, together with the workstations, is modeled as a network of queues with multiple job classes. Due to its complex nature, the steady-state behavior of this

  7. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  8. Automation of the Analysis and Classification of the Line Material

    Directory of Open Access Journals (Sweden)

    A. A. Machuev

    2011-03-01

    Full Text Available This work is devoted to automating the process of analyzing and verifying various data presentation formats, for which special software has been developed. The special software was developed and tested on the example of files with typical extensions whose structural features are known in advance.

  9. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  10. An extended chemical analysis of gallstone

    OpenAIRE

    Chandran, P.; Kuchhal, N. K.; Garg, P.; Pundir, C. S.

    2007-01-01

    The chemical composition of gallstones is essential for understanding the aetiopathogenesis of gallstone disease. We have previously reported a quantitative chemical analysis of total cholesterol, bilirubin, calcium, iron, and inorganic phosphate in 120 gallstones from Haryana. Our aim was to extend this chemical analysis of gallstones by studying more cases and by analyzing more chemical constituents. A quantitative chemical analysis of total cholesterol, total bilirubin, fatty acids, triglycerides, phospholipids, bile acids, soluble prot...

  11. IMAGE CONSTRUCTION TO AUTOMATION OF PROJECTIVE TECHNIQUES FOR PSYCHOPHYSIOLOGICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Natalia Pavlova

    2018-04-01

    Full Text Available This article presents a solution for automating the assessment, in psychological analysis, of drawings that a person creates from an available set of templates. It will allow disorders of a person's mentality to be revealed more effectively. In particular, such a solution can be used for work with children, who possess developed figurative thinking but are not yet capable of stating their thoughts and experiences accurately. To automate testing using a projective method, we construct an interactive environment for visualization of compositions of several images and then analyse

  12. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  13. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  14. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
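
    Step (2) of the pipeline, generating user-defined band frequency waveforms, amounts to bandpass filtering each channel. The following Python sketch shows one conventional way to do this with SciPy; the band edges, filter order, and sampling rate are illustrative assumptions, not values from the paper.

```python
# Extract a user-defined frequency band from an EEG trace with a zero-phase
# Butterworth bandpass filter. All numeric parameters below are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal: np.ndarray, fs: float, lo: float, hi: float, order: int = 4):
    """Return the signal restricted to the [lo, hi] Hz band."""
    nyq = fs / 2.0
    b, a = butter(order, [lo / nyq, hi / nyq], btype="band")
    return filtfilt(b, a, signal)  # filtfilt applies the filter with zero phase shift

fs = 1000.0                       # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
eeg = np.random.randn(t.size)     # stand-in for a recorded trace
gamma = bandpass(eeg, fs, 30.0, 80.0)
```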

  15. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    The present application of optimum design appears to be restricted to components of the structure rather than to the total structural system. Since design normally involves many analyses of the system, any improvement in the efficiency of the basic methods of analysis will allow more complicated systems to be designed by optimum methods. The evaluation of the risk and reliability of a structural system can be extremely important. Reliability studies have been made of many non-structural systems for which the individual components have been extensively tested and the service environment is known. For such systems the reliability studies are valid. For most structural systems, however, the properties of the components can only be estimated, and statistical data associated with the potential loads are often minimal. Also, a potentially critical loading condition may be completely neglected in the study. For these reasons, and because of the problems previously noted with the reliability of both linear and nonlinear analysis computer programs, it appears to be premature to place a significant value on such studies for complex structures. With these comments as background, the purpose of this paper is to discuss the following: the relationship of analysis to design; new methods of analysis; new or improved finite elements; the effect of the minicomputer on structural analysis methods; the use of systems of microprocessors for nonlinear structural analysis; and the role of interacting graphics systems in future analysis and design. This discussion will focus on the impact of new, inexpensive computer hardware on design and analysis methods.

  16. A5: Automated Analysis of Adversarial Android Applications

    Science.gov (United States)

    2014-06-03

    A5: Automated Analysis of Adversarial Android Applications. Timothy Vidas, Jiaqi Tan, Jay Nahata, Chaur Lih Tan, Nicolas Christin ... detecting, on the device itself, that an application is malicious is much more complex without elevated privileges. In other words, given the ... interface via website. Blasing et al. [7] describe another dynamic analysis system for Android. Their system focuses on classifying input applications as

  17. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical method, electrolysis method, chemical method, and spray method, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS system, and applications of the NIPS system

  18. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical method, electrolysis method, chemical method, and spray method, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS system, and applications of the NIPS system.

  19. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  20. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Franke, Daniel; Kikhney, Alexey G.; Svergun, Dmitri I.

    2012-01-01

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  1. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies to gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort

  2. Tank Farm Operations Surveillance Automation Analysis

    International Nuclear Information System (INIS)

    MARQUEZ, D.L.

    2000-01-01

    The Nuclear Operations Project Services identified the need to improve the manual tank farm surveillance data collection, review, distribution, and storage practices often referred to as Operator Rounds. This document provides a feasibility analysis of improving the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisition, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities

  3. Automated optics inspection analysis for NIF

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Laura M., E-mail: kegelmeyer1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Clark, Raelyn; Leach, Richard R.; McGuigan, David; Kamm, Victoria Miller; Potter, Daniel; Salmon, J. Thad; Senecal, Joshua; Conder, Alan; Nostrand, Mike; Whitman, Pamela K. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

    The National Ignition Facility (NIF) is a high-energy laser facility comprised of 192 beamlines that house thousands of optics. These optics guide, amplify and tightly focus light onto a tiny target for fusion ignition research and high energy density physics experiments. The condition of these optics is key to the economic, efficient and maximally energetic performance of the laser. Our goal, and novel achievement, is to find on the optics any imperfections while they are tens of microns in size, track them through time to see if they grow and if so, remove the optic and repair the single site so the entire optic can then be re-installed for further use on the laser. This paper gives an overview of the image analysis used for detecting, measuring, and tracking sites of interest on an optic while it is installed on the beamline via in situ inspection and after it has been removed for maintenance. In this way, the condition of each optic is monitored throughout the optic's lifetime. This overview paper will summarize key algorithms and technical developments for custom image analysis and processing and highlight recent improvements. (Associated papers will include more details on these issues.) We will also discuss the use of OI Analysis for daily operation of the NIF laser and its extension to inspection of NIF targets.

  4. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the content of a sample and its concentration can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of calibrating a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to obtain a graph. It is possible to determine the density of a dark spectral line against the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation, and production of results are done by means of a computer (PC) and a computer program. The signal conditioning circuits have the function of delivering TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
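
    The emulsion-calibration step reduces to a least-squares fit of measured line density against the logarithm of incident intensity, which can then be inverted to read intensities from the microphotometer. A minimal Python sketch of that step, with illustrative data rather than values from the paper:

```python
# Fit measured line density against log10(incident intensity) by least squares,
# then invert the fit to turn microphotometer densities into intensities.
import numpy as np

log_intensity = np.log10([10.0, 25.0, 60.0, 150.0, 400.0])   # known exposures
density = np.array([0.21, 0.48, 0.86, 1.30, 1.72])           # measured densities

# Least-squares straight line: density = slope * log10(I) + offset
slope, offset = np.polyfit(log_intensity, density, deg=1)

def intensity_from_density(d: float) -> float:
    """Invert the calibration to recover incident intensity."""
    return 10 ** ((d - offset) / slope)

print(intensity_from_density(1.0))
```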

  5. Automated optics inspection analysis for NIF

    International Nuclear Information System (INIS)

    Kegelmeyer, Laura M.; Clark, Raelyn; Leach, Richard R.; McGuigan, David; Kamm, Victoria Miller; Potter, Daniel; Salmon, J. Thad; Senecal, Joshua; Conder, Alan; Nostrand, Mike; Whitman, Pamela K.

    2012-01-01

    The National Ignition Facility (NIF) is a high-energy laser facility comprised of 192 beamlines that house thousands of optics. These optics guide, amplify and tightly focus light onto a tiny target for fusion ignition research and high energy density physics experiments. The condition of these optics is key to the economic, efficient and maximally energetic performance of the laser. Our goal, and novel achievement, is to find on the optics any imperfections while they are tens of microns in size, track them through time to see if they grow and if so, remove the optic and repair the single site so the entire optic can then be re-installed for further use on the laser. This paper gives an overview of the image analysis used for detecting, measuring, and tracking sites of interest on an optic while it is installed on the beamline via in situ inspection and after it has been removed for maintenance. In this way, the condition of each optic is monitored throughout the optic's lifetime. This overview paper will summarize key algorithms and technical developments for custom image analysis and processing and highlight recent improvements. (Associated papers will include more details on these issues.) We will also discuss the use of OI Analysis for daily operation of the NIF laser and its extension to inspection of NIF targets.

  6. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  7. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    The rapidly increasing volume of asteroseismic observations on solar-type stars has revealed a need for automated analysis tools. The reason for this is not only that individual analyses of single stars are rather time consuming, but more importantly that these large volumes of observations open...... are calculated in a consistent way. Here we present a set of automated asteroseismic analysis tools. The main engine of this set of tools is an algorithm for modelling the autocovariance spectra of the stellar acoustic spectra, allowing us to measure not only the frequency of maximum power and the large......, radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations, which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity

  8. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.

  9. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    Simola, K.

    2000-01-01

    This paper describes three successive studies on ageing of the protection automation of nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the most critical components from the ageing and safety points of view. The analyses also resulted in suggestions for improving the data collection systems for the purpose of further ageing analyses. (author)

  10. An automated solution enrichment system for uranium analysis

    International Nuclear Information System (INIS)

    Jones, S.A.; Sparks, R.; Sampson, T.; Parker, J.; Horley, E.; Kelly, T.

    1993-01-01

    An automated Solution Enrichment System (SES) for the analysis of uranium and U-235 isotopes in process samples has been developed through a joint effort between Los Alamos National Laboratory and Martin Marietta Energy Systems, Portsmouth Gaseous Diffusion Plant. This device features an advanced robotics system which, in conjunction with stabilized passive gamma-ray and X-ray fluorescence detectors, provides for rapid, non-destructive analyses of process samples for improved special nuclear material accountability and process control

  11. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    International audience; Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety critical software component will not lead to a fault in a more safety critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  12. Discrimination between smiling faces: Human observers vs. automated face analysis.

    Science.gov (United States)

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  13. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  14. Chemical Reactor Automation as a way to Optimize a Laboratory Scale Polymerization Process

    Science.gov (United States)

    Cruz-Campa, Jose L.; Saenz de Buruaga, Isabel; Lopez, Raymundo

    2004-10-01

    The automation of the registration and control of variables involved in a chemical reactor improves the reaction process by making it faster, optimized, and free of human error. The objective of this work is to register and control the variables involved (temperatures, reagent flows, weights, etc.) in an emulsion polymerization reaction. The programs and control algorithms were developed in the G language in LabVIEW®. The designed software is able to send and receive RS-232 encoded data between a personal computer and the devices (pumps, temperature sensors, mixer, balances, and so on). The transduction from digital information to movement or measurement actions of the devices is done by electronic components included in the devices. Once the programs were written and tested, emulsion polymerization reactions were run to validate the system. Moreover, advanced heat-estimation algorithms were implemented in order to determine the heat released by the reaction and to estimate and control chemical variables in-line. All the information obtained from the reaction is stored on the PC, where it is available and ready to use in any commercial data-processing software. This system is now being used in a research center to carry out emulsion polymerizations under efficient, controlled conditions with reproducible results. The experience gained from this project may be used in implementing chemical estimation algorithms at pilot-plant or industrial scale.
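
    The serial link described above is conventional request/response polling over RS-232. As a rough illustration of the PC side, the Python sketch below polls a balance once per second; the port name, baud rate, and command string are assumptions, and the original system was implemented in LabVIEW G rather than Python.

```python
# Poll a serial-attached device (e.g., a balance) and log its replies.
# Port, baud rate, and the command bytes are hypothetical placeholders.
import time

import serial  # pyserial

with serial.Serial("COM3", baudrate=9600, timeout=1.0) as balance:
    for _ in range(10):
        balance.write(b"P\r\n")            # hypothetical "print weight" command
        reply = balance.readline().decode(errors="replace").strip()
        print(time.strftime("%H:%M:%S"), reply)
        time.sleep(1.0)
```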

  15. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    Science.gov (United States)

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
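
    The two parameterizations the scheme emits are standard forms: a modified Arrhenius expression for each reaction rate and a NASA polynomial for each species' thermochemistry. A minimal Python sketch evaluating both, with placeholder coefficients rather than values from the paper:

```python
# Evaluate a modified Arrhenius rate constant and the heat-capacity part of a
# 7-coefficient NASA polynomial. Coefficients below are illustrative only.
import math

R = 8.314462618  # J/(mol K)

def arrhenius(T: float, A: float, n: float, Ea: float) -> float:
    """Modified Arrhenius: k(T) = A * T^n * exp(-Ea / (R T))."""
    return A * T**n * math.exp(-Ea / (R * T))

def cp_over_R(T: float, a: list) -> float:
    """NASA-7 polynomial: cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4."""
    return a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4

print(arrhenius(1200.0, A=1.0e13, n=0.5, Ea=1.5e5))
print(cp_over_R(1200.0, [3.3, 8.0e-4, -1.0e-7, 2.0e-11, -1.0e-15]))
```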

  16. A Parallel Multiblock Structured Grid Method with Automated Interblocked Unstructured Grids for Chemically Reacting Flows

    Science.gov (United States)

    Spiegel, Seth Christian

    An automated method for using unstructured grids to patch non-C0 interfaces between structured blocks has been developed in conjunction with a finite-volume method for solving chemically reacting flows on unstructured grids. Although the standalone unstructured solver, FVFLO-NCSU, is capable of resolving flows for high-speed aeropropulsion devices with complex geometries, unstructured-mesh algorithms are inherently inefficient when compared to their structured counterparts. However, the advantages of structured algorithms in developing a flow solution in a timely manner can be negated by the amount of time required to develop a mesh for complex geometries. The global domain can be split up into numerous smaller blocks during the grid-generation process to alleviate some of the difficulties in creating these complex meshes. An even greater abatement can be found by allowing the nodes on abutting block interfaces to be nonmatching or non-C0 continuous. One code capable of solving chemically reacting flows on these multiblock grids is VULCAN, which uses a nonconservative approach for patching non-C0 block interfaces. The developed automated unstructured-grid patching algorithm has been installed within VULCAN to provide it the capability of a fully conservative approach for patching non-C0 block interfaces. Additionally, the FVFLO-NCSU solver algorithms have been deeply intertwined with the VULCAN source code to solve chemically reacting flows on these unstructured patches. Finally, the CGNS software library was added to the VULCAN postprocessor so structured and unstructured data can be stored in a single compact file. This final upgrade to VULCAN has been successfully installed and verified using test cases with particular interest towards those involving grids with non-C0 block interfaces.

  17. Managing expectations: assessment of chemistry databases generated by automated extraction of chemical structures from patents.

    Science.gov (United States)

    Senger, Stefan; Bartek, Luca; Papadatos, George; Gaulton, Anna

    2015-12-01

    First public disclosure of new chemical entities often takes place in patents, which makes them an important source of information. However, with an ever increasing number of patent applications, manual processing and curation on such a large scale becomes even more challenging. An alternative approach better suited for this large corpus of documents is the automated extraction of chemical structures. A number of patent chemistry databases generated by using the latter approach are now available but little is known that can help to manage expectations when using them. This study aims to address this by comparing two such freely available sources, SureChEMBL and IBM SIIP (IBM Strategic Intellectual Property Insight Platform), with manually curated commercial databases. When looking at the percentage of chemical structures successfully extracted from a set of patents, using SciFinder as our reference, 59 and 51 % were also found in our comparison in SureChEMBL and IBM SIIP, respectively. When performing this comparison with compounds as starting point, i.e. establishing if for a list of compounds the databases provide the links between chemical structures and patents they appear in, we obtained similar results. SureChEMBL and IBM SIIP found 62 and 59 %, respectively, of the compound-patent pairs obtained from Reaxys. In our comparison of automatically generated vs. manually curated patent chemistry databases, the former successfully provided approximately 60 % of links between chemical structure and patents. It needs to be stressed that only a very limited number of patents and compound-patent pairs were used for our comparison. Nevertheless, our results will hopefully help to manage expectations of users of patent chemistry databases of this type and provide a useful framework for more studies like ours as well as guide future developments of the workflows used for the automated extraction of chemical structures from patents. The challenges we have encountered

  18. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Full Text Available Introduction: The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, the laboratory currently has modern equipment which provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides concordant and discordant with results obtained using fully automated procedures. Materials and method: From January to July 2013, 1,000 slides were analyzed for hematological parameters. Automated analysis was performed on last-generation equipment, whose methodology is based on electrical impedance and which is able to quantify all the figurative elements of the blood in a universe of 22 parameters. The microscopy was performed by two experts in microscopy simultaneously. Results: The data showed that only 42.70% were concordant, compared with 57.30% discordant. The main findings among the discordant were: changes in red blood cells 43.70% (n = 250), white blood cells 38.46% (n = 220), and platelet counts 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained because they have not been investigated, which may compromise the final diagnosis. Conclusion: It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, causing a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  19. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    Energy Technology Data Exchange (ETDEWEB)

    Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McBay, Eddy H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-30

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  20. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of the microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of the retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for the determination of the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded with a 3D OCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement, and calculation of central retinal vessel equivalents. The mean vessel recognition rate was 78%, the vessel class designation rate 75%, and reproducibility between 0.78 and 0.91. The mean A/V ratio was 0.84. Application to a healthy normative cohort showed high congruence with previously published manual methods. Processing time per image was one minute. Quantitative geometrical assessment of the retinal vasculature may be performed in a semi-automated manner using dedicated software tools. Yielding reproducible numerical data within a short time, this may contribute additional value to mere morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
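
    Step three, the calculation of central retinal vessel equivalents, combines individual branch widths into a single trunk equivalent per vessel class, from which the A/V ratio follows. The sketch below uses the pairwise Knudtson revised formulas (factors 0.88 for arterioles, 0.95 for venules) as an assumed combination rule, since the abstract does not state which formula the tool implements; the widths are illustrative.

```python
# Combine measured branch widths into central retinal artery/vein equivalents
# and take their ratio. The 0.88/0.95 pairing constants are an assumption
# (Knudtson revised formulas), not necessarily what the described tool uses.
import math

def vessel_equivalent(widths, c):
    """Iteratively pair the narrowest with the widest branch until one
    trunk equivalent remains: W = c * sqrt(wa^2 + wb^2)."""
    w = sorted(widths)
    while len(w) > 1:
        a, b = w.pop(0), w.pop(-1)          # narrowest and widest remaining
        w.append(c * math.sqrt(a * a + b * b))
        w.sort()
    return w[0]

arteries = [92.0, 88.0, 75.0, 70.0, 66.0, 61.0]   # widths in microns (example)
veins = [110.0, 104.0, 96.0, 90.0, 84.0, 78.0]

crae = vessel_equivalent(arteries, 0.88)
crve = vessel_equivalent(veins, 0.95)
print("A/V ratio:", crae / crve)
```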

  1. IsoMS: automated processing of LC-MS data generated by a chemical isotope labeling metabolomics platform.

    Science.gov (United States)

    Zhou, Ruokun; Tseng, Chiao-Li; Huan, Tao; Li, Liang

    2014-05-20

    A chemical isotope labeling or isotope coded derivatization (ICD) metabolomics platform uses a chemical derivatization method to introduce a mass tag to all of the metabolites having a common functional group (e.g., amine), followed by LC-MS analysis of the labeled metabolites. To apply this platform to metabolomics studies involving quantitative analysis of different groups of samples, automated data processing is required. Herein, we report a data processing method based on the use of a mass spectral feature unique to the chemical labeling approach, i.e., any differential-isotope-labeled metabolites are detected as peak pairs with a fixed mass difference in a mass spectrum. A software tool, IsoMS, has been developed to process the raw data generated from one or multiple LC-MS runs by peak picking, peak pairing, peak-pair filtering, and peak-pair intensity ratio calculation. The same peak pairs detected from multiple samples are then aligned to produce a CSV file that contains the metabolite information and peak ratios relative to a control (e.g., a pooled sample). This file can be readily exported for further data and statistical analysis, which is illustrated in an example of comparing the metabolomes of human urine samples collected before and after drinking coffee. To demonstrate that this method is reliable for data processing, five (13)C2-/(12)C2-dansyl labeled metabolite standards were analyzed by LC-MS. IsoMS was able to detect these metabolites correctly. In addition, in the analysis of a (13)C2-/(12)C2-dansyl labeled human urine, IsoMS detected 2044 peak pairs, and manual inspection of these peak pairs found 90 false peak pairs, representing a false positive rate of 4.4%. IsoMS for Windows running R is freely available for noncommercial use from www.mycompoundid.org/IsoMS.
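
    The peak-pairing step exploits the fixed mass spacing between light- and heavy-labeled forms of the same metabolite. A minimal Python sketch of that idea follows; the spacing assumes a 13C2/12C2 dansyl tag on singly charged ions, and the tolerance and spectrum values are illustrative, not from the paper.

```python
# Scan a picked peak list for pairs whose m/z values differ by the fixed
# light/heavy label spacing, as in isotope-labeling peak pairing.
DELTA = 2.00671   # Da, 2 x (13C - 12C), assuming singly charged ions
TOL = 0.005       # Da, matching tolerance (assumed)

def find_peak_pairs(peaks):
    """peaks: list of (mz, intensity) tuples; returns (light, heavy) pairs."""
    peaks = sorted(peaks)
    pairs = []
    for i, (mz_l, int_l) in enumerate(peaks):
        for mz_h, int_h in peaks[i + 1:]:
            diff = mz_h - mz_l
            if diff > DELTA + TOL:
                break                        # list is sorted; no match further on
            if abs(diff - DELTA) <= TOL:
                pairs.append(((mz_l, int_l), (mz_h, int_h)))
    return pairs

spectrum = [(300.1200, 5.0e4), (302.1267, 4.8e4), (410.2000, 1.2e4)]
for light, heavy in find_peak_pairs(spectrum):
    print("pair:", light, heavy, "ratio:", light[1] / heavy[1])
```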

  2. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    Tobin, K.W.; Karnowski, T.P.; Gleason, S.S.; Jensen, D.; Lakhani, F.

    1999-01-01

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. copyright 1999 American Vacuum Society

  3. Automated Tracking of Cell Migration with Rapid Data Analysis.

    Science.gov (United States)

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
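
    Two of the statistics listed, persistence and the forward migration index (FMI), are simple functions of a tracked trajectory. A minimal Python sketch, assuming (x, y) coordinates per frame and a stimulus axis along +x; the track data are illustrative, not output from the described GUI.

```python
# Compute persistence and forward migration index from a cell trajectory.
import math

def path_length(track):
    """Total distance traveled along the trajectory."""
    return sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))

def persistence(track):
    """Net displacement divided by total path length (0..1)."""
    return math.dist(track[0], track[-1]) / path_length(track)

def fmi_x(track):
    """Forward migration index: net movement along +x per unit path length."""
    return (track[-1][0] - track[0][0]) / path_length(track)

track = [(0.0, 0.0), (3.0, 1.0), (5.0, -1.0), (9.0, 0.5)]  # (x, y) per frame
print(persistence(track), fmi_x(track))
```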

  4. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  5. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

    Full Text Available Abstract. Background: Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large amounts of microarray datasets are now available in several databases, and new experiments are constantly being performed. In spite of this fact, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results: To address these problems we have developed an automated microarray data analysis (AMDA) software, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps, performing a full data analysis, including image analysis, quality controls, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally, a LaTeX document is dynamically generated depending on the performed analysis steps. The generated report contains comments and analysis results as well as references to several files for a deeper investigation. Conclusion: AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded in the Services/Bioinformatics section of the Genopolis http://www.genopolis.it/

  6. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
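
    IFDOTMETER itself is a Java application; as a rough sketch of the batch workflow the abstract describes (previously defined parameters, unsupervised analysis of many images, results written to a spreadsheet), here is a minimal Python analogue for counting bright puncta. The folder layout, threshold and minimum-area values are hypothetical.

    ```python
    # Batch puncta counting with fixed, predefined parameters (illustrative).
    import csv
    import glob
    import numpy as np
    from scipy import ndimage
    import imageio.v3 as iio

    THRESHOLD = 120   # intensity threshold, hypothetical predefined parameter
    MIN_AREA = 4      # discard spots smaller than 4 pixels, hypothetical

    with open("results.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["image", "puncta_count", "mean_area_px"])
        for path in sorted(glob.glob("images/*.tif")):  # hypothetical folder
            img = iio.imread(path)
            if img.ndim == 3:               # collapse RGB to a single channel
                img = img.mean(axis=2)
            mask = img > THRESHOLD
            labels, n = ndimage.label(mask)  # connected bright spots
            areas = ndimage.sum(mask, labels, index=range(1, n + 1))
            kept = areas[areas >= MIN_AREA]
            writer.writerow([path, kept.size,
                             float(kept.mean()) if kept.size else 0.0])
    ```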

  7. Assessment of Automated Data Analysis Application on VVER Steam Generator Tubing

    International Nuclear Information System (INIS)

    Picek, E.; Barilar, D.

    2006-01-01

    INETEC - Institute for Nuclear Technology has developed a software package named EddyOne, which includes an option for automated analysis of bobbin coil eddy current data. During its development and site use, some features were noticed that prevent the wide use of automatic analysis on VVER SG data. This article discusses these specific problems and evaluates possible solutions. With regard to the current state of automated analysis technology, an overview of the advantages and disadvantages of automated analysis on VVER SG is summarized as well. (author)

  8. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
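
    The preprogrammed spike addition determines a matrix-matched calibration and an overall measurement efficiency that doubles as a self-diagnostic. A minimal sketch of the underlying standard-addition arithmetic, with hypothetical count rates and spike activity (only the 0.495 mL sample volume is taken from the abstract):

    ```python
    # Standard-addition (spike) calibration arithmetic; values are hypothetical
    # except the 0.495 mL sample volume quoted in the abstract.
    sample_cps = 12.0    # net count rate of the unspiked sample aliquot
    spiked_cps = 45.0    # net count rate of the matrix-matched spiked aliquot
    spike_bq = 60.0      # known activity of the added 99Tc spike, in Bq
    volume_ml = 0.495    # sample volume, from the abstract

    # Overall measurement efficiency (cps per Bq); this same number serves as
    # the self-diagnostic for the separation and detector function.
    efficiency = (spiked_cps - sample_cps) / spike_bq

    # Matrix-matched concentration of the original sample
    conc_bq_per_ml = sample_cps / efficiency / volume_ml
    print(f"efficiency = {efficiency:.3f} cps/Bq; 99Tc = {conc_bq_per_ml:.1f} Bq/mL")
    ```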

  9. Postprocessing algorithm for automated analysis of pelvic intraoperative neuromonitoring signals

    Directory of Open Access Journals (Sweden)

    Wegner Celine

    2016-09-01

    Full Text Available Two dimensional pelvic intraoperative neuromonitoring (pIONM®) is based on electric stimulation of autonomic nerves under observation of electromyography of the internal anal sphincter (IAS) and manometry of the urinary bladder. The method provides nerve identification and verification of its functional integrity. pIONM® is currently gaining increased attention in times where preservation of function is becoming more and more important. Ongoing technical and methodological developments in experimental and clinical settings require further analysis of the obtained signals. This work describes a postprocessing algorithm for pIONM® signals, developed for automated analysis of huge amounts of recorded data. The analysis routine includes a graphical representation of the recorded signals in the time and frequency domain, as well as a quantitative evaluation by means of features calculated from the time and frequency domain. The produced plots are summarized automatically in a PowerPoint presentation. The calculated features are filled into a standardized Excel sheet, ready for statistical analysis.
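
    As a minimal sketch of the kind of time- and frequency-domain features such a routine can compute per recording (the abstract does not name the exact feature set), the following Python fragment derives RMS, peak-to-peak amplitude and dominant frequency and appends them to a CSV file standing in for the standardized spreadsheet. The sampling rate and file names are hypothetical.

    ```python
    # Time- and frequency-domain features for one recorded trace (sketch).
    import csv
    import numpy as np

    FS = 2000.0                        # sampling rate in Hz, hypothetical
    signal = np.loadtxt("trace.txt")   # one stimulation response, hypothetical

    # Time-domain features
    rms = float(np.sqrt(np.mean(signal ** 2)))
    peak_to_peak = float(signal.max() - signal.min())

    # Frequency-domain feature: dominant frequency of the one-sided spectrum
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    dominant_hz = float(freqs[spectrum.argmax()])

    # Append one row per recording, standing in for the standardized sheet
    with open("features.csv", "a", newline="") as out:
        csv.writer(out).writerow(["trace.txt", rms, peak_to_peak, dominant_hz])
    ```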

  10. Automated gamma spectrometry and data analysis on radiometric neutron dosimeters

    International Nuclear Information System (INIS)

    Matsumoto, W.Y.

    1983-01-01

    An automated gamma-ray spectrometry system was designed and implemented by the Westinghouse Hanford Company at the Hanford Engineering Development Laboratory (HEDL) to analyze radiometric neutron dosimeters. Unattended, automatic, 24 hour/day, 7 day/week operation with online data analysis and mainframe-computer compatible magnetic tape output are system features. The system was used to analyze most of the 4000-plus radiometric monitors (RM's) from extensive reactor characterization tests during startup and initial operation of the Fast Flux Test Facility (FFTF). The FFTF, operated by HEDL for the Department of Energy, incorporates a 400 MW(th) sodium-cooled fast reactor. Automated system hardware consists of a high purity germanium detector, a computerized multichannel analyzer data acquisition system (Nuclear Data, Inc. Model 6620) with two dual 2.5 Mbyte magnetic disk drives plus two 10.5 inch reel magnetic tape units for mass storage of programs/data and an automated Sample Changer-Positioner (ASC-P) run with a programmable controller. The ASC-P has a 200 sample capacity and 12 calibrated counting (analysis) positions ranging from 6 inches (15 cm) to more than 20 feet (6.1 m) from the detector. The system software was programmed in Fortran at HEDL, except for the Nuclear Data, Inc. Peak Search and Analysis Program and Disk Operating System (MIDAS+)

  11. Oak ridge national laboratory automated clean chemistry for bulk analysis of environmental swipe samples

    Energy Technology Data Exchange (ETDEWEB)

    Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples—currently performed by overburdened chemists—with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods led to a reduction in the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in a series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with 233U and 244Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs

  12. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    OpenAIRE

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2012-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...

  13. Molecular activation analysis for chemical speciation studies

    International Nuclear Information System (INIS)

    Chai-Chifang

    1998-01-01

    The term Molecular Activation Analysis (MAA) refers to an activation analysis method that can provide information about the chemical species of elements in systems of interest, although a formal definition has yet to be assigned. Its development is strongly stimulated by the urgent need to know the chemical species of elements, because total concentrations are often meaningless when assessing the health or environmental risks of trace elements. In practice, MAA is a combination of conventional instrumental or radiochemical activation analysis with physical, chemical or biochemical separation techniques, and it can play a particular role in speciation studies. The critical point in MAA, however, is that the original chemical species of the elements in the system must not be changed, or any change must be kept under control; at the same time, no 'new artifact' species originally absent from the system may be formed. Some practical examples of MAA for chemical species research performed recently in our laboratory are presented: chemical species of platinum group elements in sediment; of iodine in marine algae; of mercury in human tissues; of selenium in corn; and of rare earth elements in natural plants. The merits and limitations of MAA are described as well. (author)

  14. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been in continuous operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets of the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  15. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, process engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root-cause analysis effort. These delays waste valuable resources that could be spent working on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  16. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    A new approach has been developed for the semiconductor industry; however, as with most new technologies, its applicability extends to many other areas as well, including environmental, pharmaceutical, clinical and industrial chemical processing. This instrumental system represents a fundamentally new approach. Sample preparation has been integrated as a key system element to enable automation of the instrument system. It has long been believed that an automated, fully integrated system was not feasible if a powerful MS system were included. This application demonstrates one of the first fully automated and integrated sample preparation and mass spectrometric analysis systems applied to practical use. The system is also a broad and ambitious mass-based analyzer, capable not only of elemental but also of direct speciated analysis. The complete analytical suite covering inorganic, organic, organo-metallic and speciated analytes is being applied for critical contamination control of semiconductor processes. As with other new-paradigm technologies, it will now extend from its current use into other applications needing real-time, fully automated multi-component analysis. Refs. 4 (author)

  17. Automated analysis of prerecorded evoked electromyographic activity from rat muscle.

    Science.gov (United States)

    Basarab-Horwath, I; Dewhurst, D G; Dixon, R; Meehan, A S; Odusanya, S

    1989-03-01

    An automated microprocessor-based data acquisition and analysis system has been developed specifically to quantify electromyographic (EMG) activity induced by the convulsant agent catechol in the anaesthetized rat. The stimulus and EMG response are recorded on magnetic tape. On playback, the stimulus triggers a digital oscilloscope and, via interface circuitry, a BBC B microcomputer. The myoelectric activity is digitized by the oscilloscope before being transferred under computer control via an RS232 link to the microcomputer. This system overcomes the problems of dealing with signals of variable latency and allows quantification of latency, amplitude, area and frequency of occurrence of specific components within the signal. The captured data can be used to generate either single or superimposed high-resolution graphic reproductions of the original waveforms. Although this system has been designed for a specific application, it could easily be modified to allow analysis of any complex waveform.
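
    As a minimal sketch of how latency, amplitude and area can be quantified on a stimulus-triggered sweep of this kind (the exact algorithms are not given in the abstract), consider the following Python fragment. The sampling rate, threshold and file name are hypothetical.

    ```python
    # Latency / amplitude / area of a stimulus-triggered EMG sweep (sketch).
    import numpy as np

    FS = 10000.0                     # samples per second, hypothetical
    THRESH = 0.1                     # response threshold in mV, hypothetical
    sweep = np.loadtxt("sweep.txt")  # digitized response, t = 0 at stimulus

    above = np.flatnonzero(np.abs(sweep) > THRESH)
    if above.size:
        latency_ms = above[0] / FS * 1e3                    # first crossing
        amplitude = float(np.abs(sweep).max())              # peak amplitude
        area = float(np.trapz(np.abs(sweep), dx=1.0 / FS))  # rectified area
        print(f"latency {latency_ms:.2f} ms, amplitude {amplitude:.3f} mV, "
              f"area {area:.5f} mV*s")
    else:
        print("no response above threshold")
    ```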

  18. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed

  19. Automated analysis of organic particles using cluster SIMS

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Greg; Zeissler, Cindy; Mahoney, Christine; Lindstrom, Abigail; Fletcher, Robert; Chi, Peter; Verkouteren, Jennifer; Bright, David; Lareau, Richard T.; Boldman, Mike

    2004-06-15

    Cluster primary ion bombardment combined with secondary ion imaging is used on an ion microscope secondary ion mass spectrometer for the spatially resolved analysis of organic particles on various surfaces. Compared to the use of monoatomic primary ion beam bombardment, the use of a cluster primary ion beam (SF{sub 5}{sup +} or C{sub 8}{sup -}) provides significant improvement in molecular ion yields and a reduction in beam-induced degradation of the analyte molecules. These characteristics of cluster bombardment, along with automated sample stage control and custom image analysis software are utilized to rapidly characterize the spatial distribution of trace explosive particles, narcotics and inkjet-printed microarrays on a variety of surfaces.

  20. Automated rice leaf disease detection using color image analysis

    Science.gov (United States)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, which is usually done as a manual eyeball exercise, is important to come up with good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from a rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. Upon obtaining the outlier, it is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. Then, these clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
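
    The two stages named in the abstract (histogram intersection against a healthy reference, then threshold-based clustering of the outlier region) can be sketched compactly. The following Python fragment is illustrative only; the file names, colour-distance threshold and number of clusters are hypothetical, and real leaf images would need alignment and calibration first.

    ```python
    # Histogram intersection vs a healthy reference, then clustering of the
    # outlier pixels (illustrative; files and thresholds are hypothetical).
    import numpy as np
    from scipy.cluster.vq import kmeans2

    test = np.load("test_leaf.npy")      # H x W x 3 RGB array, hypothetical
    healthy = np.load("healthy_leaf.npy")

    def colour_hist(img, bins=16):
        h, _ = np.histogramdd(img.reshape(-1, 3).astype(float),
                              bins=(bins,) * 3, range=[(0, 256)] * 3)
        return h / h.sum()

    # Histogram intersection: 1.0 means identical colour distributions
    similarity = np.minimum(colour_hist(test), colour_hist(healthy)).sum()
    print(f"histogram intersection = {similarity:.3f}")

    # Crude outlier region: pixels far from the healthy mean colour
    mean_colour = healthy.reshape(-1, 3).mean(axis=0)
    dist = np.linalg.norm(test.astype(float) - mean_colour, axis=2)
    outlier_px = test[dist > 60].astype(float)    # hypothetical threshold

    # Threshold-based grouping of outliers into candidate lesion colours
    if len(outlier_px) >= 3:
        centres, _ = kmeans2(outlier_px, k=3, seed=1, minit="++")
        print("candidate lesion colours (RGB):\n", centres.round(1))
    ```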

  1. Automated analysis of radiation damage on plastic surfaces; Analisis automatizado de danos por radiacion en superficies plasticas

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, C; Camacho M, E; Tavera, L; Balcazar, M [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1990-02-15

    Radiation-induced damage is analyzed in a polymer characterized by better optical surface-polish properties, uniformity and chemical resistance than acrylic; it withstands temperatures of up to 150 degrees Celsius and weighs roughly half as much as glass. The objective of this work is to develop a method for the automated analysis of radiation-induced surface damage in plastic materials by means of an image analyzer. (Author)

  2. Automated analysis of invadopodia dynamics in live cells

    Directory of Open Access Journals (Sweden)

    Matthew E. Berginski

    2014-07-01

    Full Text Available Multiple cell types form specialized protein complexes that are used by the cell to actively degrade the surrounding extracellular matrix. These structures are called podosomes or invadopodia and collectively referred to as invadosomes. Due to their potential importance in both healthy physiology as well as in pathological conditions such as cancer, the characterization of these structures has been of increasing interest. Following early descriptions of invadopodia, assays were developed which labelled the matrix underneath metastatic cancer cells allowing for the assessment of invadopodia activity in motile cells. However, characterization of invadopodia using these methods has traditionally been done manually with time-consuming and potentially biased quantification methods, limiting the number of experiments and the quantity of data that can be analysed. We have developed a system to automate the segmentation, tracking and quantification of invadopodia in time-lapse fluorescence image sets at both the single invadopodia level and whole cell level. We rigorously tested the ability of the method to detect changes in invadopodia formation and dynamics through the use of well-characterized small molecule inhibitors, with known effects on invadopodia. Our results demonstrate the ability of this analysis method to quantify changes in invadopodia formation from live cell imaging data in a high throughput, automated manner.

  3. Automated and unbiased classification of chemical profiles from fungi using high performance liquid chromatography

    DEFF Research Database (Denmark)

    Hansen, Michael Edberg; Andersen, Birgitte; Smedsgaard, Jørn

    2005-01-01

    In this paper we present a method for unbiased/unsupervised classification and identification of closely related fungi, using chemical analysis of secondary metabolite profiles created by HPLC with UV diode array detection. For two chromatographic data matrices a vector of locally aligned full sp...

  4. Chemical analysis of the Fornax Dwarf galaxy

    NARCIS (Netherlands)

    Letarte, Bruno

    2007-01-01

    This thesis is entitled “Chemical Analysis of the Fornax Dwarf Galaxy”, and its main goal is to determine which chemical elements are present in the stars of this galaxy, in order to understand its evolution. Galaxies are not “static” objects: they move, form stars and can interact with

  5. Chemical analysis of reactor and commercial columbium

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    The methods cover the chemical analysis of reactor and commercial columbium having chemical compositions within specified limits. The following analytical procedures are discussed along with apparatus, reagents, photometric practice, safety precautions, sampling, and rounding calculated values: nitrogen, by distillation (photometric) method; molybdenum and tungsten by the dithiol (photometric) method; iron by the 1,10-phenanthroline (photometric) method

  6. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
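
    The core idea (propagating known input uncertainties to an output uncertainty) can be illustrated with a generic Monte Carlo sketch. This is not the FRAP codes' actual method, which the abstract does not specify; the toy gap-conductance model and all numbers below are hypothetical.

    ```python
    # Generic Monte Carlo propagation of input uncertainties to an output
    # (illustrative only; not the FRAP codes' actual uncertainty method).
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Hypothetical uncertain inputs: gap conductance and linear heat rate
    h_gap = rng.normal(5700.0, 570.0, N)  # W/(m^2 K), 10% relative std dev
    q_lin = rng.normal(25.0, 1.25, N)     # kW/m, 5% relative std dev

    # Toy model: temperature drop across the fuel-to-cladding gap
    radius = 0.0047                       # m, hypothetical fuel rod radius
    delta_t = (q_lin * 1e3) / (2 * np.pi * radius * h_gap)

    print(f"gap dT = {delta_t.mean():.1f} K +/- {delta_t.std():.1f} K (1 sigma)")
    ```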

  7. Optical MEMS for chemical analysis and biomedicine

    CERN Document Server

    Jiang, Hongrui

    2016-01-01

    This book describes the current state of optical MEMS in chemical and biomedical analysis and brings together current trends and highlights topics representing the most exciting progress in recent years in the field.

  8. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  9. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    Science.gov (United States)

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  10. Chemical analysis of water in hydrogeology

    International Nuclear Information System (INIS)

    Flakova, R.; Zenisova, Z.; Seman, M.

    2010-01-01

    The aim of the monograph is to give complete information on the chemical analysis of water in hydrogeology, not only for students of the Geology programme (Bachelor's level), Engineering Geology and Hydrogeology (Master's level) and Engineering Geology (doctoral level), but also for students at other colleges and universities in Slovakia and the Czech Republic who deal with the chemical composition of water and its quality from different perspectives. It will also benefit professionals in hydrogeological, water and environmental practice, who will find in it all the necessary information, from proper water sampling and the units used in the chemical analysis of water, through the correct expression of the chemical composition of water in its various parameters and the classification of the chemical composition of water, up to the basic features of physical chemistry in thermodynamic calculations and hydrogeochemical modelling.

  11. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems

  12. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab

  13. Crowdsourcing and Automated Retinal Image Analysis for Diabetic Retinopathy.

    Science.gov (United States)

    Mudie, Lucy I; Wang, Xueyang; Friedman, David S; Brady, Christopher J

    2017-09-23

    As the number of people with diabetic retinopathy (DR) in the USA is expected to increase threefold by 2050, the need to reduce health care costs associated with screening for this treatable disease is ever present. Crowdsourcing and automated retinal image analysis (ARIA) are two areas where new technology has been applied to reduce costs in screening for DR. This paper reviews the current literature surrounding these new technologies. Crowdsourcing has high sensitivity for normal vs abnormal images; however, when multiple categories for severity of DR are added, specificity is reduced. ARIAs have higher sensitivity and specificity, and some commercial ARIA programs are already in use. Deep learning enhanced ARIAs appear to offer even more improvement in ARIA grading accuracy. The utilization of crowdsourcing and ARIAs may be a key to reducing the time and cost burden of processing images from DR screening.

  14. Automated uranium analysis by delayed-neutron counting

    International Nuclear Information System (INIS)

    Kunzendorf, H.; Loevborg, L.; Christiansen, E.M.

    1980-10-01

    Automated uranium analysis by fission-induced delayed-neutron counting is described. A short description is given of the instrumentation, including the transfer system, process control, irradiation and counting sites, and computer operations. Characteristic parameters of the facility (sample preparations, background, and standards) are discussed. A sensitivity of 817 ± 22 counts per 10^-6 g U is found using irradiation, delay, and counting times of 20 s, 5 s, and 10 s, respectively. Precision is generally less than 1% for normal geological samples. Critical level and detection limits for 7.5 g samples are 8 and 16 ppb, respectively. The importance of some physical and elemental interferences is outlined. Dead-time corrections of measured count rates are necessary, and a polynomial expression is used for count rates up to 10^5. The presence of rare earth elements is regarded as the most important elemental interference. A typical application is given and other areas of application are described. (author)
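
    Two quantitative details in the abstract lend themselves to a short worked sketch: the polynomial dead-time correction of measured count rates and the conversion of net delayed-neutron counts to uranium mass via the quoted sensitivity of 817 counts per 10^-6 g U. The polynomial coefficients below are hypothetical placeholders, not the facility's calibration.

    ```python
    # Dead-time-corrected delayed-neutron counting arithmetic (sketch).
    import numpy as np

    SENSITIVITY = 817.0   # counts per microgram U (from the abstract)
    COUNT_TIME = 10.0     # s counting window (from the abstract)

    # Hypothetical dead-time polynomial: true = c2*m**2 + c1*m + c0,
    # applied to measured rates m up to ~1e5 counts per second
    DT_COEFFS = [2e-9, 1.0, 0.0]   # highest power first, for np.polyval

    def uranium_micrograms(measured_cps):
        true_cps = np.polyval(DT_COEFFS, measured_cps)  # dead-time correction
        return true_cps * COUNT_TIME / SENSITIVITY

    print(f"{uranium_micrograms(4085.0):.2f} micrograms U")  # ~50 ug U
    ```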

  15. Analysis of Automated Aircraft Conflict Resolution and Weather Avoidance

    Science.gov (United States)

    Love, John F.; Chan, William N.; Lee, Chu Han

    2009-01-01

    This paper describes an analysis of using trajectory-based automation to resolve both aircraft and weather constraints for near-term air traffic management decision making. The auto resolution algorithm developed and tested at NASA-Ames to resolve aircraft-to-aircraft conflicts has been modified to mitigate convective weather constraints. Modifications include adding information about the size of a gap between weather constraints to the routing solution. Routes that traverse gaps smaller than a specific size are not used. The performance of the modified autoresolver in resolving both aircraft conflicts and weather constraints was evaluated. Integration with the Center-TRACON Traffic Management System was completed to evaluate the effect of weather routing on schedule delays.

  16. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  17. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    Tran Viet Phu; Tran Hoai Nam; Akio Yamamoto; Tomohiro Endo

    2015-01-01

    This paper presents the development of an automated generation of a new burnup chain for reactor analysis applications. The JENDL FP Decay Data File 2011 and Fission Yields Data File 2011 were used as the data sources. The nuclides in the new chain are determined by thresholds on the half-life and cumulative yield of fission products, or from a given list. Then, decay modes, branching ratios and fission yields are recalculated taking into account intermediate reactions. The new burnup chain is output according to the format for the SRAC code system. Verification was performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference one with 193 fission products used in SRAC. Further development and applications are being planned with the burnup chain code. (author)
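
    The selection step described here (thresholds on half-life and cumulative yield, with short-lived intermediates removed and decays rerouted past them) can be sketched in miniature. The nuclide data, thresholds and rerouting rule in the following Python fragment are illustrative simplifications, not the JENDL-based algorithm of the paper.

    ```python
    # Sketch of burnup-chain reduction: select fission products by half-life
    # and cumulative-yield thresholds, then reroute decays so that removed
    # short-lived intermediates are skipped. Data/thresholds are illustrative.

    HALF_LIFE_MIN_S = 60.0
    YIELD_MIN = 1e-4

    # nuclide: (half-life [s], cumulative yield, decay daughter)
    data = {
        "Te135": (19.0,   0.032, "I135"),   # too short-lived: dropped
        "I135":  (2.37e4, 0.063, "Xe135"),
        "Xe135": (3.29e4, 0.065, "Cs135"),
        "Cs135": (7.2e13, 0.065, None),
    }

    kept = {n for n, (t12, y, _) in data.items()
            if t12 >= HALF_LIFE_MIN_S and y >= YIELD_MIN}

    def effective_daughter(nuclide):
        """Follow the decay path until it reaches a kept nuclide (or ends)."""
        d = data[nuclide][2]
        while d is not None and d not in kept:
            d = data[d][2]
        return d

    chain = {n: effective_daughter(n) for n in kept}
    print(chain)  # e.g. {'I135': 'Xe135', 'Xe135': 'Cs135', 'Cs135': None}
    ```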

  18. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    Tran, Viet-Phu; Tran, Hoai-Nam; Yamamoto, Akio; Endo, Tomohiro

    2017-01-01

    This paper presents the development of an automated generation of burnup chain for reactor analysis applications. Algorithms are proposed to reevaluate decay modes, branching ratios and effective fission product (FP) cumulative yields of a given list of important FPs taking into account intermediate reactions. A new burnup chain is generated using the updated data sources taken from the JENDL FP decay data file 2011 and Fission yields data file 2011. The new burnup chain is output according to the format for the SRAC code system. Verification has been performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference one with 193 fission products used in SRAC. Burnup calculations using the new burnup chain have also been performed based on UO_2 and MOX fuel pin cells and compared with a reference chain th2cm6fp193bp6T.

  19. Automated generation of burnup chain for reactor analysis applications

    Energy Technology Data Exchange (ETDEWEB)

    Tran, Viet-Phu [VINATOM, Hanoi (Viet Nam). Inst. for Nuclear Science and Technology; Tran, Hoai-Nam [Duy Tan Univ., Da Nang (Viet Nam). Inst. of Research and Development; Yamamoto, Akio; Endo, Tomohiro [Nagoya Univ., Nagoya-shi (Japan). Dept. of Materials, Physics and Energy Engineering

    2017-05-15

    This paper presents the development of an automated generation of burnup chain for reactor analysis applications. Algorithms are proposed to reevaluate decay modes, branching ratios and effective fission product (FP) cumulative yields of a given list of important FPs taking into account intermediate reactions. A new burnup chain is generated using the updated data sources taken from the JENDL FP decay data file 2011 and Fission yields data file 2011. The new burnup chain is output according to the format for the SRAC code system. Verification has been performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference one with 193 fission products used in SRAC. Burnup calculations using the new burnup chain have also been performed based on UO{sub 2} and MOX fuel pin cells and compared with a reference chain th2cm6fp193bp6T.

  20. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    A practical guide to the methods in general use for the complete analysis of silicate rock material and for the determination of all those elements present in major, minor or trace amounts in silicate...

  1. Elemental misinterpretation in automated analysis of LIBS spectra.

    Science.gov (United States)

    Hübert, Waldemar; Ankerhold, Georg

    2011-07-01

    In this work, the Stark effect is shown to be mainly responsible for wrong elemental allocation by automated laser-induced breakdown spectroscopy (LIBS) software solutions. Due to broadening and shift of an elemental emission line affected by the Stark effect, its measured spectral position might interfere with the line position of several other elements. The micro-plasma is generated by focusing a frequency-doubled 200 mJ pulsed Nd:YAG laser on an aluminum target, and furthermore on a brass sample, in air at atmospheric pressure. After laser pulse excitation, we have measured the temporal evolution of the Al(II) ion line at 281.6 nm (4s(1)S-3p(1)P) during the decay of the laser-induced plasma. Depending on laser pulse power, the center of the measured line is red-shifted by 130 pm (490 GHz) with respect to the exact line position. In this case, the well-known spectral line positions of two moderate and strong lines of other elements coincide with the actual shifted position of the Al(II) line. Consequently, a time-resolving software analysis can lead to an elemental misinterpretation. To avoid a wrong interpretation of LIBS spectra in automated analysis software for a given LIBS system, we recommend using larger gate delays, incorporating Stark broadening parameters, and using a range of tolerance which is non-symmetric around the measured line center. These suggestions may help to improve time-resolving LIBS software, promising a smaller probability of wrong elemental identification and making LIBS more attractive for industrial applications.
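
    The recommended non-symmetric tolerance range reduces to a small matching rule: a line can only have been red-shifted by the Stark effect, so the search window around a measured peak should extend further below the measured position than above it. A Python sketch follows; the window widths and the two competing lines are hypothetical, and only the Al II 281.618 nm line and its ~130 pm shift come from the abstract.

    ```python
    # Line matching with a non-symmetric tolerance window (sketch).
    # Only the Al II line position is from the abstract; the competing lines
    # and the window widths are hypothetical.
    LINES = [("Al II", 281.618), ("Mn I", 281.70), ("Cr II", 281.45)]

    BLUE_TOL = 0.03  # nm of slack on the short-wavelength side
    RED_TOL = 0.15   # nm of slack on the long-wavelength side (Stark red-shift)

    def candidates(measured_nm):
        """Database lines that could appear at measured_nm.

        A true line at L can only be red-shifted, so it may show up anywhere
        in [L, L + RED_TOL]; equivalently, L must lie within
        [measured_nm - RED_TOL, measured_nm + BLUE_TOL].
        """
        return [(el, pos) for el, pos in LINES
                if measured_nm - RED_TOL <= pos <= measured_nm + BLUE_TOL]

    # A peak measured at 281.75 nm (Al II 281.618 nm red-shifted by ~130 pm)
    print(candidates(281.75))  # Al II is retained despite the shift
    ```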

  2. galaxieEST: addressing EST identity through automated phylogenetic analysis.

    Science.gov (United States)

    Nilsson, R Henrik; Rajashekar, Balaji; Larsson, Karl-Henrik; Ursing, Björn M

    2004-07-05

    Research involving expressed sequence tags (ESTs) is intricately coupled to the existence of large, well-annotated sequence repositories. Comparatively complete and satisfactory annotated public sequence libraries are, however, available only for a limited range of organisms, rendering the absence of sequences and gene structure information a tangible problem for those working with taxa lacking an EST or genome sequencing project. Paralogous genes belonging to the same gene family but distinguished by derived characteristics are particularly prone to misidentification and erroneous annotation; high but incomplete levels of sequence similarity are typically difficult to interpret and have formed the basis of many unsubstantiated assumptions of orthology. In these cases, a phylogenetic study of the query sequence together with the most similar sequences in the database may be of great value to the identification process. In order to facilitate this laborious procedure, a project to employ automated phylogenetic analysis in the identification of ESTs was initiated. galaxieEST is an open source Perl-CGI script package designed to complement traditional similarity-based identification of EST sequences through employment of automated phylogenetic analysis. It uses a series of BLAST runs as a sieve to retrieve nucleotide and protein sequences for inclusion in neighbour joining and parsimony analyses; the output includes the BLAST output, the results of the phylogenetic analyses, and the corresponding multiple alignments. galaxieEST is available as an on-line web service for identification of fungal ESTs and for download / local installation for use with any organism group at http://galaxie.cgb.ki.se/galaxieEST.html. By addressing sequence relatedness in addition to similarity, galaxieEST provides an integrative view on EST origin and identity, which may prove particularly useful in cases where similarity searches return one or more pertinent, but not full, matches and

  3. Utilization of chemical derivatives in activation analysis

    International Nuclear Information System (INIS)

    Ehmann, W.D.

    1990-01-01

    Derivative activation analysis (DAA) is a method to enhance the sensitivity of nuclear activation analysis for the more elusive elements. It may also allow a degree of chemical speciation for the element of interest. DAA uses a preirradiation chemical reaction on the sample to initiate the formation of, or an exchange with, a chemical complex which contains a surrogate element, M. As a result, the amount of the element or the chemical species to be determined, X, is now represented by measurement of the amount of the surrogate element, M, that is made part of, or released by the complex species. The surrogate element is selected for its superior properties for nuclear activation analysis and the absence of interference reaction in its final determination by instrumental neutron activation analysis (INAA) after some preconcentration or separation chemistry. Published DAA studies have been limited to neutron activation analysis. DAA can offer the analyst some important advantages. It can determine elements, functional groups, or chemical species which cannot be determined directly by INAA, fast neutron activation analysis (FNAA), prompt gamma neutron activation analysis (PGNAA), or charged particle activation analysis (CPAA) procedures. When compared with conventional RNAA, there are fewer precautions with respect to handling of intensely radioactive samples, since the chemistry is done before the irradiation. The preirradiation chemistry may also eliminate many interferences that might occur in INAA and, through use of an appropriate surrogate element, can place the analytical gamma-ray line in an interference-free region of the gamma-ray spectrum

  4. Automating X-ray Fluorescence Analysis for Rapid Astrobiology Surveys.

    Science.gov (United States)

    Thompson, David R; Flannery, David T; Lanka, Ravi; Allwood, Abigail C; Bue, Brian D; Clark, Benton C; Elam, W Timothy; Estlin, Tara A; Hodyss, Robert P; Hurowitz, Joel A; Liu, Yang; Wade, Lawrence A

    2015-11-01

    A new generation of planetary rover instruments, such as PIXL (Planetary Instrument for X-ray Lithochemistry) and SHERLOC (Scanning Habitable Environments with Raman Luminescence for Organics and Chemicals) selected for the Mars 2020 mission rover payload, aim to map mineralogical and elemental composition in situ at microscopic scales. These instruments will produce large spectral cubes with thousands of channels acquired over thousands of spatial locations, a large potential science yield limited mainly by the time required to acquire a measurement after placement. A secondary bottleneck also faces mission planners after downlink; analysts must interpret the complex data products quickly to inform tactical planning for the next command cycle. This study demonstrates operational approaches to overcome these bottlenecks by specialized early-stage science data processing. Onboard, simple real-time systems can perform a basic compositional assessment, recognizing specific features of interest and optimizing sensor integration time to characterize anomalies. On the ground, statistically motivated visualization can make raw uncalibrated data products more interpretable for tactical decision making. Techniques such as manifold dimensionality reduction can help operators comprehend large databases at a glance, identifying trends and anomalies in data. These onboard and ground-side analyses can complement a quantitative interpretation. We evaluate system performance for the case study of PIXL, an X-ray fluorescence spectrometer. Experiments on three representative samples demonstrate improved methods for onboard and ground-side automation and illustrate new astrobiological science capabilities unavailable in previous planetary instruments. Key Words: Dimensionality reduction; Planetary science; Visualization.
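
    The ground-side visualization idea (manifold or other dimensionality reduction to make a large spectral database browsable at a glance) can be illustrated with a plain PCA over the flattened spectral cube. A minimal numpy sketch; the cube file, its shape and the anomaly heuristic are hypothetical:

    ```python
    # PCA embedding of a spectral cube for quick visual triage (sketch).
    import numpy as np

    cube = np.load("xrf_cube.npy")      # (rows, cols, channels), hypothetical
    X = cube.reshape(-1, cube.shape[-1]).astype(float)
    X -= X.mean(axis=0)                 # center each spectral channel

    # Top-2 principal components via thin SVD
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:2].T               # (pixels, 2) embedding for plotting

    # Pixels far from the bulk of the point cloud are candidate anomalies
    dist = np.linalg.norm(scores - scores.mean(axis=0), axis=1)
    anomalies = np.argsort(dist)[-10:]  # ten most unusual spectra
    print("anomalous pixel indices:", anomalies)
    ```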

  5. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. 1261.413 Section 1261.413 Aeronautics and Space NATIONAL...) § 1261.413 Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. The...

  6. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor); Lane, Arthur L. (Inventor)

    2018-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  7. Extended automated separation techniques in destructive neutron activation analysis; application to various biological materials, including human tissues and blood

    International Nuclear Information System (INIS)

    Tjioe, P.S.; Goeij, J.J.M. de; Houtman, J.P.W.

    1976-09-01

    Neutron activation analysis may be performed as a multi-element and low-level technique for many important trace elements in biological materials, provided that post-irradiation chemical separations are applied. This paper describes a chemical separation consisting of automated procedures for destruction, distillation, and anion-chromatography. The system developed enables the determination of 14 trace elements in biological materials, viz. antimony, arsenic, bromine, cadmium, chromium, cobalt, copper, gold, iron, mercury, molybdenum, nickel, selenium, and zinc. The aspects of sample preparation, neutron irradiation, gamma-spectrum evaluation, and blank-value contribution are also discussed

  8. Automated Image Analysis of Offshore Infrastructure Marine Biofouling

    Directory of Open Access Journals (Sweden)

    Kate Gormley

    2018-01-01

    Full Text Available In the UK, some of the oldest oil and gas installations have been in the water for over 40 years and have considerable colonisation by marine organisms, which may lead to both industry challenges and/or potential biodiversity benefits (e.g., artificial reefs). The project objective was to test the use of an automated image analysis software (CoralNet) on images of marine biofouling from offshore platforms on the UK continental shelf, with the aim of (i) training the software to identify the main marine biofouling organisms on UK platforms; (ii) testing the software performance on 3 platforms under 3 different analysis criteria (methods A–C); (iii) calculating the percentage cover of marine biofouling organisms and (iv) providing recommendations to industry. Following software training with 857 images, and testing of three platforms, results showed that diversity of the three platforms ranged from low (in the central North Sea) to moderate (in the northern North Sea). The two central North Sea platforms were dominated by the plumose anemone Metridium dianthus; and the northern North Sea platform showed less obvious species domination. Three different analysis criteria were created, where the method of selection of points, number of points assessed and confidence level thresholds (CT) varied: (method A) random selection of 20 points with CT 80%, (method B) stratified random selection of 50 points with CT of 90% and (method C) a grid approach of 100 points with CT of 90%. Performed across the three platforms, the results showed that there were no significant differences across the majority of species and comparison pairs. No significant difference (across all species) was noted between confirmed annotation methods (A, B and C). It was considered that the software performed well for the classification of the main fouling species in the North Sea. Overall, the study showed that the use of automated image analysis software may enable a more efficient and consistent
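
    The three criteria differ only in how annotation points are drawn and which confidence threshold (CT) is applied before a point counts toward percentage cover. A Python sketch of a method-C-style grid with CT 90% follows; the image size, label set and simulated classifier outputs are hypothetical stand-ins for CoralNet annotations.

    ```python
    # Percent cover from confirmed point annotations (sketch; the methods
    # differ only in point selection and confidence threshold).
    import numpy as np

    rng = np.random.default_rng(7)
    H, W = 1080, 1920                  # image size, hypothetical

    # Method C style: a regular 10 x 10 grid of annotation points
    ys = np.linspace(0, H - 1, 10, dtype=int)
    xs = np.linspace(0, W - 1, 10, dtype=int)
    points = [(y, x) for y in ys for x in xs]          # 100 points

    # Stand-in classifier output at each point: label + confidence
    labels = rng.choice(["Metridium dianthus", "bare", "mussel"],
                        size=len(points), p=[0.6, 0.3, 0.1])
    conf = rng.uniform(0.5, 1.0, size=len(points))

    CT = 0.90                          # confidence threshold for method C
    confirmed = labels[conf >= CT]     # unconfident points go to a human
    cover = {lab: round(100.0 * float(np.mean(confirmed == lab)), 1)
             for lab in np.unique(confirmed)}
    print(f"{confirmed.size}/{len(points)} points confirmed; cover %: {cover}")
    ```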

  9. Automated Communications Analysis System using Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Foltz, Peter W

    2006-01-01

    ... and during the debriefing process to assess knowledge proficiency. In this report, the contractor describes prior research on communication analysis and how it can inform assessment of individual and team cognitive processing...

  10. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    Belman-Flores, J.M.; Riesco-Avila, J.M.; Gallegos-Munoz, A.; Navarro-Esbri, J.; Aceves, S.M.

    2009-01-01

    We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
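
    The count of additional equations equals the number of independent loops in the cycle diagram, i.e. the circuit rank E - V + C of the underlying graph (E edges/flows, V nodes/components, C connected components). A Python sketch on a hypothetical two-stage cycle with a vapor-injection branch, using networkx:

    ```python
    # Independent loops of a cycle diagram = circuit rank E - V + C (sketch).
    import networkx as nx

    # Hypothetical two-stage cycle: components as nodes, flows as edges
    G = nx.Graph()
    G.add_edges_from([
        ("compressor_lo", "intercooler"), ("intercooler", "compressor_hi"),
        ("compressor_hi", "condenser"),   ("condenser", "valve_hi"),
        ("valve_hi", "flash_tank"),       ("flash_tank", "valve_lo"),
        ("valve_lo", "evaporator"),       ("evaporator", "compressor_lo"),
        ("flash_tank", "intercooler"),    # vapor-injection branch adds a loop
    ])

    E, V = G.number_of_edges(), G.number_of_nodes()
    C = nx.number_connected_components(G)
    print("circuit rank:", E - V + C)             # extra equations needed
    print("independent loops:", len(nx.cycle_basis(G)))
    ```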

  11. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-01-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique
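
    Tukey's ''twicing'' mentioned above is a general recipe: smooth the data, then smooth the residuals and add the result back. A minimal sketch, using a moving average as a stand-in for whatever smoother the original analysis used:

```python
import numpy as np

def smooth(y, w=5):
    """Moving-average smoother; any linear smoother could be substituted."""
    return np.convolve(np.asarray(y, float), np.ones(w) / w, mode="same")

def twice(y, w=5):
    """Tukey twicing: smooth, then smooth the residuals and add them back."""
    y = np.asarray(y, float)
    s = smooth(y, w)
    return s + smooth(y - s, w)

# e.g., despiking a noisy centroid-position trace sampled along the streak
centroid = 100 + np.random.randn(500).cumsum() * 0.1
despiked = twice(centroid, w=11)
```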

  12. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-11-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique

  13. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, a software package has been developed to automate the post-counting tasks in comparative INAA; it aims to be more flexible than the available options, integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis and an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
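
    Of the four statistical tools named, the weighted average and the normalized-residuals check are easy to sketch. The following is a minimal illustration (not the program's actual code) of combining replicate concentration results x with uncertainties s; the Rajeval technique is omitted:

```python
import numpy as np

def combine(x, s):
    """Unweighted/weighted means of replicates plus normalized residuals.

    x: concentration results; s: their 1-sigma uncertainties (len >= 2).
    """
    x, s = np.asarray(x, float), np.asarray(s, float)
    w = 1.0 / s**2
    mean_u = x.mean()                       # unweighted average
    mean_w = np.sum(w * x) / np.sum(w)      # inverse-variance weighted average
    sigma_w = np.sqrt(1.0 / np.sum(w))      # uncertainty of the weighted mean
    # normalized residuals flag results inconsistent with the weighted mean
    r = (x - mean_w) / np.sqrt(s**2 - sigma_w**2)
    return mean_u, mean_w, sigma_w, r

print(combine([10.2, 9.8, 10.9], [0.3, 0.4, 0.6]))
```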

  14. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented, along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as for making its use more secure.

  15. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-09-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented, along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as for making its use more secure.

  16. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns, each corresponding to a different known thickness. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two-beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction patterns.
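
    Once disks are detected and the coordinate systems are unified, the matching step reduces to scoring the experimental pattern against each simulated thickness. A minimal sketch of that final step, using zero-mean normalized cross-correlation as an assumed similarity measure (the paper's exact metric may differ):

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two aligned patterns."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

def estimate_thickness(experimental, simulated, thicknesses_nm):
    """Return the thickness whose simulated pattern matches the data best."""
    scores = [ncc(experimental, s) for s in simulated]
    best = int(np.argmax(scores))
    return thicknesses_nm[best], scores[best]
```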

  17. A fully automated fast analysis system for capillary gas chromatography. Part 1. Automation of system control

    NARCIS (Netherlands)

    Snijders, H.M.J.; Rijks, J.P.E.M.; Bombeeck, A.J.; Rijks, J.A.; Sandra, P.; Lee, M.L.

    1992-01-01

    This paper deals with the design, automation and evaluation of a high-speed capillary gas chromatographic system. A combination of software and hardware was developed for a new cold trap/reinjection device that allows selective solvent elimination and on-column sample enrichment.

  18. Space Environment Automated Alerts and Anomaly Analysis Assistant (SEA^5) for NASA

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a comprehensive analysis and dissemination system (Space Environment Automated Alerts & Anomaly Analysis Assistant: SEA^5) that will...

  19. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
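
    As a rough illustration of the first step (finding high-density locations as local extrema of the particle distribution), here is a minimal sketch using SciPy; the neighborhood size and noise floor are invented parameters, and the real framework additionally builds the minimum-spanning-tree lifetime diagram:

```python
import numpy as np
from scipy import ndimage

def density_maxima(density, size=5, floor=0.0):
    """Indices of local maxima of a (space-time) particle-density array.

    A cell is kept if it equals the maximum over its local neighborhood
    and exceeds a noise floor.
    """
    is_peak = ndimage.maximum_filter(density, size=size) == density
    return np.argwhere(is_peak & (density > floor))
```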

  20. GWATCH: a web platform for automated gene association discovery analysis

    Science.gov (United States)

    2014-01-01

    Background As genome-wide sequence analyses for complex human disease determinants are expanding, it is increasingly necessary to develop strategies to promote discovery and validation of potential disease-gene associations. Findings Here we present a dynamic web-based platform – GWATCH – that automates and facilitates four steps in genetic epidemiological discovery: 1) Rapid gene association search and discovery analysis of large genome-wide datasets; 2) Expanded visual display of gene associations for genome-wide variants (SNPs, indels, CNVs), including Manhattan plots, 2D and 3D snapshots of any gene region, and a dynamic genome browser illustrating gene association chromosomal regions; 3) Real-time validation/replication of candidate or putative genes suggested from other sources, limiting Bonferroni genome-wide association study (GWAS) penalties; 4) Open data release and sharing by eliminating privacy constraints (The National Human Genome Research Institute (NHGRI) Institutional Review Board (IRB), informed consent, The Health Insurance Portability and Accountability Act (HIPAA) of 1996 etc.) on unabridged results, which allows for open access comparative and meta-analysis. Conclusions GWATCH is suitable for both GWAS and whole genome sequence association datasets. We illustrate the utility of GWATCH with three large genome-wide association studies for HIV-AIDS resistance genes screened in large multicenter cohorts; however, association datasets from any study can be uploaded and analyzed by GWATCH. PMID:25374661

  1. Automated forensic DNA purification optimized for FTA card punches and identifiler STR-based PCR analysis.

    Science.gov (United States)

    Tack, Lois C; Thomas, Michelle; Reich, Karl

    2007-03-01

    Forensic labs globally face the same problem: a growing need to process a greater number and wider variety of samples for DNA analysis. The same forensic lab can be tasked all at once with processing mixed casework samples from crime scenes, convicted offender samples for database entry, and tissue from tsunami victims for identification. Besides flexibility in the robotic system chosen for forensic automation, there is a need, for each sample type, to develop new methodology that is not only faster but also more reliable than past procedures. FTA is a chemical treatment of paper, unique to Whatman Bioscience, used for the stabilization and storage of biological samples. Here, the authors describe optimization of the Whatman FTA Purification Kit protocol for use with the AmpFlSTR Identifiler PCR Amplification Kit.

  2. Chemical analysis of high purity graphite

    International Nuclear Information System (INIS)

    1993-03-01

    The Sub-Committee on Chemical Analysis of Graphite was organized in April 1989, under the Committee on Chemical Analysis of Nuclear Fuels and Reactor Materials, JAERI. The Sub-Committee carried out collaborative analyses among eleven participating laboratories for the certification of the Certified Reference Materials (CRMs), JAERI-G5 and G6, after developing and evaluating analytical methods during the period of September 1989 to March 1992. The certified values were given for ash, boron and silicon in the CRM based on the collaborative analysis. The values for ten elements (Al, Ca, Cr, Fe, Mg, Mo, Ni, Sr, Ti, V) were not certified, but given for information. Preparation, homogeneity testing and chemical analyses for certification of reference materials were described in this paper. (author) 52 refs

  3. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  4. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. 40 CFR 13.19 (2010-07-01 edition), Protection of Environment. (a) The Administrator may periodically...

  5. Molecular activation analysis for chemical species studies

    International Nuclear Information System (INIS)

    Chai Zhifang; Mao Xueying; Wang Yuqi; Sun Jingxin; Qian Qingfang; Hou Xiaolin; Zhang Peiqun; Chen Chunying; Feng Weiyu; Ding Wenjun; Li Xiaolin; Li Chunsheng; Dai Xiongxin

    2001-01-01

    Molecular Activation Analysis (MAA) mainly refers to an activation analysis method that is able to provide information about the chemical species of elements in systems of interest, though its exact definition remains to be settled. Its development is strongly stimulated by the urgent need to know the chemical species of elements, because bulk contents or concentrations alone are often insufficient for judging the biological, environmental or geochemical effects of elements. In this paper, the features, methodology and limitations of MAA are outlined. Furthermore, up-to-date MAA progress made in our laboratory is introduced as well. (author)

  6. Improved automated analysis of radon (222Rn) and thoron (220Rn) in natural waters.

    Science.gov (United States)

    Dimova, Natasha; Burnett, William C; Lane-Smith, Derek

    2009-11-15

    Natural radon (222Rn) and thoron (220Rn) can be used as tracers of various chemical and physical processes in the environment. We present here results from an extended series of laboratory experiments intended to improve the automated analysis of 222Rn and 220Rn in water using a modified RAD AQUA (Durridge Inc.) system. Previous experience with similar equipment showed that it takes about 30-40 min for the system to equilibrate to radon-in-water concentration increases, and even longer for the response to return to baseline after a sharp spike. While the original water/gas exchanger setup was built only for radon-in-water measurement, our goal here is to provide an automated system capable of high resolution and good sensitivity for both radon- and thoron-in-water detection. We found that faster water flow rates substantially improved the response for both isotopes, while thoron is detected most efficiently at airflow rates of 3 L/min. Our results show that the optimum conditions for fastest response and sensitivity for both isotopes are water flow rates up to 17 L/min and an airflow rate of 3 L/min through the detector. Applications for such measurements include prospecting for naturally occurring radioactive material (NORM) in pipelines and locating points of groundwater/surface water interaction.

  7. Chemical analysis by nuclear methods. v. 2

    International Nuclear Information System (INIS)

    Alfassi, Z.B.

    1998-01-01

    'Chemical Analysis by Nuclear Methods' is the work of renowned authors in the fields of nuclear chemistry and radiochemistry, compiled by Alfassi, Z.B. and translated into a Farsi version collected in two volumes. The second volume consists of the following chapters: the detection of ion recoil scattering and elastic scattering is dealt with in the eleventh chapter; the twelfth chapter is devoted to nuclear reaction analysis using charged particles; X-ray emission is discussed in the thirteenth chapter; the fourteenth chapter is about the use of ion microprobes; X-ray fluorescence analysis is discussed in the fifteenth chapter; alpha, beta and gamma ray scattering in chemical analysis is dealt with in chapter sixteen; Moessbauer spectroscopy and positron annihilation are discussed in chapters seventeen and eighteen. The last two chapters are about isotope dilution analysis and radioimmunoassay.

  8. An extended chemical analysis of gallstone.

    Science.gov (United States)

    Chandran, P; Kuchhal, N K; Garg, P; Pundir, C S

    2007-09-01

    Chemical composition of gallstones is essential for the aetiopathogenesis of gallstone disease. We have previously reported quantitative chemical analysis of total cholesterol, bilirubin, calcium, iron and inorganic phosphate in 120 gallstones from Haryana. The aim was to extend this chemical analysis of gallstones by studying more cases and by analyzing more chemical constituents. A quantitative chemical analysis of total cholesterol, total bilirubin, fatty acids, triglycerides, phospholipids, bile acids, soluble proteins, sodium, potassium, magnesium, copper, oxalate and chlorides of biliary calculi (52 cholesterol, 76 mixed and 72 pigment) retrieved from surgical operation of 200 patients from Haryana state was carried out. Total cholesterol as the major component and total bilirubin, phospholipids, triglycerides, bile acids, fatty acids (esterified), soluble protein, calcium, magnesium, iron, copper, sodium, potassium, inorganic phosphate, oxalate and chloride as minor components were found in all types of calculi. The cholesterol stones had higher contents of total cholesterol, phospholipids, fatty acids (esterified), inorganic phosphate and copper compared with mixed and pigment stones. The mixed stones had higher contents of iron and triglycerides than the cholesterol and pigment stones. The pigment stones were richer in total bilirubin, bile acids, calcium, oxalate, magnesium, sodium, potassium, chloride and soluble protein compared with cholesterol and mixed stones. Although total cholesterol was a major component of cholesterol, mixed and pigment gallstones in Haryana, the content of most of the other lipids, cations and anions differed among gallstone types, indicating different mechanisms of formation.

  9. Application of automated image analysis to coal petrography

    Science.gov (United States)

    Chao, E.C.T.; Minkin, J.A.; Thompson, C.L.

    1982-01-01

    The coal petrologist seeks to determine the petrographic characteristics of organic and inorganic coal constituents and their lateral and vertical variations within a single coal bed or different coal beds of a particular coal field. Definitive descriptions of coal characteristics and coal facies provide the basis for interpretation of depositional environments, diagenetic changes, and burial history and determination of the degree of coalification or metamorphism. Numerous coal core or columnar samples must be studied in detail in order to adequately describe and define coal microlithotypes, lithotypes, and lithologic facies and their variations. The large amount of petrographic information required can be obtained rapidly and quantitatively by use of an automated image-analysis system (AIAS). An AIAS can be used to generate quantitative megascopic and microscopic modal analyses for the lithologic units of an entire columnar section of a coal bed. In our scheme for megascopic analysis, distinctive bands 2 mm or more thick are first demarcated by visual inspection. These bands consist of either nearly pure microlithotypes or lithotypes such as vitrite/vitrain or fusite/fusain, or assemblages of microlithotypes. Megascopic analysis with the aid of the AIAS is next performed to determine volume percentages of vitrite, inertite, minerals, and microlithotype mixtures in bands 0.5 to 2 mm thick. The microlithotype mixtures are analyzed microscopically by use of the AIAS to determine their modal composition in terms of maceral and optically observable mineral components. Megascopic and microscopic data are combined to describe the coal unit quantitatively in terms of (V) for vitrite, (E) for liptite, (I) for inertite or fusite, (M) for mineral components other than iron sulfide, (S) for iron sulfide, and (VEIM) for the composition of the mixed phases (Xi), i = 1, 2, etc., in terms of the maceral groups vitrinite V, exinite E, inertinite I, and optically observable mineral components M.
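
    The megascopic/microscopic combination amounts to a volume-fraction weighted sum: each band contributes its fraction of the unit times its modal composition. A minimal sketch with invented numbers (not data from the paper):

```python
import numpy as np

# Composition vectors are volume fractions of [V, E, I, M]
# (vitrinite, exinite, inertinite, optically observable minerals).
bands = {
    # band name: (fraction of the unit, modal composition of the band)
    "vitrite":  (0.55, np.array([0.95, 0.02, 0.02, 0.01])),
    "fusite":   (0.10, np.array([0.05, 0.02, 0.90, 0.03])),
    "mixed_X1": (0.35, np.array([0.60, 0.15, 0.20, 0.05])),
}
unit_composition = sum(f * comp for f, comp in bands.values())
print(unit_composition)  # overall V, E, I, M fractions of the lithologic unit
```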

  10. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg 252 Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from 17 O. Detection sensitivities for 239 Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level
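
    The absolute conversion from photopeak area to concentration follows the textbook activation equation; the sketch below inverts it for the number of target atoms. This is a simplified illustration with invented parameter names: the actual facility additionally uses calculated capture rates, cyclic-counting schemes and delayed-neutron interference corrections not shown here.

```python
import numpy as np

AVOGADRO = 6.02214076e23

def element_mass(counts, sigma_cm2, flux, lam, t_irr, t_decay, t_count,
                 eff, gamma_yield, molar_mass, abundance):
    """Invert counts = N*sigma*phi*S*D*C*eff*I_gamma for element mass (g).

    S, D, C are the usual saturation, decay and counting factors;
    lam is the product decay constant (1/s), flux in n cm^-2 s^-1.
    """
    S = 1.0 - np.exp(-lam * t_irr)            # build-up during irradiation
    D = np.exp(-lam * t_decay)                # decay before counting starts
    C = (1.0 - np.exp(-lam * t_count)) / lam  # decays integrated while counting
    atoms = counts / (sigma_cm2 * flux * S * D * C * eff * gamma_yield)
    return atoms * molar_mass / (AVOGADRO * abundance)
```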

  11. Hepatic fat quantification using automated six-point Dixon: Comparison with conventional chemical shift based sequences and computed tomography.

    Science.gov (United States)

    Shimizu, Kie; Namimoto, Tomohiro; Nakagawa, Masataka; Morita, Kosuke; Oda, Seitaro; Nakaura, Takeshi; Utsunomiya, Daisuke; Yamashita, Yasuyuki

    To compare automated six-point Dixon (6-p-Dixon) MRI with dual-echo chemical-shift imaging (CSI) and CT for hepatic fat fraction in phantoms and a clinical study. Phantoms and fifty-nine patients were examined with both MRI and CT for quantitative fat measurement. In the phantom study, linear regression between fat concentration and 6-p-Dixon showed good agreement. In the clinical study, linear regression between 6-p-Dixon and dual-echo CSI showed good agreement. CT attenuation value was strongly correlated with 6-p-Dixon (R2 = 0.852). 6-p-Dixon and dual-echo CSI correlated accurately with the CT attenuation value of liver parenchyma. 6-p-Dixon has the potential for automated hepatic fat quantification.

  12. Semi-automated analysis of three-dimensional track images

    International Nuclear Information System (INIS)

    Meesen, G.; Poffijn, A.

    2001-01-01

    In the past, three-dimensional (3-d) track images in solid state detectors were difficult to obtain. With the introduction of the confocal scanning laser microscope it is now possible to record 3-d track images in a non-destructive way. These 3-d track images can later be used to measure typical track parameters. Preparing the detectors and recording the 3-d images, however, is only the first step. The second step in this process is enhancing the image quality by means of deconvolution techniques to obtain the maximum possible resolution. The third step is extracting the typical track parameters. This can be done on-screen by an experienced operator. For large sets of data, however, this manual technique is not desirable. This paper will present some techniques to analyse 3-d track data in an automated way by means of image analysis routines. Advanced thresholding techniques guarantee stable results in different recording situations. By using pre-knowledge about the track shape, reliable object identification is obtained. In case of ambiguity, manual intervention is possible
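
    A minimal sketch of automated object identification in such volumes, assuming a simple global threshold (mean + k·sigma) as a stand-in for the paper's advanced thresholding, followed by connected-component labeling of candidate tracks:

```python
import numpy as np
from scipy import ndimage

def label_tracks(volume, k=3.0):
    """Label connected voxel clusters that rise k sigma above the mean."""
    v = np.asarray(volume, float)
    mask = v > v.mean() + k * v.std()
    # default face-connectivity; pass structure=np.ones((3, 3, 3))
    # for 26-connectivity if desired
    labels, n_tracks = ndimage.label(mask)
    return labels, n_tracks
```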

  13. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also an expanding interest in the use of glycomics (for example, in Genome Wide Association Studies) to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  14. Technical and economic viability of automated highway systems : preliminary analysis

    Science.gov (United States)

    1997-01-01

    Technical and economic investigations of automated highway systems (AHS) are addressed. It has generally been accepted that such systems show potential to alleviate urban traffic congestion, so most of the AHS research has been focused instead on tec...

  15. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low-resolution, high-precision instrument was designed and realized in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. The analog-to-digital conversion circuits, the local control microcomputer, the automation systems and the performance checks are described. (authors)

  16. Failure analysis on a chemical waste pipe

    International Nuclear Information System (INIS)

    Ambler, J.R.

    1985-01-01

    A failure analysis of a chemical waste pipe illustrates how nuclear technology can spin off metallurgical consultant services. The pipe, made of zirconium alloy (Zr-2.5 wt percent Nb, UNS 60705), had cracked in several places, all at butt welds. A combination of fractography and metallography indicated delayed hydride cracking

  17. Calibrating Detailed Chemical Analysis of M dwarfs

    Science.gov (United States)

    Veyette, Mark; Muirhead, Philip Steven; Mann, Andrew; Brewer, John; Allard, France; Homeier, Derek

    2018-01-01

    The ability to perform detailed chemical analysis of Sun-like F-, G-, and K-type stars is a powerful tool with many applications, including studying the chemical evolution of the Galaxy, assessing membership in stellar kinematic groups, and constraining planet formation theories. Unfortunately, complications in modeling cooler stellar atmospheres have hindered similar analysis of M-dwarf stars. Large surveys of FGK abundances play an important role in developing methods to measure the compositions of M dwarfs by providing benchmark FGK stars that have widely separated M dwarf companions. These systems allow us to empirically calibrate metallicity-sensitive features in M dwarf spectra. However, current methods to measure metallicity in M dwarfs from moderate-resolution spectra are limited to measuring overall metallicity and largely rely on astrophysical abundance correlations in stellar populations. In this talk, I will discuss how large, homogeneous catalogs of precise FGK abundances are crucial to advancing chemical analysis of M dwarfs beyond overall metallicity to direct measurements of individual elemental abundances. I will present a new method to analyze high-resolution, NIR spectra of M dwarfs that employs an empirical calibration of synthetic M dwarf spectra to infer effective temperature, Fe abundance, and Ti abundance. This work is a step toward detailed chemical analysis of M dwarfs at a precision similar to that achieved for FGK stars.

  18. Automated NMR relaxation dispersion data analysis using NESSY

    Directory of Open Access Journals (Sweden)

    Gooley Paul R

    2011-10-01

    Background: Proteins are dynamic molecules with motions ranging from picoseconds to longer than seconds. Many protein functions, however, appear to occur on the micro- to millisecond timescale, and therefore there has been intense research into the importance of these motions in catalysis and molecular interactions. Nuclear Magnetic Resonance (NMR) relaxation dispersion experiments are used to measure motion of discrete nuclei within the micro- to millisecond timescale. Information about conformational/chemical exchange, populations of exchanging states and chemical shift differences is extracted from these experiments. To ensure these parameters are correctly extracted, accurate and careful analysis of these experiments is necessary. Results: The software introduced in this article is designed for the automatic analysis of relaxation dispersion data and the extraction of the parameters mentioned above. It is written in Python for multi-platform use and high performance. Experimental data can be fitted to different models using the Levenberg-Marquardt minimization algorithm, and different statistical tests can be used to select the best model. To demonstrate the functionality of this program, synthetic data as well as NMR data were analyzed. Analysis of these data, including the generation of plots and color-coded structures, can be performed with minimal user intervention using standard procedures that are included in the program. Conclusions: NESSY is easy-to-use open-source software for the analysis of NMR relaxation data. Its robustness and standard procedures are demonstrated in this article.
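
    For a flavor of the fitting step, here is a minimal sketch (not NESSY's code) that fits one common dispersion model, the fast-exchange Luz-Meiboom expression, with SciPy's curve_fit, which uses Levenberg-Marquardt when no bounds are given; the data points are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def luz_meiboom(v_cpmg, r20, phi, kex):
    """Fast-exchange CPMG model: R2eff as a function of CPMG frequency."""
    return r20 + (phi / kex) * (
        1.0 - (4.0 * v_cpmg / kex) * np.tanh(kex / (4.0 * v_cpmg)))

v = np.array([50., 100., 200., 400., 600., 800., 1000.])      # Hz
r2eff = np.array([18.2, 17.1, 15.0, 12.6, 11.7, 11.3, 11.1])  # 1/s
popt, pcov = curve_fit(luz_meiboom, v, r2eff, p0=[10.0, 1e4, 2e3])
r20, phi, kex = popt  # model selection across several models would follow
```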

  19. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be performed manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also, included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  20. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Background: During the past decade, many software packages have been developed for analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of analysis errors or bugs reported by users. Results: We report here the implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusions: The dChip automation module is a step toward reproducible research, and it can prompt a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  1. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  2. Development of an automated flow injection analysis system for determination of phosphate in nutrient solutions.

    Science.gov (United States)

    Karadağ, Sevinç; Görüşük, Emine M; Çetinkaya, Ebru; Deveci, Seda; Dönmez, Koray B; Uncuoğlu, Emre; Doğu, Mustafa

    2018-01-25

    A fully automated flow injection analysis (FIA) system was developed for determination of phosphate ion in nutrient solutions. This newly developed FIA system is a portable, rapid and sensitive measuring instrument that allows on-line analysis and monitoring of phosphate ion concentration in nutrient solutions. The molybdenum blue method, which is widely used in FIA phosphate analysis, was adapted to the developed FIA system. The method is based on the formation of the ammonium Mo(VI) ion by reaction of ammonium molybdate with the phosphate ion present in the medium. The Mo(VI) ion then reacts with ascorbic acid and is reduced to the spectrometrically measurable Mo(V) ion. New software specific for flow analysis was developed in the LabVIEW development environment to control all the components of the FIA system. The important factors affecting the analytical signal were identified as reagent flow rate, injection volume and post-injection flow path length, and they were optimized using Box-Behnken experimental design and response surface methodology. The optimum point for the maximum analytical signal was calculated as 0.50 mL min-1 reagent flow rate, 100 µL sample injection volume and 60 cm post-injection flow path length. The proposed FIA system had a sampling frequency of 100 samples per hour over a linear working range of 3-100 mg L-1 (R2 = 0.9995). The relative standard deviation (RSD) was 1.09% and the limit of detection (LOD) was 0.34 mg L-1. Various nutrient solutions from a tomato-growing hydroponic greenhouse were analyzed with the developed FIA system and the results were found to be in good agreement with the findings of the vanadomolybdate chemical method. © 2018 Society of Chemical Industry.
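
    As an aside on how figures of merit such as the linear range, R2 and LOD are typically derived from such a calibration, here is a minimal sketch with synthetic readings (the paper's exact procedure is not specified); the LOD uses the common 3.3·sigma/slope convention:

```python
import numpy as np

conc = np.array([3., 10., 25., 50., 75., 100.])               # mg/L standards
signal = np.array([0.013, 0.042, 0.101, 0.204, 0.307, 0.409])  # detector signal

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((signal - pred) ** 2) / (len(conc) - 2))
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)

lod = 3.3 * resid_sd / slope             # detection limit, mg/L
unknown = (0.150 - intercept) / slope    # invert the calibration for a sample
```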

  3. Probabilistic risk analysis in chemical engineering

    International Nuclear Information System (INIS)

    Schmalz, F.

    1991-01-01

    In risk analysis in the chemical industry, recognising potential risks is considered more important than assessing their quantitative extent. Even in assessing risks, emphasis is not on the probability involved but on the possible extent. Qualitative assessment has proved valuable here. Probabilistic methods are used in individual cases where the wide implications make it essential to be able to assess the reliability of safety precautions. In this case, assessment therefore centres on the reliability of technical systems and not on the extent of a chemical risk. 7 figs

  4. Microfabricated Gas Phase Chemical Analysis Systems

    International Nuclear Information System (INIS)

    FRYE-MASON, GREGORY CHARLES; HELLER, EDWIN J.; HIETALA, VINCENT M.; KOTTENSTETTE, RICHARD; LEWIS, PATRICK R.; MANGINELL, RONALD P.; MATZKE, CAROLYN M.; WONG, CHUNGNIN C.

    1999-01-01

    A portable, autonomous, hand-held chemical laboratory (µChemLab™) is being developed for trace detection (ppb) of chemical warfare (CW) agents and explosives in real-world environments containing high concentrations of interfering compounds. Microfabrication is utilized to provide miniature, low-power components that are characterized by rapid, sensitive and selective response. Sensitivity and selectivity are enhanced using two parallel analysis channels, each containing the sequential connection of a front-end sample collector/concentrator, a gas chromatographic (GC) separator, and a surface acoustic wave (SAW) detector. Component design and fabrication and system performance are described

  5. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested

  6. Automated Design and Analysis Tool for CEV Structural and TPS Components, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  7. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This developed...

  8. An Analysis of Automated Solutions for the Certification and Accreditation of Navy Medicine Information Assets

    National Research Council Canada - National Science Library

    Gonzales, Dominic V

    2005-01-01

    ... improve Navy Medicine's current C&A security posture. The primary research reviewed C&A policy and included a comparative analysis of two cutting-edge automated C&A tools, namely Xacta and eMASS...

  9. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  10. Automated computation of autonomous spectral submanifolds for nonlinear modal analysis

    Science.gov (United States)

    Ponsioen, Sten; Pedergnana, Tiemo; Haller, George

    2018-04-01

    We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.

  11. Economic and workflow analysis of a blood bank automated system.

    Science.gov (United States)

    Shin, Kyung-Hwa; Kim, Hyung Hoi; Chang, Chulhun L; Lee, Eun Yup

    2013-07-01

    This study compared the estimated costs and times required for ABO/Rh(D) typing and unexpected antibody screening using an automated system and manual methods. The total cost included direct and labor costs. Labor costs were calculated on the basis of average operator salaries and unit values (minutes), which were the hands-on time required to test one sample. To estimate unit values, workflows were recorded on video, and the time required for each process was analyzed separately. The unit values of ABO/Rh(D) typing using the manual method were 5.65 and 8.1 min during regular and unsocial working hours, respectively. The unit value was less than 3.5 min when several samples were tested simultaneously. The unit value for unexpected antibody screening was 2.6 min. The unit values using the automated method for ABO/Rh(D) typing, unexpected antibody screening, and both simultaneously were all 1.5 min. The total cost of ABO/Rh(D) typing of only one sample using the automated analyzer was lower than that of testing only one sample using the manual technique but higher than that of testing several samples simultaneously. The total cost of unexpected antibody screening using an automated analyzer was less than that using the manual method. ABO/Rh(D) typing using an automated analyzer incurs a lower unit value and cost than the manual technique when only one sample is tested at a time. Unexpected antibody screening using an automated analyzer always incurs a lower unit value and cost than the manual technique.
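
    The cost model described reduces to simple arithmetic: per-test cost = direct cost + salary rate × unit value. A minimal sketch using the unit values quoted above; the direct-cost and salary figures are invented placeholders:

```python
def total_cost(direct, salary_per_min, unit_value_min):
    """Per-test cost: consumables plus hands-on labor."""
    return direct + salary_per_min * unit_value_min

SALARY_PER_MIN = 0.5   # placeholder operator salary rate per minute
manual_single = total_cost(2.0, SALARY_PER_MIN, 5.65)  # manual, one sample
manual_batch  = total_cost(2.0, SALARY_PER_MIN, 3.5)   # manual, batched
automated     = total_cost(3.5, SALARY_PER_MIN, 1.5)   # automated analyzer
```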

  12. Toward Automated Inventory Modeling in Life Cycle Assessment: The Utility of Semantic Data Modeling to Predict Real-World Chemical Production

    Science.gov (United States)

    A set of coupled semantic data models, i.e., ontologies, is presented to advance a methodology towards automated inventory modeling of chemical manufacturing in life cycle assessment. The cradle-to-gate life cycle inventory for chemical manufacturing is a detailed collection of ...

  13. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    Science.gov (United States)

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies, several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results than data analysis performed with menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started, whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches in the European project ACuteTox is illustrated.
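
    To make the Excel-based workflow concrete, here is a minimal sketch of the kind of file-handling program the paper describes, under the assumption of a hypothetical layout with one workbook per compound and columns named "conc" and "response"; every name here is illustrative:

```python
import pathlib
import pandas as pd

def standardize(folder: str, out_csv: str) -> None:
    """Collect per-compound Excel files into one analysis-ready CSV."""
    frames = []
    for path in sorted(pathlib.Path(folder).glob("*.xlsx")):
        df = pd.read_excel(path, sheet_name=0)
        df = df.rename(columns=str.lower)[["conc", "response"]]
        df["compound"] = path.stem          # file name identifies the compound
        frames.append(df)
    pd.concat(frames, ignore_index=True).to_csv(out_csv, index=False)
```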

  14. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as restricted heating and cooling rates, constrain the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  15. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as restricted heating and cooling rates, constrain the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  16. Automated modal parameter estimation using correlation analysis and bootstrap sampling

    Science.gov (United States)

    Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.

    2018-02-01

    The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences by the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
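
    A common starting point for such a correlation step is the modal assurance criterion (MAC) between candidate mode shapes; the paper's metrics extend this idea to cope with spatial aliasing and coalescent modes. A minimal MAC sketch for possibly complex eigenvectors:

```python
import numpy as np

def mac(phi1, phi2):
    """Modal assurance criterion in [0, 1] between two mode shapes."""
    num = abs(np.vdot(phi1, phi2)) ** 2        # vdot conjugates phi1
    den = np.vdot(phi1, phi1).real * np.vdot(phi2, phi2).real
    return num / den
```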

  17. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    Nadinic, B.; Vanjak, Z.

    2004-01-01

    INETEC Institute for Nuclear Technology developed a software package called EddyOne, which offers automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; and conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, showing a clear advantage of the INETEC approach. It should be pointed out that INETEC field experience was also collected on WWER steam generators, which is thus far a unique experience. (author)

  18. Prajna: adding automated reasoning to the visual-analysis process.

    Science.gov (United States)

    Swing, E

    2010-01-01

    Developers who create applications for knowledge representation must contend with challenges in both the abundance of data and the variety of toolkits, architectures, and standards for representing it. Prajna is a flexible Java toolkit designed to overcome these challenges with an extensible architecture that supports both visualization and automated reasoning.

  19. Automated analysis of angle closure from anterior chamber angle images.

    Science.gov (United States)

    Baskaran, Mani; Cheng, Jun; Perera, Shamira A; Tun, Tin A; Liu, Jiang; Aung, Tin

    2014-10-21

    To evaluate a novel software capable of automatically grading angle closure on EyeCam angle images in comparison with manual grading of images, with gonioscopy as the reference standard. In this hospital-based, prospective study, subjects underwent gonioscopy by a single observer, and EyeCam imaging by a different operator. The anterior chamber angle in a quadrant was classified as closed if the posterior trabecular meshwork could not be seen. An eye was classified as having angle closure if there were two or more quadrants of closure. Automated grading of the angle images was performed using customized software. Agreement between the methods was ascertained by κ statistic and comparison of area under receiver operating characteristic curves (AUC). One hundred forty subjects (140 eyes) were included, most of whom were Chinese (102/140, 72.9%) and women (72/140, 51.5%). Angle closure was detected in 61 eyes (43.6%) with gonioscopy in comparison with 59 eyes (42.1%, P = 0.73) using manual grading, and 67 eyes (47.9%, P = 0.24) with automated grading of EyeCam images. The agreement for angle closure diagnosis between gonioscopy and manual grading of EyeCam images was good (κ = 0.88; 95% confidence interval [CI], 0.81-0.96), as was that between gonioscopy and automated grading (κ = 0.74; 95% CI, 0.63-0.85). The AUC for detecting eyes with gonioscopic angle closure was comparable for manual and automated grading (AUC 0.974 vs. 0.954, P = 0.31) of EyeCam images. Customized software for automated grading of EyeCam angle images was found to have good agreement with gonioscopy. Human observation of the EyeCam images may still be needed to avoid gross misclassification, especially in eyes with extensive angle closure. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
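
    For readers unfamiliar with the agreement statistic used above, the following sketch computes Cohen's κ for two binary gradings (1 = angle closure, 0 = open); the ten example gradings are hypothetical, not the study's data.

      import numpy as np

      def cohens_kappa(a, b):
          # Cohen's kappa for two binary graders
          a, b = np.asarray(a), np.asarray(b)
          po = np.mean(a == b)                        # observed agreement
          p_closed = np.mean(a) * np.mean(b)          # chance agreement on "closed"
          p_open = (1 - np.mean(a)) * (1 - np.mean(b))  # chance agreement on "open"
          pe = p_closed + p_open
          return (po - pe) / (1 - pe)

      # Hypothetical example: gonioscopy vs. automated grading over 10 eyes
      gonio = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
      auto  = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1]
      print(round(cohens_kappa(gonio, auto), 2))  # -> 0.6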

  20. Chemical analysis of refractories by plasma spectrometry

    International Nuclear Information System (INIS)

    Coutinho, C.A.

    1990-01-01

    X-ray spectrometry has been, for the last two or three decades, the traditional procedure for the chemical analysis of refractories, owing to its high degree of accuracy and the speed with which it produces analytical results. An interesting alternative to X-ray fluorescence is provided by the inductively coupled plasma spectrometry technique for those laboratories where wet chemistry facilities are already available, where process control is not required at high speed, or where investment costs have to be low. This paper presents results obtained by plasma spectroscopy for the analysis of silico-aluminous refractories, showing calibration curves, precision and detection limits. Considerations and comparisons with X-ray fluorescence are also made. (author) [pt

  1. Automated X-ray image analysis for cargo security: Critical review and future promise.

    Science.gov (United States)

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  2. Chemical detection, identification, and analysis system

    International Nuclear Information System (INIS)

    Morel, R.S.; Gonzales, D.; Mniszewski, S.

    1990-01-01

    The chemical detection, identification, and analysis system (CDIAS) has three major goals. The first is to display safety information regarding chemical environment before personnel entry. The second is to archive personnel exposure to the environment. Third, the system assists users in identifying the stage of a chemical process in progress and suggests safety precautions associated with that process. In addition to these major goals, the system must be sufficiently compact to provide transportability, and it must be extremely simple to use in order to keep user interaction at a minimum. The system created to meet these goals includes several pieces of hardware and the integration of four software packages. The hardware consists of a low-oxygen, carbon monoxide, explosives, and hydrogen sulfide detector; an ion mobility spectrometer for airborne vapor detection; and a COMPAQ 386/20 portable computer. The software modules are a graphics kernel, an expert system shell, a data-base management system, and an interface management system. A supervisory module developed using the interface management system coordinates the interaction of the other software components. The system determines the safety of the environment using conventional data acquisition and analysis techniques. The low-oxygen, carbon monoxide, hydrogen sulfide, explosives, and vapor detectors are monitored for hazardous levels, and warnings are issued accordingly

  3. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  4. SWOT Analysis of Automation for Cash and Accounts Control in Construction

    OpenAIRE

    Mariya Deriy

    2013-01-01

    The possibility of computerizing control over accounting and information-system data on cash and payments in a company's practical activity has been analyzed, provided that the problem of establishing a well-functioning single computer network between the different units of a developing company is solved. The current state of the control organization and the possibility of its automation have been observed. A SWOT analysis of control automation to identify its strengths and weaknesses, obstac...

  5. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  6. Redefining the Practice of Peer Review Through Intelligent Automation-Part 3: Automated Report Analysis and Data Reconciliation.

    Science.gov (United States)

    Reiner, Bruce I

    2018-02-01

    One method for addressing existing peer review limitations is the assignment of peer review cases on a completely blinded basis, in which the peer reviewer would create an independent report which can then be cross-referenced with the primary reader report of record. By leveraging existing computerized data mining techniques, one could in theory automate and objectify the process of report data extraction, classification, and analysis, while reducing time and resource requirements intrinsic to manual peer review report analysis. Once inter-report analysis has been performed, resulting inter-report discrepancies can be presented to the radiologist of record for review, along with the option to directly communicate with the peer reviewer through an electronic data reconciliation tool aimed at collaboratively resolving inter-report discrepancies and improving report accuracy. All associated report and reconciled data could in turn be recorded in a referenceable peer review database, which provides opportunity for context and user-specific education and decision support.

  7. Some comments on misuse of terms related to chemical analysis

    International Nuclear Information System (INIS)

    Steinnes, E.

    2007-01-01

    Complete text of publication follows. I have been involved in scientific studies involving chemical analysis for more than 49 years. Over this period I have observed an increasing tendency toward incorrect use of the terms 'analysis' and 'determination' and the corresponding verb forms. According to correct terminology in English, samples are analyzed; analytes (e.g., trace elements) are determined. However, too often expressions such as 'analysis of copper in blood' are seen in the literature, especially in papers written by non-chemists. The reason why I am raising this point at the present time is that I observed the problem in several recent titles of papers published over the last few years in the Journal of Radioanalytical and Nuclear Chemistry: Preconcentration and neutron activation analysis of thorium and uranium in natural waters. Use of activated carbon as pre-separation agent in NAA of selenium, cobalt and iodine. Recent developments in the analysis of transuranics (Np, Pu, Am) in sea water. Automated radiochemical analysis of total 99 Tc in aged nuclear waste processing streams. Photon activation analysis of carbon in glasses for fiber amplifiers by using the flow method for the rapid separation of 11 C. Preconcentration neutron activation analysis of lanthanides by cloudpoint extraction using PAN. Analysis of the chemical elements in leaves infected by fumagina by X-ray fluorescence technique. Rapid method for 226 Ra and 228 Ra analysis in water samples. The above list is far from exhaustive. I believe that this incorrect use of terminology should be avoided at least in the titles of scientific papers, in the Journal of Radioanalytical and Nuclear Chemistry as well as in other scientific journals. In some of the above cases replacing 'of' with 'for the determination of', or just with 'for', would have solved the problem. In other cases it would be preferable to reverse the order of words in the sentence, such as e.g., 'Determination of selenium, cobalt and

  8. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  9. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  10. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
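
    As one illustration of the listed analyses, a minimal PCA-by-SVD sketch for a (spectra x bins) matrix of binned 1D NMR intensities follows; this is a generic implementation, not Automics code, and the random data are placeholders.

      import numpy as np

      def pca_scores(X, n_components=2):
          # PCA via SVD on a (samples x spectral bins) matrix; returns the
          # sample scores and the explained-variance ratios of the components.
          Xc = X - X.mean(axis=0)                  # mean-center each bin
          U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
          scores = U[:, :n_components] * s[:n_components]
          evr = (s ** 2) / np.sum(s ** 2)
          return scores, evr[:n_components]

      # Placeholder data: 20 spectra, 250 bins
      rng = np.random.default_rng(0)
      X = rng.normal(size=(20, 250))
      scores, evr = pca_scores(X)
      print(scores.shape, evr)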

  11. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.

  12. Automated handling for SAF batch furnace and chemistry analysis operations

    International Nuclear Information System (INIS)

    Bowen, W.W.; Sherrell, D.L.; Wiemers, M.J.

    1981-01-01

    The Secure Automated Fabrication Program is developing a remotely operated breeder reactor fuel pin fabrication line. The equipment will be installed in the Fuels and Materials Examination Facility being constructed at Hanford, Washington. Production is scheduled to start in mid-1986. The application of small pneumatically operated industrial robots for loading and unloading product into and out of batch furnaces and for distribution and handling of chemistry samples is described

  13. Alert management for home healthcare based on home automation analysis.

    Science.gov (United States)

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be controlled by offering people autonomy at home by means of information technology. In this paper, we present an original, sensorless alert management solution which performs multimedia and home automation service discrimination and extracts highly regular home activities as sensors for alert management. The results on simulation data, based on a real context, allow us to evaluate our approach before application to real data.

  14. Improving Automated Lexical and Discourse Analysis of Online Chat Dialog

    Science.gov (United States)

    2007-09-01

    chatbots”. Chatbots are automated user software independent of the chat room system that assist human participants, provide entertainment to the chat...both the chat room system and chatbots as well as information provided by the system and chatbots were often preceded by either the token “.” or...personal chatbots. Finally, we also classified chatbot responses as system dialog acts. The Yes/No Question chat dialog act is simply a question that

  15. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities....

  16. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities....

  17. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  18. Automated multivariate analysis of comprehensive two-dimensional gas chromatograms of petroleum

    DEFF Research Database (Denmark)

    Skov, Søren Furbo

    of separated compounds makes the analysis of GCGC chromatograms tricky, as there is too much data for manual analysis, and automated analysis is not always trouble-free: manual checking of the results is often necessary. In this work, I will investigate the possibility of another approach to analysis of GCGC...... impossible to find it. For a special class of models, multi-way models, unique solutions often exist, meaning that the underlying phenomena can be found. I have tested this class of models on GCGC data from petroleum and conclude that more work is needed before they can be automated. I demonstrate how

  19. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, system automation was developed in order to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This report explains the NAA process at Nuclear Malaysia and describes the automation development in detail, which includes the sample registration software; the automatic sample changer system, consisting of hardware and software; and the sample analysis software. (author)

  20. The development of chemical speciation analysis

    International Nuclear Information System (INIS)

    Martin, R.; Santana, J.L.; Lima, L.; De La Rosa, D.; Melchor, K.

    2003-01-01

    The speciation of many metals in the environment, their bioaccumulation and quantification, and their effects on the human body have been studied by a wide range of research groups over the last two decades. The development of speciation analysis has advanced rapidly, closely following the development of novel analytical techniques. Separation and quantification at low levels is a problem that has been addressed by coupling high-resolution chromatographic techniques such as HPLC and HRGC with a specific detection method (ICP-MS or CV-AAS). This methodological approach is what makes chemical speciation successful nowadays.

  1. Laser chemical analysis: the recent developments

    International Nuclear Information System (INIS)

    Mauchien, P.

    1997-01-01

    This paper gives a general overview and describes the principles of the main laser-based techniques for physical and chemical analysis, and their recent developments. Analytical techniques using laser radiation were first developed at the end of the 1970s. The recent evolutions concern the three principal techniques of laser spectroscopy currently in use: Raman, fluorescence (atomic and molecular) and ablation (laser ablation-ICP coupling, optical emission spectroscopy on laser-induced plasma). The description of these different techniques is illustrated with some examples of applications. (J.S.)

  2. Activation and chemical analysis of drinking waters

    International Nuclear Information System (INIS)

    Sharma, H.K.; Mittal, V.K.; Sahota, H.S.

    1989-01-01

    Ground water samples from Patiala city have been analysed for 22 trace elements using neutron activation analysis and for seven chemical parameters using standard techniques. It was found that alkali and alkaline earth metals have high concentrations in all samples whereas the concentrations of toxic metals are low in the majority of samples. However, chromium and cadmium concentrations are higher in ground water taken from the industrial belt of the city. This indicates that the overall level of pollution is low, but that some measures are still needed to inhibit various industries from polluting the ground water. (author)

  3. Development of chemical equilibrium analysis code 'CHEEQ'

    International Nuclear Information System (INIS)

    Nagai, Shuichiro

    2006-08-01

    'CHEEQ', a code which calculates the partial pressures and the masses of a system consisting of ideal gases and pure condensed-phase compounds, was developed. The characteristics of the 'CHEEQ' code are as follows. All the chemical equilibrium equations are described by the formation reactions from the mono-atomic gases in order to simplify the code structure and input preparation. The chemical equilibrium conditions Σν_i μ_i = 0 for gaseous compounds and precipitated condensed-phase compounds, and Σν_i μ_i > 0 for non-precipitated condensed-phase compounds, are applied, where ν_i and μ_i are the stoichiometric coefficient and the chemical potential of component i. A virtual solid model was introduced to perform calculations under constant-partial-pressure conditions. 'CHEEQ' consists of the following three parts: (1) the analysis code, zc132.f; (2) the thermodynamic database, zmdb01; and (3) the input data file, zindb. The 'CHEEQ' code can handle a system consisting of up to 20 elements, 100 condensed-phase compounds, and 200 gaseous compounds. The thermodynamic database zmdb01 contains about 1000 elements and compounds, 200 of which are actinide elements and their compounds. This report describes the basic equations, the outline of the solution procedure, and instructions for preparing the input data and evaluating the calculation results. (author)

  4. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    Science.gov (United States)

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  5. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design...... methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module...... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...
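
    The matrix formulation of FMEA mentioned above can be illustrated with a small sketch. Assuming a boolean propagation matrix M (an assumption about the representation; the report's actual formulation may differ), failure effects are propagated through the system to a fixed point:

      import numpy as np

      def propagate_failures(M, initial):
          # M[i, j] = True if a failure in component j directly induces a
          # failure effect in component i; `initial` flags primary failures.
          state = np.asarray(initial, dtype=bool)
          while True:
              new_state = state | ((M.astype(int) @ state.astype(int)) > 0)
              if np.array_equal(new_state, state):
                  return state  # fixed point: no further propagation
              state = new_state

      # Hypothetical 4-component chain 0 -> 1 -> 2, component 3 isolated
      M = np.zeros((4, 4), dtype=bool)
      M[1, 0] = M[2, 1] = True
      print(propagate_failures(M, [True, False, False, False]))
      # -> [ True  True  True False]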

  6. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained....... The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across...... multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both...

  7. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, Mogens; Borch, Ole; Bagnoli, F.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design...... methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module...... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  8. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q), or production rate, is one of the important indicator criteria for industrial engineers seeking to improve the system and the finished-good output of a production or assembly line. Mathematical and statistical analysis methods need to be applied to the productivity rate to give visual overviews of the failure factors and of further improvement opportunities within the production line, especially for automated flow lines, which are complicated. A mathematical model of the productivity rate for a linear-arrangement serial-structure automated flow line with different failure rate and bottleneck machining time parameters is the basic model for this productivity analysis. This paper presents the engineering mathematical analysis method as applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on the particular mathematical model. The variety of failure rates that cause loss of productivity, and the bottleneck machining time, are shown specifically in mathematical figures, and a sustainable solution for productivity improvement of this final assembly automated flow line is presented.
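
    The record states the model only verbally. As a hedged sketch of one plausible textbook form (not necessarily the paper's model), the line's cycle time (bottleneck machining time plus auxiliary time) can be derated by an availability factor built from per-station failure rates and a mean repair time; all symbols and numbers below are assumptions for illustration.

      def productivity_rate(t_bottleneck, t_aux, failure_rates, mean_repair_time):
          # Productivity (parts per unit time) of a serial automated flow line:
          # cycle time is bottleneck machining time plus auxiliary (transport/
          # clamping) time, derated by availability. failure_rates are per-station
          # failure rates expressed as failures per cycle.
          cycle_time = t_bottleneck + t_aux
          downtime_per_cycle = sum(failure_rates) * mean_repair_time
          availability = cycle_time / (cycle_time + downtime_per_cycle)
          return availability / cycle_time

      # Hypothetical line: bottleneck 0.8 min, auxiliary 0.2 min,
      # five stations with failure rate 0.01 per cycle, mean repair time 5 min
      q = productivity_rate(0.8, 0.2, [0.01] * 5, 5.0)
      print(f"{q:.3f} parts per minute")  # -> 0.800 parts per minute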

  9. Probabilistic Neural Networks for Chemical Sensor Array Pattern Recognition: Comparison Studies, Improvements and Automated Outlier Rejection

    National Research Council Canada - National Science Library

    Shaffer, Ronald E

    1998-01-01

    For application to chemical sensor arrays, the ideal pattern recognition method is accurate, fast, simple to train, robust to outliers, has low memory requirements, and has the ability to produce a measure
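
    A probabilistic neural network is essentially a Parzen-window classifier; the following minimal sketch (generic, with a hypothetical two-sensor array and an assumed smoothing parameter sigma) shows why training is simple: it amounts to storing the training patterns.

      import numpy as np

      def pnn_classify(x, train_X, train_y, sigma=0.1):
          # Probabilistic neural network (Parzen-window) classification: each
          # training pattern contributes a Gaussian kernel; the class with the
          # largest summed kernel response wins.
          x = np.asarray(x)
          classes = np.unique(train_y)
          scores = []
          for c in classes:
              Xc = train_X[train_y == c]
              d2 = np.sum((Xc - x) ** 2, axis=1)
              scores.append(np.sum(np.exp(-d2 / (2.0 * sigma ** 2))))
          return classes[int(np.argmax(scores))]

      # Hypothetical 2-sensor array responses for two analytes
      train_X = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]])
      train_y = np.array(["analyte_A", "analyte_A", "analyte_B", "analyte_B"])
      print(pnn_classify([0.15, 0.85], train_X, train_y))  # -> analyte_A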

  10. A chemical sensor and biosensor based totally automated water quality monitor for extended space flight: Step 1

    Science.gov (United States)

    Smith, Robert S.

    1993-01-01

    The result of a literature search to consider what technologies should be represented in a totally automated water quality monitor for extended space flight is presented. It is the result of the first summer of a three-year JOVE project. The next step will be to build a test platform at the authors' school, St. John Fisher College; this will involve undergraduates in NASA-related research. The test flow-injection analysis system will be used to test the detection limits of sensors and the performance of sensors in groups. Sensor companies and research groups will be encouraged to produce sensors which are not currently available and are needed for this project.

  11. Web-based automation of green building rating index and life cycle cost analysis

    Science.gov (United States)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investors' interest in green-certified buildings due to their higher initial costs. It is therefore essential to attract investors' attention towards further development of green buildings through automated tools for construction projects. However, there is a historical dearth of automation in green building rating tools, an essential gap that motivates the development of an automated analog computerized programming tool. This paper presents proposed research that aims to develop an integrated web-based automated analog computerized program applying a green building rating assessment tool, green technology, and life cycle cost (LCC) analysis. It also identifies the variables of MyCrest and LCC to be integrated and developed in a framework and then transformed into an automated computerized program. A mixed methodology of qualitative and quantitative surveys portrays the plan to carry MyCrest-LCC integration to an automated level. In this study, the preliminary literature review enriches a better understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.

  12. Comparative analysis of automation of production process with industrial robots in Asia/Australia and Europe

    Directory of Open Access Journals (Sweden)

    I. Karabegović

    2017-01-01

    Full Text Available The term “INDUSTRY 4.0”, or “fourth industrial revolution”, was first introduced at the Hannover fair in 2011. It comes from the high-tech strategy of the German Federal Government that promotes automation-computerization up to complete smart automation, meaning the introduction of methods of self-automation, self-configuration, self-diagnosis and problem fixing, knowledge, and intelligent decision-making. No automation, including smart automation, can be imagined without industrial robots. Alongside the fourth industrial revolution, a “robotic revolution” is taking place in Japan. The robotic revolution refers to the development and research of robotic technology with the aim of using robots in all production processes and in real life, to be of service to people in daily life. With these facts in mind, an analysis was conducted of the prevalence of industrial robots in production processes on the two continents of Europe and Asia/Australia, as well as research into whether industry is ready for the introduction of intelligent automation with the goal of establishing future smart factories. The paper presents the automation of production processes in Europe and Asia/Australia, with predictions for the future.

  13. Systems analysis of past, present, and future chemical terrorism scenarios.

    Energy Technology Data Exchange (ETDEWEB)

    Hoette, Trisha Marie

    2012-03-01

    Throughout history, as new chemical threats arose, strategies for the defense against chemical attacks have also evolved. As a part of an Early Career Laboratory Directed Research and Development project, a systems analysis of past, present, and future chemical terrorism scenarios was performed to understand how the chemical threats and attack strategies change over time. For the analysis, the difficulty in executing chemical attack was evaluated within a framework of three major scenario elements. First, historical examples of chemical terrorism were examined to determine how the use of chemical threats, versus other weapons, contributed to the successful execution of the attack. Using the same framework, the future of chemical terrorism was assessed with respect to the impact of globalization and new technologies. Finally, the efficacy of the current defenses against contemporary chemical terrorism was considered briefly. The results of this analysis justify the need for continued diligence in chemical defense.

  14. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  15. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    Science.gov (United States)

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
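
    A minimal sketch of the kind of multiresolution decomposition described above, using the PyWavelets package on a placeholder phantom image (the software's actual feature-detection scoring is not given in the record, so only generic detail-subband energies are computed here):

      import numpy as np
      import pywt  # PyWavelets

      def subband_energies(image, wavelet="db2", levels=3):
          # 2-D discrete wavelet decomposition of a phantom image, returning
          # the detail-subband energy at each level (coarsest to finest).
          # Features such as fibers, specks and masses concentrate energy at
          # different scales.
          coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=levels)
          energies = []
          for level, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
              e = sum(float(np.sum(c ** 2)) for c in (cH, cV, cD))
              energies.append((level, e))
          return energies

      # Placeholder 256x256 "phantom" image
      rng = np.random.default_rng(1)
      img = rng.normal(size=(256, 256))
      for level, e in subband_energies(img):
          print(f"level {level}: detail energy {e:.1f}")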

  16. Completely automated modal analysis procedure based on the combination of different OMA methods

    Science.gov (United States)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and all its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved, becoming more robust and yielding only the actual natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building housed in the laboratories of Politecnico di Milano.

  17. 3D Assembly Group Analysis for Cognitive Automation

    Directory of Open Access Journals (Sweden)

    Christian Brecher

    2012-01-01

    Full Text Available A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprised of two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This information is used to determine the current state of the assembly group and is compared to a CAD model for validation purposes. In order to efficiently resolve erroneous situations, the results are interactively accessible to a human expert. The implications for an industrial application are demonstrated by transferring the developed concepts to an assembly scenario for switch-cabinet systems.

  18. An approach for automated analysis of particle holograms

    Science.gov (United States)

    Stanton, A. C.; Caulfield, H. J.; Stewart, G. W.

    1984-01-01

    A simple method for analyzing droplet holograms is proposed that is readily adaptable to automation using modern image digitizers and analyzers for determination of the number, location, and size distributions of spherical or nearly spherical droplets. The method determines these parameters by finding the spatial location of best focus of the droplet images. With this location known, the particle size may be determined by direct measurement of image area in the focal plane. Particle velocity and trajectory may be determined by comparison of image locations at different instants in time. The method is tested by analyzing digitized images from a reconstructed in-line hologram, and the results show that the method is more accurate than a time-consuming plane-by-plane search for sharpest focus.
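
    A sketch of the core idea, scanning reconstructed planes for the sharpest droplet image and then measuring image area in that plane, might look as follows; the focus metric (variance of the Laplacian) and the threshold are assumptions for illustration, not necessarily the metric used by the authors.

      import numpy as np
      from scipy import ndimage

      def best_focus_plane(stack):
          # Given a stack of reconstructed hologram planes (z, y, x), pick the
          # plane of best focus using a sharpness metric: sharp droplet edges
          # maximize the variance of the Laplacian.
          sharpness = [np.var(ndimage.laplace(plane.astype(float))) for plane in stack]
          return int(np.argmax(sharpness))

      def image_area(plane, threshold):
          # Estimate droplet image area (pixel count) in the focal plane,
          # assuming a dark droplet image on a bright background.
          return int(np.sum(plane < threshold))

      # Placeholder stack: 5 planes of 64x64 reconstructed intensity
      rng = np.random.default_rng(2)
      stack = rng.normal(1.0, 0.05, size=(5, 64, 64))
      stack[2, 28:36, 28:36] = 0.2          # sharp dark droplet in plane 2
      z = best_focus_plane(stack)
      print(z, image_area(stack[z], threshold=0.5))  # -> 2 64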

  19. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  20. Fatigue analysis through automated cycle counting using ThermAND

    International Nuclear Information System (INIS)

    Burton, G.R.; Ding, Y.; Scovil, A.; Yetisir, M.

    2008-01-01

    The potential for fatigue damage due to thermal transients is one of the degradation mechanisms that needs to be managed for plant components. The original design of CANDU stations accounts for projected fatigue usage for specific components over a specified design lifetime. Fatigue design calculations were based on estimates of the number and severity of expected transients for 30 years operation at 80% power. Many CANDU plants are now approaching the end of their design lives and are being considered for extended operation. Industry practice is to have a comprehensive fatigue management program in place for extended operation beyond the original design life. A CANDU-specific framework for fatigue management has recently been developed to identify the options for implementation, and the critical components and locations requiring long-term fatigue monitoring. An essential element of fatigue monitoring is to identify, count and monitor the number of plant transients to ensure that the number assumed in the original design is not exceeded. The number and severity of actual CANDU station thermal transients at key locations in critical systems have been assessed using ThermAND, AECL's health monitor for systems and components, based on archived station operational data. The automated cycle counting has demonstrated that actual transients are generally less numerous than the quantity assumed in the design basis, and are almost always significantly less severe. This paper will discuss the methodology to adapt ThermAND for automated cycle counting of specific system transients, illustrate and test this capability for cycle-based fatigue monitoring using CANDU station data, report the results, and provide data for stress-based fatigue calculations. (author)
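
    ThermAND's cycle-counting algorithm is not detailed in this record; as a hedged illustration of automated transient counting in general, the sketch below reduces a temperature history to its turning points and counts excursions above a severity threshold.

      import numpy as np

      def count_transients(temps, min_range=20.0):
          # Reduce a temperature history to turning points (local peaks and
          # valleys), then count half-cycle excursions whose range exceeds a
          # severity threshold (degrees).
          t = np.asarray(temps, dtype=float)
          d = np.diff(t)
          turning = [t[0]]
          for i in range(1, len(t) - 1):
              if d[i - 1] * d[i] < 0:        # slope changes sign
                  turning.append(t[i])
          turning.append(t[-1])
          ranges = np.abs(np.diff(turning))
          return int(np.sum(ranges >= min_range))

      # Hypothetical coolant temperature trace (deg C)
      trace = [250, 252, 300, 251, 253, 290, 250, 250.5]
      print(count_transients(trace))  # -> 4 half-cycle excursions (two swings)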

  1. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan NSP-9900 specular microscope. All images were analyzed by trained graders using Konan CellChek software, employing the fully-automated and semi-automated methods as well as the Center Method. Images with a low cell count (input cell number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared, and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in the ECD of normal controls <40 years old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone when performing specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.

  2. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was proposed in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem into a regular process of structural simplification. This review aims to summarize the developments in computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms have contributed to them. The LHASA system pioneered the design of semi-empirical reaction modes in computers, with the rule-based and network-searching work that followed not only expanding the databases but also building new approaches to encoding reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica changed traditional design into network searching. Afterward, with the help of machine learning, two-step models combining reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require any prior knowledge, have been applied in this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracies. Future new algorithms, with the aid of powerful computational hardware, will make this a promising topic with good prospects.

  3. Digital image analysis applied to industrial nondestructive evaluation and automated parts assembly

    International Nuclear Information System (INIS)

    Janney, D.H.; Kruger, R.P.

    1979-01-01

    Many ideas of image enhancement and analysis are relevant to the needs of the nondestructive testing engineer. These ideas not only aid the engineer in the performance of his current responsibilities, they also open to him new areas of industrial development and automation which are logical extensions of classical testing problems. The paper begins with a tutorial on the fundamentals of computerized image enhancement as applied to nondestructive testing, then progresses through pattern recognition and automated inspection to automated, or robotic, assembly procedures. It is believed that such procedures are cost-effective in many instances, and are but the logical extension of those techniques now commonly used, but often limited to analysis of data from quality-assurance images. Many references are given in order to help the reader who wishes to pursue a given idea further

  4. FIA-automated system used to electrochemically measure nitrite and its interfering chemicals through a 1-2 DAB/Au electrode: gain of sensitivity at upper potentials

    Science.gov (United States)

    Almeida, F. L.; dos Santos Filho, S. G.; Fontes, M. B. A.

    2013-03-01

    The measurement of nitrite and its interfering chemicals (paracetamol, ascorbic acid and uric acid) was performed employing a flow-injection analysis (FIA) system, which was automated using solenoid valves and an air pump. It is very important to quantify nitrite in river water, food and biological fluids because of its antibacterial capacity at moderate concentrations, and because of its toxicity to human health even at low concentrations (> 20 μmol L-1 in blood fluids). The electrodes of the electrochemical planar sensor were defined by silk-screen technology. The measuring electrode was made from gold paste covered with 1-2 cis diaminobenzene (DAB), which allowed good selectivity, linearity, repeatability and stability, and an optimized gain of sensitivity at 0.5 V vs. Ag/AgCl/Nafion®117 (6.93 μA mol-1 L mm-2) compared to 0.3 V vs. Ag/AgCl/Nafion®117. The reference electrode was obtained from silver/palladium paste modified with chloride and covered with Nafion®117. The auxiliary electrode was made from platinum paste. It was noteworthy that the nitrite response adds to the response of the studied interfering chemicals and is predominant for concentrations lower than 175 μmol L-1.
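
    The sensitivity quoted above is, in effect, the slope of an amperometric calibration line normalized by electrode area; a sketch with entirely hypothetical calibration points and an assumed electrode area follows.

      import numpy as np

      # Hypothetical calibration: nitrite concentration (mol/L) vs. FIA peak
      # current (uA) at a fixed working potential
      conc = np.array([25e-6, 50e-6, 100e-6, 150e-6])
      current = np.array([0.21, 0.44, 0.86, 1.30])

      slope, intercept = np.polyfit(conc, current, 1)   # uA per (mol/L)
      electrode_area_mm2 = 2.0                          # assumed geometric area
      sensitivity = slope / electrode_area_mm2          # uA mol-1 L mm-2
      print(f"sensitivity ~ {sensitivity:.2e} uA mol-1 L mm-2")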

  5. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Walbran, A.C.; Unsworth, C.P.; Gunn, A.J.; Benett, L.

    2010-01-01

    Full text: Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (the first 6-8 h after a hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means to identify which infants would benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data that are very time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
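    A minimal sketch of this kind of detector, assuming nothing beyond NumPy: a discretized CWT is computed by convolving the trace with Ricker wavelets at several scales, the squared coefficients are pooled into a power trace, and high-power samples are flagged. The sampling rate, scales and threshold are illustrative, not the authors' settings.

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet of scale a, sampled at `points` points."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_power(data, widths):
    """Sum of squared CWT coefficients across scales, per sample."""
    coeffs = [np.convolve(data, ricker(10 * w, w), mode="same") for w in widths]
    return (np.array(coeffs) ** 2).sum(axis=0)

fs = 256                                   # Hz, assumed sampling rate
rng = np.random.default_rng(0)
eeg = rng.normal(0, 5, size=10 * fs)       # stand-in for a recorded trace
eeg[5 * fs] += 80.0                        # inject one artificial "spike"

power = cwt_power(eeg, widths=range(1, 16))
threshold = power.mean() + 5 * power.std()
spikes = np.flatnonzero(power > threshold)
print("candidate spike times (s):", sorted(set(spikes // fs)))
```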

  6. Automated Discovery of New Chemical Reactions and Accurate Calculation of Their Rates

    Science.gov (United States)

    2015-06-02

    chemistry calculations are run. The product matrices P are obtained and converted to block structure by simple linear algebra operations... in the system, i.e. Σ_i,j a_ij = 0. Usually in elementary reactions |a_ij| ≤ 2, since a change by two implies a significant chemical process, for instance, formation or rupture of a double bond in a single elementary step. After applying the reaction matrix A, the product matrix P can then be

  7. Comparative Analysis on Chemical Composition of Bentonite Clays ...

    African Journals Online (AJOL)

    2017-09-12

    Sep 12, 2017 ... Comparative Analysis on Chemical Composition of Bentonite Clays Obtained from Ashaka and ... versatile material for geotechnical engineering as well as their demand for ... A PhD thesis submitted to the Chemical ...

  8. Automated striatal uptake analysis of 18F-FDOPA PET images applied to Parkinson's disease patients

    International Nuclear Information System (INIS)

    Chang Icheng; Lue Kunhan; Hsieh Hungjen; Liu Shuhsin; Kao, Chinhao K.

    2011-01-01

    6-[18F]Fluoro-L-DOPA (FDOPA) is a radiopharmaceutical valuable for assessing presynaptic dopaminergic function when used with positron emission tomography (PET). More specifically, the striatal-to-occipital ratio (SOR) of FDOPA uptake has been used extensively as a quantitative parameter in these PET studies. Our aim was to develop an easy, automated method capable of performing objective analysis of SOR in FDOPA PET images of Parkinson's disease (PD) patients. Brain images from FDOPA PET studies of 21 patients with PD and 6 healthy subjects were included in our automated striatal analyses. The images of each individual were spatially normalized to an FDOPA template. Subsequently, the image slice with the highest level of basal ganglia activity was chosen from the series of normalized images, together with the immediately preceding and following slices. Finally, the summation of these three images was used to quantify and calculate the SOR values. The results obtained by automated analysis were compared with manual analysis by a trained and experienced image processing technologist. The SOR values obtained from the automated analysis showed good agreement and high correlation with manual analysis: the differences in caudate, putamen, and striatum were -0.023, -0.029, and -0.025, with correlation coefficients of 0.961, 0.957, and 0.972, respectively. We have successfully developed a method for automated striatal uptake analysis of FDOPA PET images. There was no significant difference between the SOR values obtained from this method and manual analysis, and it is an unbiased, time-saving, cost-effective program that is easy to implement on a personal computer. (author)
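    The slice-selection and ratio steps translate almost directly into array operations. Below is a minimal NumPy sketch under assumed conventions: the masks and volume are placeholders, and the (striatum - occipital)/occipital form of the ratio is one common convention that may differ from the paper's exact definition.

```python
import numpy as np

def sor_values(volume, striatum_mask, occipital_mask):
    """volume: (n_slices, ny, nx); masks: (ny, nx) boolean, in template space."""
    # slice with the highest mean striatal activity
    per_slice = np.array([s[striatum_mask].mean() for s in volume])
    k = int(np.clip(per_slice.argmax(), 1, volume.shape[0] - 2))
    summed = volume[k - 1] + volume[k] + volume[k + 1]   # 3-slice summation
    striatum = summed[striatum_mask].mean()
    occipital = summed[occipital_mask].mean()
    return (striatum - occipital) / occipital   # one common SOR convention

# toy data: 10 slices of 64x64, hotter "striatum" on slice 5
vol = np.ones((10, 64, 64))
stri = np.zeros((64, 64), bool); stri[28:36, 24:40] = True
occ = np.zeros((64, 64), bool); occ[50:60, 20:44] = True
vol[5][stri] = 4.0
print(f"SOR = {sor_values(vol, stri, occ):.2f}")
```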

  9. Novel chemical analysis for thin films

    International Nuclear Information System (INIS)

    Usui, Toshio; Kamei, Masayuki; Aoki, Yuji; Morishita, Tadataka; Tanaka, Shoji

    1991-01-01

    Scanning electron microscopy and total-reflection-angle X-ray spectroscopy (SEM-TRAXS) was applied for fluorescence X-ray analysis of 50 Å- and 125 Å-thick Au thin films on Si(100). The intensity of the Au M line (2.15 keV) emitted from the Au thin films varied as a function of the take-off angle (θt) with respect to the film surface; the intensity of the Au M line from the 125 Å-thick Au thin film was 1.5 times as large as that of the Si Kα line (1.74 keV) emitted from the Si substrate when θt = 0°-3°, in the vicinity of the critical angle for total external reflection of the Au M line at Si (0.81°). In addition, the intensity of the Au M line emitted from the 50 Å-thick Au thin film was also sufficiently strong for chemical analysis. (author)

  10. Automated analysis of small animal PET studies through deformable registration to an atlas

    International Nuclear Information System (INIS)

    Gutierrez, Daniel F.; Zaidi, Habib

    2012-01-01

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is
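    Both registration-accuracy metrics named here are straightforward to reproduce on binary masks; a minimal NumPy/SciPy sketch (toy masks, not the Digimouse data):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice coefficient of two boolean masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance between mask pixel coordinates."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

# toy example: two overlapping squares standing in for segmentations
m1 = np.zeros((32, 32), bool); m1[8:20, 8:20] = True
m2 = np.zeros((32, 32), bool); m2[10:22, 10:22] = True
print(f"Dice = {dice(m1, m2):.3f}, Hausdorff = {hausdorff(m1, m2):.1f}")
```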

  11. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    Sorenson, R.J.; Kaye, J.H.

    1974-03-01

    A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were required: a Tracor Northern TN-11 data handling system including a PDP-11 minicomputer, and a measurement vehicle similar to the Commission's Regulatory Region I van. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. Using a vehicle in which the data handling equipment is permanently rack-mounted offers a number of benefits, such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system, which illustrates the advantages and feasibility of the overall system, is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable "cart" and use of a telephone link to a large computer center

  12. Development of a novel and automated fluorescent immunoassay for the analysis of beta-lactam antibiotics

    NARCIS (Netherlands)

    Benito-Pena, E.; Moreno-Bondi, M.C.; Orellana, G.; Maquieira, K.; Amerongen, van A.

    2005-01-01

    An automated immunosensor for the rapid and sensitive analysis of penicillin-type β-lactam antibiotics has been developed and optimized. An immunogen was prepared by coupling the common structure of the penicillanic β-lactam antibiotics, i.e., 6-aminopenicillanic acid, to keyhole limpet hemocyanin.

  13. Automated analysis of small animal PET studies through deformable registration to an atlas

    NARCIS (Netherlands)

    Gutierrez, Daniel F.; Zaidi, Habib

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of

  14. UAV : Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  15. Substructure analysis techniques and automation. [to eliminate logistical data handling and generation chores

    Science.gov (United States)

    Hennrich, C. W.; Konrath, E. J., Jr.

    1973-01-01

    A basic automated substructure analysis capability for NASTRAN is presented which eliminates most of the logistical data handling and generation chores that are currently associated with the method. Rigid formats are proposed which will accomplish this using three new modules, all of which can be added to level 16 with a relatively small effort.

  16. Automated voxel-based analysis of brain perfusion SPECT for vasospasm after subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Iwabuchi, S.; Yokouchi, T.; Hayashi, M.; Kimura, H.; Tomiyama, A.; Hirata, Y.; Saito, N.; Harashina, J.; Nakayama, H.; Sato, K.; Aoki, K.; Samejima, H.; Ueda, M.; Terada, H.; Hamazaki, K.

    2008-01-01

    We evaluated regional cerebral blood flow (rCBF) during vasospasm after subarachnoid haemorrhage (SAH) using automated voxel-based analysis of brain perfusion single-photon emission computed tomography (SPECT). Brain perfusion SPECT was performed 7 to 10 days after the onset of SAH. The automated voxel-based analysis used a Z-score map calculated by comparing the patient's data with a control database. In cases where computed tomography (CT) scans detected an ischemic region due to vasospasm, automated voxel-based analysis of brain perfusion SPECT revealed dramatically reduced rCBF (Z-score ≤ -4). No patients with mildly or moderately diminished rCBF (Z-score > -3) progressed to cerebral infarction. Some patients with a Z-score < -4 did not progress to cerebral infarction after active treatment with angioplasty. Three-dimensional images provided detailed anatomical information and helped us to distinguish surgical sequelae from vasospasm. In conclusion, automated voxel-based analysis of brain perfusion SPECT using a Z-score map is helpful in evaluating decreased rCBF due to vasospasm. (author)
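    The Z-score map itself is a one-line voxel-wise computation. A minimal sketch against a synthetic control database (the -4 cut-off is from the abstract; everything else is invented for illustration):

```python
import numpy as np

def z_score_map(patient, controls):
    """Voxel-wise Z-scores: patient vs. control-database mean and SD."""
    mean = controls.mean(axis=0)
    std = controls.std(axis=0, ddof=1)
    return (patient - mean) / np.where(std > 0, std, np.inf)

rng = np.random.default_rng(0)
controls = rng.normal(100, 10, size=(20, 8, 8, 8))   # 20 normal rCBF volumes
patient = rng.normal(100, 10, size=(8, 8, 8))
patient[2:4, 2:4, 2:4] = 40.0                        # simulated vasospasm deficit

z = z_score_map(patient, controls)
print("voxels with Z <= -4:", int((z <= -4).sum()))
```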

  17. Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework

    Science.gov (United States)

    2011-03-01

    ABACAS offers a level of flexibility in software development that would be very useful later in the software engineering life cycle. ... AUTOMATED ANALYSIS OF ARM BINARIES USING THE LOW-LEVEL VIRTUAL MACHINE COMPILER FRAMEWORK. Thesis, Jeffrey B. Scott.

  18. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33 ... of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping ...

  19. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high-resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to the automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings that correctly segmented 67% of the total possible diatom valves and fragments from broad fields of view: of the 255 diatom particles present in the 183 light microscope images examined, 216 valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software. Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded 216 particles in 68 seconds, giving the software an approximately five-fold advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has the potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of the analyses. Benefits and limitations of the approach are presented to allow for the development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.

  20. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  1. Scoring of radiation-induced micronuclei in cytokinesis-blocked human lymphocytes by automated image analysis

    International Nuclear Information System (INIS)

    Verhaegen, F.; Seuntjens, J.; Thierens, H.

    1994-01-01

    The micronucleus assay in human lymphocytes is, at present, frequently used to assess chromosomal damage caused by ionizing radiation or mutagens. Manual scoring of micronuclei (MN) by trained personnel is very time-consuming, tiring work, and the results depend on subjective interpretation of the scoring criteria. More objective scoring can be accomplished only if the test can be automated. Furthermore, an automated system allows scoring of large numbers of cells, thereby increasing the statistical significance of the results. This is of special importance for screening programs for low doses of chromosome-damaging agents. In this paper, the first results of our effort to automate the micronucleus assay with an image-analysis system are presented. The method we used is described in detail, and the results are compared to those of other groups. Our system is able to detect 88% of the binucleated lymphocytes on the slides. The procedure consists of a fully automated localization of binucleated cells and counting of the MN within these cells, followed by a simple and fast manual operation in which the false positives are removed. Preliminary measurements for blood samples irradiated with a dose of 1 Gy X-rays indicate that the automated system can find 89% ± 12% of the micronuclei within the binucleated cells compared to manual screening. 18 refs., 8 figs., 1 tab

  2. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying that designers need to ensure drivers are provided with the tools necessary to remain actively in the loop despite being given increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.

  3. Evaluation of damping estimates by automated Operational Modal Analysis for offshore wind turbine tower vibrations

    DEFF Research Database (Denmark)

    Bajrić, Anela; Høgsberg, Jan Becker; Rüdinger, Finn

    2018-01-01

    Reliable predictions of the lifetime of offshore wind turbine structures are influenced by the limited knowledge concerning the inherent level of damping during downtime. Error measures and an automated procedure for covariance-driven Operational Modal Analysis (OMA) techniques have been proposed. In order to obtain algorithm-independent answers, three identification techniques are compared: the Eigensystem Realization Algorithm (ERA), covariance-driven Stochastic Subspace Identification (COV-SSI) and the Enhanced Frequency Domain Decomposition (EFDD). Discrepancies between the automated identification techniques are discussed and illustrated with respect to signal noise, measurement time, vibration amplitudes and stationarity of the ambient response. The best bias-variance error trade-off of the damping estimates is obtained by COV-SSI. The proposed automated procedure is validated by real vibration ...

  4. Evaluation of automated analysis of 15N and total N in plant material and soil

    DEFF Research Database (Denmark)

    Jensen, E.S.

    1991-01-01

    Simultaneous determination of N-15 and total N using an automated nitrogen analyser interfaced to a continuous-flow isotope ratio mass spectrometer (ANA-MS method) was evaluated. The coefficient of variation (CV) of repeated analyses of homogeneous standards and samples at natural abundance was lower than 0.1%. The CV of repeated analyses of N-15-labelled plant material and soil samples varied between 0.3% and 1.1%. The reproducibility of repeated total N analyses using the automated method was comparable to results obtained with a semi-micro Kjeldahl procedure. However, the automated method ... analysis showed that the recovery of inorganic N in the NH3 trap was lower when the N was diffused from water than from 2 M KCl. The results also indicated that different proportions of the NO3- and the NH4+ in aqueous solution were recovered in the trap after combined diffusion. The method is most suited ...

  5. ROBOCAL: An automated NDA [nondestructive analysis] calorimetry and gamma isotopic system

    International Nuclear Information System (INIS)

    Hurd, J.R.; Powell, W.D.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices

  6. Artificial neural networks for automation of Rutherford backscattering spectroscopy experiments and data analysis

    International Nuclear Information System (INIS)

    Barradas, N.P.; Vieira, A.; Patricio, R.

    2002-01-01

    We present an algorithm based on artificial neural networks able to determine optimized experimental conditions for Rutherford backscattering measurements of Ge-implanted Si. The algorithm can be implemented for any other element implanted into a lighter substrate, and it is foreseeable that the method developed in this work can be applied to many other systems. The algorithm is a push-button black box and does not require any human intervention. It is thus suited for automated control of an experimental setup, given an interface to the relevant hardware. Once the experimental conditions are optimized, the algorithm analyzes the final data obtained and determines the desired parameters; the method is therefore also suited for automated analysis of the data. The algorithm can easily be extended to other ion beam analysis techniques. Finally, it is suggested how the artificial neural networks required for automated control and analysis of experiments could themselves be automatically generated, which would allow automated generation of the required computer code. RBS could thus be done without experimentalists, data analysts, or programmers, with only technicians to keep the machines running

  7. Semi-automated digital image analysis of patellofemoral joint space width from lateral knee radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Grochowski, S.J. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Amrami, K.K. [Mayo Clinic, Department of Radiology, Rochester (United States); Kaufman, K. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Mayo Clinic/Foundation, Biomechanics Laboratory, Department of Orthopedic Surgery, Charlton North L-110L, Rochester (United States)

    2005-10-01

    To design a semi-automated program to measure minimum patellofemoral joint space width (JSW) using standing lateral view radiographs. Lateral patellofemoral knee radiographs were obtained from 35 asymptomatic subjects. The radiographs were analyzed to report both the repeatability of the image analysis program and the reproducibility of JSW measurements within a 2-week period. The results were also compared with manual measurements made by an experienced musculoskeletal radiologist. The image analysis program was shown to have excellent coefficients of repeatability of 0.18 and 0.23 mm for intra- and inter-observer measurements, respectively. The manual method measured a greater minimum JSW than the automated method. Reproducibility between days was comparable to other published results, but was less satisfactory for both manual and semi-automated measurements. The image analysis program had an inter-day coefficient of repeatability of 1.24 mm, which was lower than the 1.66 mm of the manual method. A repeatable semi-automated method for measurement of the patellofemoral JSW from radiographs has been developed. The method is more accurate than manual measurements. However, the between-day reproducibility is higher than the intra-day reproducibility. Further investigation of the protocol for obtaining sequential lateral knee radiographs is needed in order to reduce the between-day variability. (orig.)

  8. Recent developments in the dissolution and automated analysis of plutonium and uranium for safeguards measurements

    International Nuclear Information System (INIS)

    Jackson, D.D.; Marsh, S.F.; Rein, J.E.; Waterbury, G.R.

    1975-01-01

    The status of a program to develop assay methods for plutonium and uranium for safeguards purposes is presented. The current effort is directed more toward analyses of scrap-type material, with an end goal of precise automated methods that also will be applicable to product materials. A guiding philosophy for the analysis of scrap-type materials, characterized by heterogeneity and difficult dissolution, is relatively fast dissolution treatment to effect 90 percent or more solubilization of the uranium and plutonium, analysis of the soluble fraction by precise automated methods, and gamma-counting assay of any residue fraction using simple techniques. A Teflon-container metal-shell apparatus provides acid dissolutions of typical fuel cycle materials at temperatures to 275 °C and pressures to 340 atm. Gas-solid reactions at elevated temperatures separate uranium from refractory materials by the formation of volatile uranium compounds. The condensed compounds then are dissolved in acid for subsequent analysis. An automated spectrophotometer is used for the determination of uranium and plutonium. The measurement range is 1 to 14 mg of either element with a relative standard deviation of 0.5 percent over most of the range. The throughput rate is 5 min per sample. A second-generation automated instrument, with a precise and specific electroanalytical method as its operational basis, is being developed for the determination of plutonium. (auth)

  9. Thermodynamic analysis of chemical heat pumps

    International Nuclear Information System (INIS)

    Obermeier, Jonas; Müller, Karsten; Arlt, Wolfgang

    2015-01-01

    Thermal energy storages and heat pump units represent an important part of highly efficient renewable energy systems. By using thermally driven, reversible chemical reactions, a combination of thermal energy storage and heat pump can be realized. The influences of the thermophysical properties of the involved components on the efficiency of a heat pump cycle are analysed, and the relevance of the thermodynamic driving force is worked out. In general, the energetic and exergetic efficiencies show contrary behaviour: in a real cycle, higher enthalpies of reaction decrease the energetic efficiency but increase the exergetic efficiency. Higher enthalpies of reaction allow for lower offsets from the equilibrium state for a given thermodynamic driving force of the reaction. - Highlights: • A comprehensive efficiency analysis of gas-solid heat pumps is proposed. • Link between thermodynamic driving force and equilibrium drop is shown. • Calculation of the equilibrium drop based on thermochemical properties. • Reaction equilibria of the decomposition reaction of salt hydrates. • Contrary behavior of energetic and exergetic efficiency

  10. Instrument to collect fogwater for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, D.J.; Waldman, J.M.; Haghi, M.; Hoffmann, M.R.; Flagan, R.C.

    1985-06-01

    An instrument is presented which collects large samples of ambient fogwater by impaction of droplets on a screen. The collection efficiency of the instrument is determined as a function of droplet size, and it is shown that fog droplets in the range 3-100-μm diameter are efficiently collected. No significant evaporation or condensation occurs at any stage of the collection process. Field testing indicates that samples collected are representative of the ambient fogwater. The instrument may easily be automated, and is suitable for use in routine air quality monitoring programs.

  11. Instrument to collect fogwater for chemical analysis

    Science.gov (United States)

    Jacob, Daniel J.; Waldman, Jed M.; Haghi, Mehrdad; Hoffmann, Michael R.; Flagan, Richard C.

    1985-06-01

    An instrument is presented which collects large samples of ambient fogwater by impaction of droplets on a screen. The collection efficiency of the instrument is determined as a function of droplet size, and it is shown that fog droplets in the range 3-100-μm diameter are efficiently collected. No significant evaporation or condensation occurs at any stage of the collection process. Field testing indicates that samples collected are representative of the ambient fogwater. The instrument may easily be automated, and is suitable for use in routine air quality monitoring programs.

  12. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  13. Quantitative analysis and automation of CaF2 ore flotation via neutronic activation

    International Nuclear Information System (INIS)

    Reggiani, F.; Garagnani, A.; Lembo, L.; Muntoni, C.

    1992-01-01

    The aim of the project was to prove the operative feasibility of an 'on-line analysis' based on nuclear activation using 14 MeV neutrons. This on-line analysis method was to be used to automate control of the flotation process for ores consisting mostly of fluorspar. The thermoluminescence technique was also foreseen as a subsidiary method at those points of the flotation cycle where less precision is acceptable. (author). 11 figs., 4 tabs

  14. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as the Malignancy Attention Index (MAI), which can be depicted as a colour map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (95% CI 65.4-95.0), a specificity of 87.5% (95% CI 69.0-95.7) and a diagnostic accuracy of 86.7% (95% CI 73.8-93.8) for detection of prostate cancer, the automated analysis results corresponded well with the diagnostic accuracies reported for human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer between a user-independent MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results of the study also suggest the potential of MAI-based analysis for advanced lesion assessments, such as cancer extent and staging prediction.

  15. An automated method to find transition states using chemical dynamics simulations.

    Science.gov (United States)

    Martínez-Núñez, Emilio

    2015-02-05

    A procedure to automatically find the transition states (TSs) of a molecular system (MS) is proposed. It has two components: high-energy chemical dynamics simulations (CDS), and an algorithm that analyzes the geometries along the trajectories to find reactive pathways. Two levels of electronic structure calculations are involved: a low level (LL) is used to integrate the trajectories and also to optimize the TSs, and a higher level (HL) is used to reoptimize the structures. The method has been tested in three MSs: formaldehyde, formic acid (FA), and vinyl cyanide (VC), using MOPAC2012 and Gaussian09 to run the LL and HL calculations, respectively. Both the efficacy and efficiency of the method are very good, with around 15 TS structures optimized every 10 trajectories, which gives a total of 7, 12, and 83 TSs for formaldehyde, FA, and VC, respectively. The use of CDS makes it a powerful tool to unveil possible nonstatistical behavior of the system under study. © 2014 Wiley Periodicals, Inc.
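    The trajectory-analysis component lends itself to a compact sketch: track a distance-based bond (adjacency) matrix along the trajectory and flag frames where the connectivity changes. Everything below (radii, the 1.2 bond criterion, the toy geometry) is illustrative, and the electronic-structure steps with MOPAC2012/Gaussian09 are outside its scope.

```python
import numpy as np

RADII = {"H": 0.31, "C": 0.76, "O": 0.66}   # covalent radii, Angstrom

def adjacency(symbols, coords, scale=1.2):
    """Bond matrix: atoms are bonded if closer than scale * sum of radii."""
    n = len(symbols)
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            cutoff = scale * (RADII[symbols[i]] + RADII[symbols[j]])
            if np.linalg.norm(coords[i] - coords[j]) < cutoff:
                adj[i, j] = adj[j, i] = 1
    return adj

def reactive_frames(symbols, trajectory):
    """Indices of frames whose connectivity differs from the previous frame;
    such frames bracket candidate transition-state regions."""
    mats = [adjacency(symbols, frame) for frame in trajectory]
    return [k for k in range(1, len(mats))
            if not np.array_equal(mats[k], mats[k - 1])]

# toy trajectory: one O-H bond stretching until it breaks
sym = ["O", "H", "H"]
traj = [np.array([[0, 0, 0], [0.96, 0, 0], [-0.24, 0.93, 0]], float),
        np.array([[0, 0, 0], [1.40, 0, 0], [-0.24, 0.93, 0]], float),
        np.array([[0, 0, 0], [2.50, 0, 0], [-0.24, 0.93, 0]], float)]
print("reactive frames:", reactive_frames(sym, traj))
```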

  16. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available Virtual screening is an effective tool for lead identification in drug discovery. However, a limited number of crystal structures are available compared to the number of biological sequences, which makes Structure-Based Drug Discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and automatic virtual screening followed by pharmacology-based pathway prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding-site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. A judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  17. Evaluation of an automated analysis for pain-related evoked potentials

    Directory of Open Access Journals (Sweden)

    Wulf Michael

    2017-09-01

    Full Text Available This paper presents initial steps towards an automated analysis for pain-related evoked potentials (PREP) to achieve higher objectivity and non-biased examination, as well as a reduction in the time expended during daily clinical routines. In the manual examination, each epoch of an ensemble of stimulus-locked EEG signals, elicited by electrical stimulation of predominantly intra-epidermal small nerve fibers and recorded over the central electrode (Cz), is inspected for artifacts before the PREP is calculated by averaging the artifact-free epochs. Afterwards, specific peak latencies (such as the P0, N1 and P1 latencies) are identified as certain extrema in the PREP's waveform. The proposed automated analysis uses Pearson's correlation and low-pass differentiation to perform these tasks. To evaluate the automated analysis' accuracy, its results on 232 datasets were compared to the results of the manually performed examination. The results of the automated artifact rejection were comparable to the manual examination. Detection of peak latencies was more heterogeneous, indicating some sensitivity of the detected events to the criteria used during data examination.
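    A minimal sketch of the two automated steps named above: reject artifact epochs via their Pearson correlation with a leave-one-out ensemble average, then locate peak latencies as extrema of the low-pass-differentiated PREP. The sampling rate, thresholds and filter settings are illustrative, not the authors' parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                          # Hz, assumed
rng = np.random.default_rng(1)
times = np.arange(500) / fs                        # 500-ms epochs
epochs = rng.normal(0, 1, size=(40, 500)) + 3 * np.sin(2 * np.pi * 5 * times)
epochs[3] = rng.normal(0, 50, size=500)            # replace one epoch with noise

# 1) artifact rejection: correlation with the leave-one-out average
total = epochs.sum(axis=0)
r = np.array([np.corrcoef(e, (total - e) / (len(epochs) - 1))[0, 1]
              for e in epochs])
clean = epochs[r > 0.2]
prep = clean.mean(axis=0)                          # averaged artifact-free PREP

# 2) peak latencies: zero crossings of the low-pass-filtered derivative
b, a = butter(4, 30 / (fs / 2))                    # 30 Hz low-pass
slope = filtfilt(b, a, np.gradient(prep))
sign = np.signbit(slope).astype(int)
latencies_ms = np.flatnonzero(np.diff(sign)) * 1000 / fs
print(f"kept {len(clean)}/40 epochs; extrema at (ms): {latencies_ms}")
```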

  18. High Throughput Petrochronology and Sedimentary Provenance Analysis by Automated Phase Mapping and LAICPMS

    Science.gov (United States)

    Vermeesch, Pieter; Rittner, Martin; Petrou, Ethan; Omma, Jenny; Mattinson, Chris; Garzanti, Eduardo

    2017-11-01

    The first step in most geochronological studies is to extract dateable minerals from the host rock, which is time-consuming, removes textural context, and increases the chance of sample cross-contamination. We here present a new method to rapidly perform in situ analyses by coupling a fast scanning electron microscope (SEM) with an Energy Dispersive X-ray Spectrometer (EDS) to a Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LAICPMS) instrument. Given a polished hand specimen, a petrographic thin section, or a grain mount, Automated Phase Mapping (APM) by SEM/EDS produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic analysis. We apply the APM + LAICPMS method to three igneous, metamorphic, and sedimentary case studies. In the first case study, a polished slab of granite from Guernsey was scanned for zircon, producing a 609 ± 8 Ma weighted mean age. The second case study investigates a paragneiss from an ultrahigh-pressure terrane in the north Qaidam terrane (Qinghai, China); one hundred seven small (25 µm) metamorphic zircons were analyzed by LAICPMS to confirm a 419 ± 4 Ma age of peak metamorphism. The third and final case study uses APM + LAICPMS to generate a large provenance data set and trace the provenance of 25 modern sediments from Angola, documenting longshore drift of Orange River sediments over a distance of 1,500 km. These examples demonstrate that APM + LAICPMS is an efficient and cost-effective way to improve the quantity and quality of geochronological data.

  19. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time-consuming process. A number of general-purpose algorithms are available for the various phases of the analysis and can be implemented if they are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure errors and program complexity. This compromise is usually achievable with human interaction at critical points: in spectral analysis, the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis

  20. Availability Analysis of Chemicals for Water Treatment

    African Journals Online (AJOL)

    NIJOTECH

    In most countries, chemicals are generally recognized as being vital in the production of potable water and will ... industries and water utility ventures are being started in Nigeria ... are being dumped into rivers thereby polluting them the more.

  1. An automated robotic platform for rapid profiling oligosaccharide analysis of monoclonal antibodies directly from cell culture.

    Science.gov (United States)

    Doherty, Margaret; Bones, Jonathan; McLoughlin, Niaobh; Telford, Jayne E; Harmon, Bryan; DeFelippis, Michael R; Rudd, Pauline M

    2013-11-01

    Oligosaccharides attached to Asn297 in each of the CH2 domains of monoclonal antibodies play an important role in antibody effector functions by modulating the affinity of interaction with Fc receptors displayed on cells of the innate immune system. Rapid, detailed, and quantitative N-glycan analysis is required at all stages of bioprocess development to ensure the safety and efficacy of the therapeutic. The high sample numbers generated during quality by design (QbD) and process analytical technology (PAT) create a demand for high-performance, high-throughput analytical technologies for comprehensive oligosaccharide analysis. We have developed an automated 96-well plate-based sample preparation platform for high-throughput N-glycan analysis using a liquid handling robotic system. Complete process automation includes monoclonal antibody (mAb) purification directly from bioreactor media, glycan release, fluorescent labeling, purification, and subsequent ultra-performance liquid chromatography (UPLC) analysis. The entire sample preparation and commencement of analysis is achieved within a 5-h timeframe. The automated sample preparation platform can easily be interfaced with other downstream analytical technologies, including mass spectrometry (MS) and capillary electrophoresis (CE), for rapid characterization of oligosaccharides present on therapeutic antibodies. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    Science.gov (United States)

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

    There is a general consensus that supports the need for standardized reporting of metadata, or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments, and the Chemical Analysis Working Group (CAWG) is a member of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments, including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616

  3. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  4. Fluorescence In Situ Hybridization (FISH Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Full Text Available Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, a stack of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between the normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently detected visually by an observer. The Kappa coefficients for agreement between the CAD scheme and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in the four testing samples. The study demonstrated the feasibility of automated FISH signal analysis by applying a CAD scheme to automatically generated 2-D projection images.
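    The projection and spot-identification steps sketch naturally in SciPy. Below, a synthetic z-stack stands in for the scanned slices; the top-hat structure size and threshold are illustrative, not the scheme's tuned values.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
stack = rng.normal(10, 2, size=(9, 128, 128))        # synthetic 9-slice FISH stack
for z, y, x in [(2, 40, 40), (5, 80, 90), (7, 30, 100)]:
    stack[z, y - 1:y + 2, x - 1:x + 2] += 60.0       # three bright FISH spots

projection = stack.max(axis=0)                       # 2-D projection image
tophat = ndimage.white_tophat(projection, size=7)    # suppress slow background
labels, n_spots = ndimage.label(tophat > 30.0)       # threshold and count signals
print("FISH signals detected:", n_spots)
```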

  5. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Science.gov (United States)

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess the validity and interrater reliability of automated analysis across various levels of FCD severity. Methods Retroillumination photographs of 97 FCD-affected corneas were acquired, and total counts of guttae had previously been summated manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested the interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the set of analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). The intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870-0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
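    The core counting step (smooth the image, then count local intensity maxima above a noise tolerance) can be sketched with SciPy; the synthetic image, window size and tolerance rule below are stand-ins for the per-cornea titration the authors describe.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
img = rng.normal(50, 3, size=(200, 200))             # background + noise
centers = rng.integers(10, 190, size=(25, 2))        # 25 synthetic guttae
for y, x in centers:
    img[y - 2:y + 3, x - 2:x + 3] += 25.0

smooth = ndimage.gaussian_filter(img, sigma=1.5)
tolerance = smooth.mean() + 4 * smooth.std()         # stand-in "noise tolerance"
local_max = smooth == ndimage.maximum_filter(smooth, size=9)
# nearby guttae may merge into one maximum, as in real images
print("guttae counted:", int((local_max & (smooth > tolerance)).sum()))
```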

  6. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.

    2003-01-01

    GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with a specification of the data. The server performs normalization

  7. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Nunes Leal Franqueira, V.; Tun, Thein Tan; Wieringa, Roelf J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about

  8. Automated injection of slurry samples in flow-injection analysis

    NARCIS (Netherlands)

    Hulsman, M.H.F.M.; Hulsman, M.; Bos, M.; van der Linden, W.E.

    1996-01-01

    Two types of injectors are described for introducing solid samples as slurries in flow analysis systems. A time-based and a volume-based injector based on multitube solenoid pinch valves were built, both can be characterized as hydrodynamic injectors. Reproducibility of the injections of dispersed

  9. Automated Speech and Audio Analysis for Semantic Access to Multimedia

    NARCIS (Netherlands)

    Jong, F.M.G. de; Ordelman, R.; Huijbregts, M.

    2006-01-01

    The deployment and integration of audio processing tools can enhance the semantic annotation of multimedia content, and as a consequence, improve the effectiveness of conceptual access tools. This paper overviews the various ways in which automatic speech and audio analysis can contribute to

  10. Automated speech and audio analysis for semantic access to multimedia

    NARCIS (Netherlands)

    de Jong, Franciska M.G.; Ordelman, Roeland J.F.; Huijbregts, M.A.H.; Avrithis, Y.; Kompatsiaris, Y.; Staab, S.; O' Connor, N.E.

    2006-01-01

    The deployment and integration of audio processing tools can enhance the semantic annotation of multimedia content, and as a consequence, improve the effectiveness of conceptual access tools. This paper overviews the various ways in which automatic speech and audio analysis can contribute to

  11. Physico-Chemical Analysis and Sensory Evaluation of Bread

    African Journals Online (AJOL)

    Shuaibu et al.

    Physico-Chemical Analysis and Sensory Evaluation of Bread Produced Using ... analysis of the bread samples revealed that the moisture content ...

  12. Chemical analysis and base-promoted hydrolysis of locally ...

    African Journals Online (AJOL)

    Abstract. The study was on the chemical analysis and base-promoted hydrolysis of extracted shea nut fat. The local method of extraction of the shea nut oil was employed, in comparison with literature reports. A simple cold-process alkali hydrolysis of the shea nut oil was used in producing the soap. The chemical analysis of ...

  13. Instrumental analysis

    International Nuclear Information System (INIS)

    Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok

    1989-02-01

    This book gives a description of electronic engineering, covering circuit elements and devices, circuit analysis and logic digital circuits; methods of electrochemistry such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, including optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control systems, the evaluation of automated analysis and automated analysis systems, with references.

  14. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok

    1989-02-15

    This book gives a description of electronic engineering, covering circuit elements and devices, circuit analysis and logic digital circuits; methods of electrochemistry such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, including optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control systems, the evaluation of automated analysis and automated analysis systems, with references.

  15. An automated image analysis system to measure and count organisms in laboratory microcosms.

    Directory of Open Access Journals (Sweden)

    François Mallard

    Full Text Available 1. Because of recent technological improvements in the performance of computers and digital cameras, the potential use of imaging for contributing to the study of communities, populations or individuals in laboratory microcosms has risen enormously. However, its use has remained limited because of difficulties in automating image analysis. 2. We present an accurate and flexible method of image analysis for detecting, counting and measuring moving particles on a fixed but heterogeneous substrate. This method has been specifically designed to follow individuals, or entire populations, in experimental laboratory microcosms; it can also be used in other applications. 3. The method consists of comparing multiple pictures of the same experimental microcosm in order to generate an image of the fixed background (see the sketch below). This background is then used to extract, measure and count the moving organisms, leaving out the fixed background and the motionless or dead individuals. 4. We provide different examples (springtails, ants, nematodes, daphnia) to show that this non-intrusive method is efficient at detecting organisms under a wide variety of conditions, even on faintly contrasted and heterogeneous substrates. 5. The repeatability and reliability of this method have been assessed using experimental populations of the Collembola Folsomia candida. 6. We present an ImageJ plugin to automate the analysis of digital pictures of laboratory microcosms. The plugin automates the successive steps of the analysis and recursively analyses multiple sets of images, rapidly producing measurements from a large number of replicated microcosms.
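    Point 3 is essentially per-pixel median background estimation followed by subtraction. A minimal NumPy/SciPy sketch of that idea, on synthetic frames (all sizes and thresholds invented for illustration):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(4)
# 15 frames of a fixed, heterogeneous substrate plus camera noise
frames = np.full((15, 100, 100), 80.0) + rng.normal(0, 2, (15, 100, 100))
frames[:, 20:40, 20:40] += 40.0                  # fixed bright patch (substrate)
for k in range(15):                              # one organism moving left to right
    frames[k, 50, 10 + 5 * k : 14 + 5 * k] += 60.0

background = np.median(frames, axis=0)           # moving objects vanish in the median
moving = (frames[7] - background) > 25.0         # detect organisms in frame 7
labels, count = ndimage.label(moving)
sizes = ndimage.sum(moving, labels, np.arange(1, count + 1))
print(f"moving organisms in frame 7: {count}, sizes (px): {sizes}")
```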

  16. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    Full Text Available To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, the DNA was amplified by PCR. The PCR products were placed into the automated genetic analyzer (ABI Prism 310, Perkin Elmer, USA) and subjected to fluorescent scanning with argon-ion laser beams. From the fluorescent signal intensity, the genetic analyzer determined the product size in base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides that alters the original normal locus size) in all patients. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. These studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system; with our newly tested system, microsatellite analysis can be done more cheaply, quickly and easily, and with higher scientific accuracy.

  17. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
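
    As a rough illustration of the pooled-covariance step, the sketch below (an assumption-laden simplification, not the authors' code) computes squared Mahalanobis distances to artery and vein training ROIs and labels each voxel by the smaller distance.

```python
import numpy as np

def separate_arteries_veins(features, art_idx, vein_idx):
    """features: (N, d) per-voxel feature vectors (e.g. contrast arrival
    time, enhancement, correlation coefficients); art_idx/vein_idx index
    the automatically generated artery and vein ROIs."""
    xa, xv = features[art_idx], features[vein_idx]
    mu_a, mu_v = xa.mean(axis=0), xv.mean(axis=0)
    ca, cv = xa - mu_a, xv - mu_v
    # pooled sample covariance of the two ROI classes
    pooled = (ca.T @ ca + cv.T @ cv) / (len(xa) + len(xv) - 2)
    inv = np.linalg.inv(pooled)

    def md2(x, mu):                      # squared Mahalanobis distance
        d = x - mu
        return np.einsum('ij,jk,ik->i', d, inv, d)

    return md2(features, mu_a) < md2(features, mu_v)   # True = artery
```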

  18. Automated analysis of craniofacial morphology using magnetic resonance images.

    Directory of Open Access Journals (Sweden)

    M Mallar Chakravarty

    Full Text Available Quantitative analysis of craniofacial morphology is of interest to scholars working in a wide variety of disciplines, such as anthropology, developmental biology, and medicine. T1-weighted (anatomical) magnetic resonance images (MRI) provide excellent contrast between soft tissues. Given its three-dimensional nature, MRI represents an ideal imaging modality for the analysis of craniofacial structure in living individuals. Here we describe how T1-weighted MR images, acquired to examine brain anatomy, can also be used to analyze facial features. Using a sample of typically developing adolescents from the Saguenay Youth Study (N = 597; 292 male, 305 female; ages 12 to 18 years), we quantified inter-individual variations in craniofacial structure in two ways. First, we adapted existing nonlinear registration-based morphological techniques to iteratively generate a group-wise population average of craniofacial features. The nonlinear transformations were used to map the craniofacial structure of each individual to the population average. Using voxel-wise measures of expansion and contraction, we then examined the effects of sex and age on inter-individual variations in facial features. Second, we employed a landmark-based approach to quantify variations in face surfaces. This approach involves: (a) placing 56 landmarks (forehead, nose, lips, jaw-line, cheekbones, and eyes) on a surface representation of the MRI-based group average; (b) warping the landmarks to the individual faces using the inverse nonlinear transformation estimated for each person; and (c) using a principal components analysis (PCA) of the warped landmarks to identify facial features (i.e. clusters of landmarks that vary in our sample in a correlated fashion). As with the voxel-wise analysis of the deformation fields, we examined the effects of sex and age on the PCA-derived spatial relationships between facial features. Both methods demonstrated significant sexual dimorphism in …
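
    The landmark PCA step can be illustrated in a few lines. This is a generic sketch under assumed shapes (56 landmarks with 3 coordinates per subject), not the study's actual pipeline:

```python
import numpy as np
from sklearn.decomposition import PCA

def facial_feature_components(landmarks, n_components=10):
    """landmarks: (n_subjects, 56, 3) warped landmark coordinates.
    Returns PCA components (clusters of landmarks that vary together),
    explained variance ratios, and per-subject scores for use in
    downstream sex/age analyses."""
    X = landmarks.reshape(len(landmarks), -1)   # flatten to (n_subjects, 168)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)               # PCA centers X internally
    return pca.components_, pca.explained_variance_ratio_, scores
```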

  19. SHEAN (Simplified Human Error Analysis code) and automated THERP

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1993-01-01

    One of the most widely used human error analysis tools is THERP (Technique for Human Error Rate Prediction). Unfortunately, this tool has disadvantages. The Nuclear Regulatory Commission, realizing these drawbacks, commissioned Dr. Swain, the author of THERP, to create a simpler, more consistent tool for deriving human error rates. That effort produced the Accident Sequence Evaluation Program Human Reliability Analysis Procedure (ASEP), which is more conservative than THERP but a valuable screening tool. ASEP involves answering simple questions about the scenario in question and then looking up the appropriate human error rate in the indicated table (THERP also uses look-up tables, but four times as many). The advantages of ASEP are that human factors expertise is not required and that the training to use the method is minimal. Although not originally envisioned by Dr. Swain, the ASEP approach lends itself naturally to computerization. WINCO did just that, calling the code SHEAN, for Simplified Human Error ANalysis. The code was written in Turbo Basic for IBM or IBM-compatible MS-DOS machines, for fast execution. WINCO is now in the process of comparing this code against THERP for various scenarios. This report provides a discussion of SHEAN

  20. Satellite Imagery Analysis for Automated Global Food Security Forecasting

    Science.gov (United States)

    Moody, D.; Brumby, S. P.; Chartrand, R.; Keisler, R.; Mathis, M.; Beneke, C. M.; Nicholaeff, D.; Skillman, S.; Warren, M. S.; Poehnelt, J.

    2017-12-01

    The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes/year of daily high-resolution global coverage imagery. Cloud computing and storage, combined with recent advances in machine learning, are enabling understanding of the world at a scale and at a level of detail never before feasible. We present results from an ongoing effort to develop satellite imagery analysis tools that aggregate temporal, spatial, and spectral information and that can scale with the high rate and dimensionality of the imagery being collected. We focus on the problem of monitoring food crop productivity across the Middle East and North Africa, and show how an analysis-ready, multi-sensor data platform enables quick prototyping of satellite imagery analysis algorithms, from land use/land cover classification and natural resource mapping to yearly and monthly vegetative health change trends at the structural field level.
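
    As one concrete example of a vegetative-health metric that such a platform can compute at scale, the sketch below derives the standard NDVI from red and near-infrared bands and fits a simple per-pixel linear trend; the band inputs and the monthly-trend step are illustrative assumptions, not a description of the authors' pipeline.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index per pixel, in [-1, 1]."""
    nir, red = nir.astype(np.float32), red.astype(np.float32)
    return (nir - red) / (nir + red + eps)

def monthly_trend(ndvi_stack):
    """ndvi_stack: (n_months, H, W). Per-pixel OLS slope (NDVI change per
    month) as a simple change indicator for field-level monitoring."""
    t = np.arange(ndvi_stack.shape[0], dtype=np.float32)
    t -= t.mean()                                    # center the time axis
    y = ndvi_stack - ndvi_stack.mean(axis=0)         # center the series
    return np.tensordot(t, y, axes=1) / (t @ t)      # slope = Σt·y / Σt²
```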

  1. An automated sensitivity analysis procedure for the performance assessment of nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1986-01-01

    To support an effort to make large-scale sensitivity analyses feasible, cost-efficient and quantitatively complete, the authors have developed an automated procedure making use of computer calculus. The procedure, called GRESS (GRadient Enhanced Software System), is embodied in a precompiler that can process Fortran computer codes and add derivative-taking capabilities to the normal calculation scheme. In this paper, the automated GRESS procedure is described and applied to the code UCB-NE-10.2, which simulates the migration through a sorption medium of the radionuclide members of a decay chain. The sensitivity calculations for a sample problem are verified by comparison with analytical and perturbation analysis results. Conclusions are drawn on the applicability of GRESS to more general large-scale sensitivity studies, and the role of such techniques in an overall sensitivity and uncertainty analysis program is discussed
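
    GRESS itself works by precompiling Fortran, but the underlying "computer calculus" idea, propagating derivatives alongside values, can be sketched with a minimal dual-number class. This is illustrative only; the decay function below is a toy stand-in for UCB-NE-10.2, not its actual model.

```python
import math

class Dual:
    """Value augmented with a derivative, so every arithmetic step also
    applies the chain rule (forward-mode computer calculus)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(float(o))
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __neg__(self):
        return Dual(-self.val, -self.der)

def exp(x):
    e = math.exp(x.val)
    return Dual(e, e * x.der)           # d/dp exp(u) = exp(u) * du/dp

# Sensitivity of a toy concentration c = c0 * exp(-lam * t) with respect
# to the decay constant lam (seed derivative d(lam)/d(lam) = 1):
lam, t, c0 = Dual(0.3, 1.0), 2.0, 10.0
c = c0 * exp(-lam * t)
print(c.val, c.der)   # c and dc/dlam = -t * c, matching the analytic result
```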

  2. Automated three-dimensional X-ray analysis using a dual-beam FIB

    International Nuclear Information System (INIS)

    Schaffer, Miroslava; Wagner, Julian; Schaffer, Bernhard; Schmied, Mario; Mulders, Hans

    2007-01-01

    We present a fully automated method for three-dimensional (3D) elemental analysis, demonstrated on a ceramic sample of composition (Ca)MgTiOₓ. The specimen is serially sectioned by a focused ion beam (FIB) microscope, and energy-dispersive X-ray spectrometry (EDXS) is used for elemental analysis of each cross-section created. A 3D elemental model is reconstructed from the stack of two-dimensional (2D) data. This work concentrates on issues arising from process automation, the large sample volume of approximately 17 × 17 × 10 μm³, and the insulating nature of the specimen. A new routine for post-acquisition correction of different drift effects is demonstrated. Furthermore, it is shown that EDXS data may be erroneous for specimens containing voids, and that back-scattered electron images have to be used to correct for these errors

  3. Thermally emissive sensing materials for chemical spectroscopy analysis

    Science.gov (United States)

    Poole, Zsolt; Ohodnicki, Paul R.

    2018-05-08

    A sensor using thermally emissive materials for chemical spectroscopy analysis includes an emissive material, wherein the emissive material comprises thermally emissive materials that emit electromagnetic radiation which is modified by the chemical composition of an environment, and a detector adapted to detect the electromagnetic radiation, wherein the radiation is indicative of chemical interaction changes and hence of the chemical composition and/or chemical composition changes of the environment. The emissive material can be utilized with an optical fiber sensor, the optical fiber sensor operating without the emissive material being probed by a light source external to the material.

  4. Automated analysis of plethysmograms for functional studies of hemodynamics

    Science.gov (United States)

    Zatrudina, R. Sh.; Isupov, I. B.; Gribkov, V. Yu.

    2018-04-01

    The most promising method for the quantitative determination of cardiovascular tone indicators and of cerebral hemodynamics indicators is impedance plethysmography. Accurate determination of these indicators requires the correct identification of the characteristic points in the thoracic and cranial impedance plethysmograms, respectively. An algorithm for automatic analysis of these plethysmograms is presented. The algorithm is based on the strict temporal relationships between the phases of the cardiac cycle and the characteristic points of the plethysmogram. The proposed algorithm requires neither estimation of initial data nor selection of processing parameters. Use of the method on healthy subjects showed a very low detection error for the characteristic points.
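
    A minimal version of such timing-constrained point detection might look like the sketch below; the search-window offsets after the R peak and the crude R-peak detector are assumed placeholder values, not those of the authors' algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

def systolic_points(pleth, ecg, fs, window=(0.09, 0.30)):
    """Find one characteristic point (the systolic maximum) per cardiac
    cycle by searching the plethysmogram only inside a fixed window after
    each R peak; window offsets are in seconds and purely illustrative."""
    r_peaks, _ = find_peaks(ecg, distance=int(0.4 * fs))  # crude R detector
    pts = []
    for r in r_peaks:
        lo, hi = r + int(window[0] * fs), r + int(window[1] * fs)
        if hi <= len(pleth):
            pts.append(lo + int(np.argmax(pleth[lo:hi])))
    return np.array(pts)
```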

  5. A Neural Network Based Workstation for Automated Cell Proliferation Analysis

    Science.gov (United States)

    2001-10-25

    The system performs automated cell proliferation analysis of cytological microscope images; its software assists the expert biotechnologist during cell proliferation analysis. (Authors affiliated with the Centro de Instrumentos, UNAM, and the Instituto de Biotecnología, UNAM, México; work supported by the Programa de Apoyo a Proyectos de Desarrollo e Investigación en Informática REDII 2000, with thanks to Blanca Itzel Taboada.) …

  6. The impact of air pollution on the level of micronuclei measured by automated image analysis

    Czech Academy of Sciences Publication Activity Database

    Rössnerová, Andrea; Špátová, Milada; Rossner, P.; Solanský, I.; Šrám, Radim

    2009-01-01

    Vol. 669, No. 1-2 (2009), pp. 42-47. ISSN 0027-5107. R&D Projects: GA AV ČR 1QS500390506; GA MŠk 2B06088; GA MŠk 2B08005. Institutional research plan: CEZ:AV0Z50390512. Keywords: micronuclei * binucleated cells * automated image analysis. Subject RIV: DN - Health Impact of the Environment Quality. Impact factor: 3.556, year: 2009

  7. CloVR-Comparative: automated, cloud-enabled comparative microbial genome sequence analysis pipeline

    OpenAIRE

    Agrawal, Sonia; Arze, Cesar; Adkins, Ricky S.; Crabtree, Jonathan; Riley, David; Vangala, Mahesh; Galens, Kevin; Fraser, Claire M.; Tettelin, Hervé; White, Owen; Angiuoli, Samuel V.; Mahurkar, Anup; Fricke, W. Florian

    2017-01-01

    Background The benefit of increasing genomic sequence data to the scientific community depends on easy-to-use, scalable bioinformatics support. CloVR-Comparative combines commonly used bioinformatics tools into an intuitive, automated, and cloud-enabled analysis pipeline for comparative microbial genomics. Results CloVR-Comparative runs on annotated complete or draft genome sequences that are uploaded by the user or selected via a taxonomic tree-based user interface and downloaded from NCBI. ...

  8. PLAYER POSITION DETECTION AND MOVEMENT PATTERN RECOGNITION FOR AUTOMATED TACTICAL ANALYSIS IN BADMINTON

    OpenAIRE

    KOKUM GAYANATH WEERATUNGA

    2018-01-01

    This thesis documents the development of a comprehensive approach to automate badminton tactical analysis. First, a computer algorithm was developed to automatically track badminton players moving on a court. Next, a machine learning algorithm was developed to analyse these movements and understand their underlying tactical implications. Both algorithms were tested and validated using video footage recorded at International badminton tournaments. The results demonstrate that the combination o...

  9. Automated analysis of Physarum network structure and dynamics

    Science.gov (United States)

    Fricker, Mark D.; Akita, Dai; Heaton, Luke LM; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-06-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015.
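
    The graph encoding described above, veins as edges carrying their physical dimensions, can be sketched as follows; the vein tuples and the Poiseuille-like conductance weight are illustrative choices, not the downloadable package's actual API.

```python
import networkx as nx

def vein_graph(veins):
    """veins: iterable of (node_u, node_v, length, mean_radius) segments
    extracted from the single-pixel-wide skeleton."""
    g = nx.Graph()
    for u, v, length, radius in veins:
        g.add_edge(u, v, length=length, radius=radius,
                   conductance=radius**4 / length)  # laminar-flow weight
    return g

g = vein_graph([(0, 1, 120.0, 4.2), (1, 2, 80.0, 3.1), (0, 2, 150.0, 2.0)])
A = nx.to_numpy_array(g, weight='conductance')  # weighted adjacency matrix
```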

  10. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    Dharmavaram, S.; Mount, J.B.; Donahue, B.A.

    1990-01-01

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs of various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in the C language on an IBM-compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis of minimization of the Army's six most important hazardous waste streams: solvents, paint stripping wastes, metal plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs of minimization alternatives for any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented at more than 60 Army installations in the United States
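
    The core comparison such a model performs, the life-cycle cost of an alternative versus current practice, reduces to discounted cash-flow arithmetic. The sketch below is a generic illustration with made-up figures and discount rate, not EAHWM's actual cost model:

```python
def life_cycle_cost(capital, annual_om, years, rate=0.07):
    """Present value: capital cost plus the discounted annual O&M stream."""
    annuity = (1 - (1 + rate) ** -years) / rate   # present-value factor
    return capital + annual_om * annuity

# hypothetical comparison: current practice vs. a solvent-recovery unit
current  = life_cycle_cost(capital=0,       annual_om=80_000, years=10)
recovery = life_cycle_cost(capital=150_000, annual_om=35_000, years=10)
print(f"Solvent recovery saves ${current - recovery:,.0f} over 10 years")
```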

  11. Automated analysis of Physarum network structure and dynamics

    International Nuclear Information System (INIS)

    Fricker, Mark D; Heaton, Luke LM; Akita, Dai; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-01-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015. (paper)

  12. Chemical and thermal analysis for characterisation of building materials

    International Nuclear Information System (INIS)

    Kumar, S.C.; Sudersanan, M.; Ravindran, P.V.; Kalekar, B.B.; Mathur, P.K.

    2000-01-01

    Cement and other construction materials are extensively used in the construction of shields against nuclear and high-energy radiation. The design and optimum utilisation of such materials require an accurate analysis of their chemical composition. The moisture content and the presence of bound water and other volatile materials are also important. The use of thermal analysis supplements the data obtained by chemical analysis and enables a distinction between moisture and chemically bound water. It also enables identification of the processes leading to the loss on ignition. The work carried out on the analysis of sand, cement and other aggregate materials used for the preparation of concrete is described in this paper. (author)

  13. Chemical Diversity, Origin, and Analysis of Phycotoxins

    DEFF Research Database (Denmark)

    Rasmussen, Silas Anselm; Andersen, Aaron John Christian; Andersen, Nikolaj Gedsted

    2016-01-01

    …, yessotoxins, azaspiracids, brevetoxins, and pinnatoxins. Other toxins, such as ciguatoxins and maitotoxins, accumulate in fish where, as is the case for the latter compounds, they can be metabolized to even more toxic metabolites. On the other hand, much less is known about the chemical nature of compounds …

  14. Evaluation of a novel automated water analyzer for continuous monitoring of toxicity and chemical parameters in municipal water supply.

    Science.gov (United States)

    Bodini, Sergio F; Malizia, Marzio; Tortelli, Annalisa; Sanfilippo, Luca; Zhou, Xingpeng; Arosio, Roberta; Bernasconi, Marzia; Di Lucia, Stefano; Manenti, Angela; Moscetta, Pompeo

    2018-08-15

    A novel tool, the DAMTA analyzer (Device for Analytical Monitoring and Toxicity Assessment), designed for fully automated toxicity measurements based on luminescent bacteria as well as for concomitant determination of chemical parameters, was developed and field-tested. The instrument is a robotic water analyzer equipped with a luminometer and a spectrophotometer, integrated on a thermostated reaction plate which contains a movable carousel with 80 cuvettes. Acute toxicity is measured on-line using a wild-type Photobacterium phosphoreum strain with measurable bioluminescence and unaltered sensitivity to toxicants lasting up to ten days. The EC50 values of the reference compounds tested were consistent with A. fischeri and P. phosphoreum international standards and comparable to previously published data. Concurrently, a laboratory trial demonstrated the feasibility of using the analyzer for the determination of nutrients and metals in parallel with the toxicity measurements. In a prolonged test, the system was installed, in toxicity-only mode, at the premises of the World Fair "Expo Milano 2015", a high-security site, to ensure the quality of the supplied drinking water. The monitoring program lasted for six months, during which ca. 2400 toxicity tests were carried out; the results indicated a mean non-toxic outcome of -5.5 ± 6.2%. To verify the system's robustness in detecting toxic substances, Zn was measured daily, with highly reproducible inhibition results (70.8 ± 13.6%). These results indicate that this novel toxicity monitor can be used as an early-warning system for the protection of drinking water sources during emergencies involving low-probability/high-impact contamination events in source water or treated water. Copyright © 2018 Elsevier Inc. All rights reserved.
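
    The toxicity endpoint reported above is essentially a relative luminescence loss. A minimal sketch of that calculation (generic, not the DAMTA firmware; the example readings are made up):

```python
def percent_inhibition(lum_sample, lum_control):
    """Acute toxicity as the relative light loss of the test bacteria
    versus a non-toxic control after the exposure period; negative values
    (as in the -5.5% mean above) indicate slight stimulation."""
    return 100.0 * (lum_control - lum_sample) / lum_control

print(percent_inhibition(lum_sample=29.2, lum_control=100.0))  # ~70.8, Zn-spike level
```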

  15. Sensitivity analysis of predictive models with an automated adjoint generator

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.

    1987-01-01

    The adjoint method is a well-established sensitivity analysis methodology that is particularly efficient in large-scale modeling problems. The coefficients of sensitivity of a given response with respect to every parameter involved in the modeling code can be calculated from the solution of a single adjoint run of the code. Sensitivity coefficients provide a quantitative measure of the importance of the model data in calculating the final results. The major drawback of the adjoint method is the requirement to calculate a very large number of partial derivatives to set up the adjoint equations of the model. ADGEN is a software system designed to eliminate this drawback and automatically implement the adjoint formulation in computer codes. The ADGEN system is described, and its use for improving performance assessments and predictive simulations is discussed. 8 refs., 1 fig
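
    The efficiency claim (all parameter sensitivities from a single adjoint run) follows from the standard adjoint identity, sketched here in generic notation rather than ADGEN's internal formulation:

```latex
% Model constraint F(u,p) = 0 with state u and parameters p; response R(u,p).
% One adjoint solve for \lambda yields dR/dp for every parameter at once:
\left(\frac{\partial F}{\partial u}\right)^{\!\mathsf{T}} \lambda
  = \left(\frac{\partial R}{\partial u}\right)^{\!\mathsf{T}},
\qquad
\frac{\mathrm{d}R}{\mathrm{d}p}
  = \frac{\partial R}{\partial p} - \lambda^{\mathsf{T}}\,\frac{\partial F}{\partial p}.
```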

  16. Learning Methods for Dynamic Topic Modeling in Automated Behavior Analysis.

    Science.gov (United States)

    Isupova, Olga; Kuzin, Danil; Mihaylova, Lyudmila

    2017-09-27

    Semisupervised and unsupervised systems provide operators with invaluable support and can tremendously reduce the operators' load. Given the need to process large volumes of video data and provide autonomous decisions, this paper proposes new learning algorithms for activity analysis in video. The activities and behaviors are described by a dynamic topic model. Two novel learning algorithms, based on the expectation-maximization approach and on variational Bayes inference, are proposed. Theoretical derivations of the posterior estimates of the model parameters are given. The designed learning algorithms are compared with the Gibbs sampling inference scheme introduced earlier in the literature. A detailed comparison of the learning algorithms is presented on real video data. We also propose an anomaly localization procedure, elegantly embedded in the topic modeling framework. It is shown that the developed learning algorithms can achieve a 95% success rate. The proposed framework can be applied to a number of areas, including transportation systems, security, and surveillance.

  17. Toward the automated analysis of plasma physics problems

    International Nuclear Information System (INIS)

    Mynick, H.E.

    1989-04-01

    A program (CALC) is described which carries out nontrivial plasma physics calculations in a manner intended to emulate the approach of a human theorist. This includes the initial process of gathering the relevant equations from a plasma knowledge base and then determining how to solve them. Solution of the sets of equations governing physics problems, which in general have a nonuniform, irregular structure not amenable to solution by standardized algorithmic procedures, is facilitated by an analysis of the structure of the equations and the relations among them. This often permits decomposition of the full problem into subproblems, and other simplifications in form, which render the resultant subsystems soluble by more standardized tools. CALC's operation is illustrated by a detailed description of its treatment of a sample plasma calculation. 5 refs., 3 figs.

  18. Automated analysis of autoradiographic grains density with scanning microspectrophotometer

    International Nuclear Information System (INIS)

    Han Pingrong

    1988-01-01

    Mouse ascites tumour cells were used as the specimen, and ³H-thymidine was used as the tracer. The smears were left unstained or were stained with eosin or Giemsa-Wright. Automatic analysis of the autoradiographic silver grains was performed with a UNIVAR-MSPM scanning microspectrophotometer. The relationship between integrated optical density (IOD) and silver grain density (SGD) in cell nuclei was studied. The results showed that the IOD could basically reflect the SGD, with correlation coefficients of 0.922, 0.9118 and 0.6218. Since the Giemsa-Wright-stained cell nuclei appeared blue, their IOD was strongly influenced, whereas the eosin-stained cell nuclei appeared light red and their IOD was influenced only slightly; the latter stain is therefore recommended. This method could be used for studying DNA synthesis and cell proliferation kinetics.

  19. Automated analysis technique developed for detection of ODSCC on the tubes of OPR1000 steam generator

    International Nuclear Information System (INIS)

    Kim, In Chul; Nam, Min Woo

    2013-01-01

    A steam generator (SG) tube is an important component of a nuclear power plant (NPP). It works as a pressure boundary between the primary and secondary systems. The integrity of an SG tube can be assessed by an eddy current test at every outage. The eddy current technique (adopting a bobbin probe) is currently the main technique used to assess the integrity of the tubing of a steam generator. An eddy current signal analyst for steam generator tubes continuously analyzes data over a given period of time. However, the analyst conducting the test may tire and make mistakes, such as missing indications or being unable to separate a true defect signal from a more complicated one. Such errors could lead to confusion and an improper interpretation of the signal analysis. To avoid these possibilities, many countries have opted for automated analysis. Axial ODSCC (outside-diameter stress corrosion cracking) defects on the tubes of OPR1000 steam generators have been found on tubes that are in contact with tube support plates. In this study, automated analysis software called CDS (Computer Data Screening), made by Zetec, was used. This paper discusses the results of introducing an automated analysis system for axial ODSCC on the tubes of an OPR1000 steam generator.

  20. Automated analysis of free speech predicts psychosis onset in high-risk youths

    Science.gov (United States)

    Bedi, Gillinder; Carrillo, Facundo; Cecchi, Guillermo A; Slezak, Diego Fernández; Sigman, Mariano; Mota, Natália B; Ribeiro, Sidarta; Javitt, Daniel C; Copelli, Mauro; Corcoran, Cheryl M

    2015-01-01

    Background/Objectives: Psychiatry lacks the objective clinical tests routinely used in other specializations. Novel computerized methods to characterize complex behaviors such as speech could be used to identify and predict psychiatric illness in individuals. Aims: In this proof-of-principle study, our aim was to test automated speech analysis combined with machine learning to predict later psychosis onset in youths at clinical high risk (CHR) for psychosis. Methods: Thirty-four CHR youths (11 females) had baseline interviews and were assessed quarterly for up to 2.5 years; five transitioned to psychosis. Using automated analysis, transcripts of interviews were evaluated for semantic and syntactic features predicting later psychosis onset. Speech features were fed into a convex-hull classification algorithm with leave-one-subject-out cross-validation to assess their predictive value for psychosis outcome. The canonical correlation between the speech features and prodromal symptom ratings was computed. Results: Derived speech features included a Latent Semantic Analysis measure of semantic coherence and two syntactic markers of speech complexity: maximum phrase length and use of determiners (e.g., "which"). These speech features predicted later psychosis development with 100% accuracy, outperforming classification from clinical interviews. Speech features were significantly correlated with prodromal symptoms. Conclusions: Findings support the utility of automated speech analysis to measure subtle, clinically relevant mental state changes in emergent psychosis. Recent developments in computer science, including natural language processing, could provide the foundation for future development of objective clinical tests for psychiatry. PMID:27336038
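
    A semantic-coherence feature of the kind described can be approximated with off-the-shelf tools: embed sentences with TF-IDF plus truncated SVD (an LSA variant) and track the cosine similarity of consecutive sentences. This is a plausible reconstruction under stated assumptions, not the authors' exact pipeline:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def min_semantic_coherence(sentences, dim=50):
    """Minimum LSA cosine similarity between consecutive sentences of a
    transcript; low values flag loose semantic flow. Needs >= 2 sentences;
    the embedding dimension is an assumed default."""
    tfidf = TfidfVectorizer().fit_transform(sentences)
    k = max(1, min(dim, tfidf.shape[1] - 1, len(sentences) - 1))
    vecs = TruncatedSVD(n_components=k).fit_transform(tfidf)
    sims = cosine_similarity(vecs[:-1], vecs[1:]).diagonal()  # sim(i, i+1)
    return float(sims.min())
```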

  1. Application of Automated Facial Expression Analysis and Qualitative Analysis to Assess Consumer Perception and Acceptability of Beverages and Water

    OpenAIRE

    Crist, Courtney Alissa

    2016-01-01

    Sensory and consumer sciences aim to understand the influences on product acceptability and purchase decisions. The food industry measures product acceptability through hedonic testing but often does not assess implicit or qualitative responses. Incorporation of qualitative research and automated facial expression analysis (AFEA) may supplement hedonic acceptability testing to provide product insights. The purpose of this research was to assess the application of AFEA and qualitative analysis ...

  2. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extraction and visualization of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images … with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. … Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82 to 84%.

  3. ADGEN: a system for automated sensitivity analysis of predictive models

    International Nuclear Information System (INIS)

    Pin, F.G.; Horwedel, J.E.; Oblow, E.M.; Lucius, J.L.

    1987-01-01

    A system that can automatically enhance computer codes with a sensitivity calculation capability is presented. With this new system, named ADGEN, rapid and cost-effective calculation of sensitivities can be performed in any FORTRAN code for all input data or parameters. The resulting sensitivities can be used in performance assessment studies related to licensing or interactions with the public to systematically and quantitatively prove the relative importance of each of the system parameters in calculating the final performance results. A general procedure calling for the systematic use of sensitivities in assessment studies is presented. The procedure can be used in modeling and model validation studies to avoid over-modeling, in site characterization planning to avoid over-collection of data, and in performance assessments to determine the uncertainties on the final calculated results. The added capability to formally perform the inverse problem, i.e., to determine the input data or parameters on which to focus additional research or analysis effort in order to improve the uncertainty of the final results, is also discussed. 7 references, 2 figures

  4. ADGEN: a system for automated sensitivity analysis of predictive models

    International Nuclear Information System (INIS)

    Pin, F.G.; Horwedel, J.E.; Oblow, E.M.; Lucius, J.L.

    1986-09-01

    A system that can automatically enhance computer codes with a sensitivity calculation capability is presented. With this new system, named ADGEN, rapid and cost-effective calculation of sensitivities can be performed in any FORTRAN code for all input data or parameters. The resulting sensitivities can be used in performance assessment studies related to licensing or interactions with the public to systematically and quantitatively prove the relative importance of each of the system parameters in calculating the final performance results. A general procedure calling for the systematic use of sensitivities in assessment studies is presented. The procedure can be used in modelling and model validation studies to avoid "over modelling," in site characterization planning to avoid "over collection of data," and in performance assessment to determine the uncertainties on the final calculated results. The added capability to formally perform the inverse problem, i.e., to determine the input data or parameters on which to focus additional research or analysis effort in order to improve the uncertainty of the final results, is also discussed

  5. Automated region selection for analysis of dynamic cardiac SPECT data

    Science.gov (United States)

    Di Bella, E. V. R.; Gullberg, G. T.; Barclay, A. B.; Eisner, R. L.

    1997-06-01

    Dynamic cardiac SPECT using Tc-99m labeled teboroxime can provide kinetic parameters (washin, washout) indicative of myocardial blood flow. A time-consuming and subjective step of the data analysis is drawing regions of interest to delineate blood pool and myocardial tissue regions. The time-activity curves of the regions are then used to estimate local kinetic parameters. In this work, the appropriate regions are found automatically, in a manner similar to that used for calculating maximum count circumferential profiles in conventional static cardiac studies. The drawbacks to applying standard static circumferential profile methods are the high noise level and high liver uptake common in dynamic teboroxime studies. Searching along each ray for maxima to locate the myocardium does not typically provide useful information. Here we propose an iterative scheme in which constraints are imposed on the radii searched along each ray. The constraints are based on the shape of the time-activity curves of the circumferential profile members and on an assumption that the short axis slices are approximately circular. The constraints eliminate outliers and help to reduce the effects of noise and liver activity. Kinetic parameter estimates from the automatically generated regions were comparable to estimates from manually selected regions in dynamic canine teboroxime studies.

  6. Unsupervised EEG analysis for automated epileptic seizure detection

    Science.gov (United States)

    Birjandtalab, Javad; Pouyan, Maziyar Baran; Nourani, Mehrdad

    2016-07-01

    Epilepsy is a neurological disorder which, if not controlled, can cause unexpected death. It is crucial to have accurate automatic pattern recognition and data mining techniques to detect the onset of seizures and inform caregivers so they can help the patients. EEG signals are the preferred biosignals for diagnosis of epileptic patients. Most of the existing pattern recognition techniques used in EEG analysis rely on supervised machine learning algorithms. Since seizure data are heavily under-represented, such techniques are not always practical, particularly when labeled data are not sufficiently available or when disease progression is rapid and the corresponding EEG footprint pattern is not robust. Furthermore, EEG pattern change is highly individual-dependent, and annotating seizure and non-seizure events requires experienced specialists. In this work, we present an unsupervised technique to discriminate seizure from non-seizure events. We employ the power spectral density of EEG signals in different frequency bands as informative features to accurately cluster seizure and non-seizure events. The experimental results so far indicate more than 90% accuracy in clustering seizure and non-seizure events without any prior knowledge of the patient's history.
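
    A minimal unsupervised pipeline in that spirit, band-power features from a Welch PSD followed by two-cluster k-means, might look like the sketch below; the band edges and parameters are conventional choices, not necessarily the paper's.

```python
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

BANDS = {'delta': (1, 4), 'theta': (4, 8), 'alpha': (8, 13),
         'beta': (13, 30), 'gamma': (30, 45)}

def band_power_features(epochs, fs):
    """epochs: (n_epochs, n_samples) single-channel EEG windows.
    Returns log power per epoch in each classical frequency band."""
    f, psd = welch(epochs, fs=fs, nperseg=min(256, epochs.shape[1]), axis=1)
    cols = [psd[:, (f >= lo) & (f < hi)].sum(axis=1)
            for lo, hi in BANDS.values()]
    return np.log(np.column_stack(cols) + 1e-12)

def cluster_epochs(epochs, fs):
    """Two clusters: putative seizure vs non-seizure epochs, no labels."""
    feats = band_power_features(epochs, fs)
    return KMeans(n_clusters=2, n_init=10).fit_predict(feats)
```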

  7. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    International Nuclear Information System (INIS)

    Genebes, Caroline; Filleron, Thomas; Graff, Pierre; Jonca, Frédéric; Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard; Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc

    2013-01-01

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analyses were used to identify independent predictors of bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of the dose delivered to 90% of the prostate volume (D90) and the volume of prostate receiving 100% of the prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes

  8. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    Energy Technology Data Exchange (ETDEWEB)

    Genebes, Caroline, E-mail: genebes.caroline@claudiusregaud.fr [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Filleron, Thomas; Graff, Pierre [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Jonca, Frédéric [Department of Urology, Clinique Ambroise Paré, Toulouse (France); Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard [Department of Urology and Andrology, CHU Rangueil, Toulouse (France); Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France)

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analyses were used to identify independent predictors of bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of the dose delivered to 90% of the prostate volume (D90) and the volume of prostate receiving 100% of the prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  9. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    Science.gov (United States)

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts' time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conducted a large study of basic surgical skill assessment on a dataset that contained video and accelerometer data for suturing and knot-tying tasks. We introduce entropy-based features, approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time-series data. The proposed features are compared to the existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform for surgical skills assessment. We report the average performance of the different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1% and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system could significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
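
    Approximate entropy has a compact textbook definition, reproduced below; the embedding dimension and tolerance are standard defaults and may differ from the paper's settings.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) of a 1-D motion time series: lower values mean more
    regular, predictable fluctuations (e.g. smoother tool motion)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()               # tolerance scaled to signal spread

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])      # m-embeddings
        # Chebyshev distance between every pair of embedding vectors
        dist = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        return np.log((dist <= r).mean(axis=1)).mean()      # match fractions

    return phi(m) - phi(m + 1)
```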

  10. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    Directory of Open Access Journals (Sweden)

    Kevin A. Huck

    2008-01-01

    Full Text Available The integration of scalable performance analysis into parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge for managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules and the increasing use of performance metadata.

  11. Recording and automated analysis of naturalistic bioptic driving.

    Science.gov (United States)

    Luo, Gang; Peli, Eli

    2011-05-01

    People with moderate central vision loss are legally permitted to drive with a bioptic telescope in 39 US states and the Netherlands, but the safety of bioptic driving remains highly controversial. There is no scientific evidence about bioptic use and its impact on safety. We propose searching for evidence by recording naturalistic driving activities in patients' cars. In a pilot study we used an analogue video system to record two bioptic drivers' daily driving activities for 10 and 5 days, respectively. In this technical report, we also describe our novel digital system, which collects vehicle manoeuvre information and enables recording over more extended periods, and discuss our approach to analyzing the vast amount of data. Our observations of telescope use by the pilot subjects were quite different from their reports in a previous survey. One subject used the telescope only seven times in nearly 6 h of driving. For the other subject, the average interval between telescope uses was about 2 min, and mobile (cell) phone use in one trip extended the interval to almost 5 min. We demonstrate that computerized analysis of lengthy recordings based on video, GPS, acceleration, and black box data can be used to select informative segments for efficient off-line review of naturalistic driving behaviours. The inconsistency between self-reports and objective data, as well as infrequent telescope use, underscores the importance of recording bioptic driving behaviours in naturalistic conditions over extended periods. We argue that the new recording system is important for understanding bioptic use behaviours and bioptic driving safety. © 2011 The College of Optometrists.

  12. Analysis of chemical constituents in Cistanche species.

    Science.gov (United States)

    Jiang, Yong; Tu, Peng-Fei

    2009-03-13

    Species of the genus Cistanche (Rou Cong Rong in Chinese) are perennial parasitic herbs, mainly distributed in arid lands and warm deserts. As a superior tonic for the treatment of kidney deficiency, impotence, female infertility, morbid leucorrhea, profuse metrorrhagia and senile constipation, Cistanche herbs have earned the honor of "Ginseng of the desert". Recently, there has been increasing scientific attention on Herba Cistanche for its remarkable bioactivities, including antioxidation, neuroprotection, and anti-aging. The chemical constituents of Cistanche plants mainly include volatile oils and non-volatile phenylethanoid glycosides (PhGs), iridoids, lignans, alditols, oligosaccharides and polysaccharides. Pharmacological studies show that PhGs are the main active components for curing kidney deficiency, antioxidation and neuroprotection; galactitol and oligosaccharides are the representatives for the treatment of senile constipation, while polysaccharides are responsible for improving body immunity. In this paper, advances in the chemical constituents of Cistanche plants and their corresponding analyses are reviewed.

  13. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  14. Semi-automated volumetric analysis of artificial lymph nodes in a phantom study

    International Nuclear Information System (INIS)

    Fabel, M.; Biederer, J.; Jochens, A.; Bornemann, L.; Soza, G.; Heller, M.; Bolte, H.

    2011-01-01

    Purpose: Quantification of tumour burden in oncology requires accurate and reproducible image evaluation. The current standard is one-dimensional measurement (e.g. RECIST), with inherent disadvantages. Volumetric analysis is discussed as an alternative for therapy monitoring of lung and liver metastases. The aim of this study was to investigate the accuracy of semi-automated volumetric analysis of artificial lymph node metastases in a phantom study. Materials and methods: Fifty artificial lymph nodes were produced in a size range from 10 to 55 mm, some of them enhanced using iodine contrast media. All nodules were placed in an artificial chest phantom (artiCHEST®) within different surrounding tissues. MDCT was performed using different collimations (1–5 mm) at varying reconstruction kernels (B20f, B40f, B60f). Volume and RECIST measurements were performed using Oncology Software (Siemens Healthcare, Forchheim, Germany) and were compared to the reference volume and diameter by calculating absolute percentage errors. Results: The software allowed robust volumetric analysis in a phantom setting. Unsatisfactory segmentation results were frequently found for native nodules within surrounding muscle. The absolute percentage error (APE) for volumetric analysis varied between 0.01% and 225%. No significant differences were seen between the different reconstruction kernels. The most unsatisfactory segmentation results occurred at higher slice thicknesses (4 and 5 mm). Contrast-enhanced lymph nodes tended to show better segmentation results. Conclusion: The semi-automated 3D volumetric analysis software tool allows reliable and convenient segmentation of artificial lymph nodes in a phantom setting. Lymph nodes adjacent to tissue of similar density cause segmentation problems. For volumetric analysis of lymph node metastases in clinical routine, a slice thickness of ≤3 mm and a medium-soft reconstruction kernel (e.g. B40f for Siemens scan systems) may be a suitable …

  15. Automation of dimethylation after guanidination labeling chemistry and its compatibility with common buffers and surfactants for mass spectrometry-based shotgun quantitative proteome analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang, E-mail: Liang.Li@ualberta.ca

    2013-07-25

    Highlights: • Dimethylation after guanidination (2MEGA) uses inexpensive reagents for isotopic labeling of peptides. • 2MEGA can be optimized and automated for labeling peptides with high efficiency. • 2MEGA is compatible with several commonly used cell lysis and protein solubilization reagents. • The automated 2MEGA labeling method can be used to handle a variety of protein samples for relative proteome quantification. Abstract: Isotope labeling liquid chromatography–mass spectrometry (LC–MS) is a major analytical platform for quantitative proteome analysis. Incorporation of the isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol for handling biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention, and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. Better than 94% of the desired labeling was obtained under all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation of the peptide amines. This work illustrates that the automated 2MEGA labeling process can handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis.

  16. Design and Demonstration of Automated Data Analysis Algorithms for Ultrasonic Inspection of Complex Composite Panels with Bonds

    Science.gov (United States)

    2016-02-01

    Fragmentary abstract (extraction artifact): To reduce the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms were designed and demonstrated using time-of-flight indications and thickness and backwall C-scan images. The ADA-called indications are grouped into three categories: true positives (TP), missed calls (MC) and false calls (FC), with indication position error also noted. Subject terms: automated data analysis (ADA) algorithms; time-of-flight indications; backwall amplitude dropout.
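
    A common way to produce the TP/MC/FC grouping described above is to match algorithm indications to ground-truth flaw positions within a distance tolerance. A minimal sketch, assuming simple 2D coordinates and a hypothetical tolerance; none of these names or values come from the report:

        from math import hypot

        def classify_indications(ada_calls, truth, tol=5.0):
            """Greedily match ADA indications to ground-truth flaws within `tol`
            (same units as the coordinates). Returns (TP, MC, FC) lists."""
            unmatched_truth = list(truth)
            tp, fc = [], []
            for call in ada_calls:
                match = next((t for t in unmatched_truth
                              if hypot(call[0] - t[0], call[1] - t[1]) <= tol), None)
                if match is not None:
                    unmatched_truth.remove(match)
                    tp.append(call)
                else:
                    fc.append(call)          # indication with no nearby real flaw
            return tp, unmatched_truth, fc   # remaining truths are missed calls

        tp, mc, fc = classify_indications([(10, 12), (40, 5)], [(11, 12), (80, 80)])
        print(len(tp), len(mc), len(fc))     # 1 TP, 1 MC, 1 FC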

  17. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Science.gov (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
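
    PABAK, the agreement statistic used here, is straightforward to compute: it rescales the observed proportion of agreement, sidestepping the prevalence and bias problems of Cohen's kappa. A minimal sketch, with made-up rating vectors standing in for the 10-sec epoch classifications:

        import numpy as np

        def pabak(rater_a, rater_b):
            """Prevalence-adjusted bias-adjusted kappa for two binary raters:
            PABAK = 2 * p_o - 1, where p_o is the observed agreement."""
            a, b = np.asarray(rater_a), np.asarray(rater_b)
            p_o = np.mean(a == b)
            return 2.0 * p_o - 1.0

        manual    = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # hypothetical epoch labels
        automated = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
        print(round(pabak(manual, automated), 2))   # 0.8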

  18. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images

    International Nuclear Information System (INIS)

    Laak, Jeroen A.W.M. van der; Dijkman, Henry B.P.M.; Pahlplatz, Martin M.M.

    2006-01-01

    The magnification factor in transmission electron microscopy is not very precise, hampering, for instance, quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier-transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnifications ranging from 1,000x to 200,000x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of the automated measurement was considerably smaller (average 0.28%) than that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy.
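
    The core of such a procedure is locating the dominant spatial frequency of the line pattern in the power spectrum. A minimal 1D sketch, assuming a digitized intensity profile taken across the replica lines (the synthetic profile below is illustrative only):

        import numpy as np

        def line_spacing_pixels(profile):
            """Estimate the period (in pixels) of a repeating line pattern
            from the peak of its Fourier power spectrum."""
            profile = profile - np.mean(profile)       # suppress the DC term
            power = np.abs(np.fft.rfft(profile)) ** 2
            freqs = np.fft.rfftfreq(len(profile))      # cycles per pixel
            k = np.argmax(power[1:]) + 1               # skip the zero-frequency bin
            return 1.0 / freqs[k]

        # Synthetic replica profile: lines every 25 pixels plus noise.
        x = np.arange(2048)
        profile = np.sin(2 * np.pi * x / 25.0) + 0.2 * np.random.randn(x.size)
        print(round(line_spacing_pixels(profile), 1))  # ~25.0

    Given the replica's known true line spacing, the pixel size, and hence the magnification, follows directly from the estimated period.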

  19. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK), including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.

  20. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection tool. The spot detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.
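
    Spot detection of this kind can be prototyped with standard image analysis libraries. A minimal sketch using scikit-image's Laplacian-of-Gaussian blob detector to count bright colony-like spots in a fluorescence micrograph; the file name and size thresholds are placeholders, not values from the dataset:

        import numpy as np
        from skimage import io, feature

        # Load a single-channel fluorescence micrograph (placeholder path).
        image = io.imread("infected_cells.tif").astype(float)
        image /= image.max()                     # normalize to [0, 1]

        # Detect bright spots; the sigma bounds set the expected colony radius range.
        blobs = feature.blob_log(image, min_sigma=2, max_sigma=15,
                                 num_sigma=10, threshold=0.1)

        radii = blobs[:, 2] * np.sqrt(2)         # LoG sigma -> approximate radius
        print(f"{len(blobs)} colonies, mean radius {radii.mean():.1f} px")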

  1. Some problems concerning the use of automated radiochemical separation systems in destructive neutron activation analysis

    International Nuclear Information System (INIS)

    Nagy, L.G.; Toeroek, G.

    1977-01-01

    The present state of a long-term program is reviewed. It was started to elaborate a remote-controlled, automated radiochemical processing system for the neutron activation analysis of biological materials. The system is based on wet ashing of the sample followed by reactive desorption of some volatile components. The distillation residue is passed through a series of columns filled with selective ion-screening materials to remove the matrix activity. The solution is thus "stripped" of the interfering radioions, and it is processed to single elements through group separations using ion-exchange chromatographic techniques. Some special problems concerning this system are treated: (a) General aspects of the construction of a (semi)automated radiochemical processing system are discussed. (b) Comparison is made between various technical realizations of the same basic concept. (c) Some problems concerning the "reconstruction" of an already published processing system are outlined. (T.G.)

  2. Controlling the accuracy of chemical analysis

    International Nuclear Information System (INIS)

    Suschny, O.; Danesi, P.R.

    1991-01-01

    The involvement of the IAEA in quantitative analysis began in the early 1960s with radiochemical work connected with the environment. It then expanded to cover analysis (mostly by nuclear techniques) of samples for projects associated with human health, agriculture, hydrology and international safeguards. This article highlights the IAEA activities in the field of quality control in quantitative analysis.

  3. Comprehensive Analysis Competence and Innovative Approaches for Sustainable Chemical Production.

    Science.gov (United States)

    Appel, Joerg; Colombo, Corrado; Dätwyler, Urs; Chen, Yun; Kerimoglu, Nimet

    2016-01-01

    Humanity currently sees itself facing enormous economic, ecological, and social challenges. Sustainable products and production in specialty chemistry are an important strategic element to address these megatrends. In addition, digitalization and global connectivity will create new opportunities for the industry. This paper examines one such aspect: the development of comprehensive analysis of production networks for more sustainable production, from which the need for innovative solutions arises. Examples from data analysis, advanced process control and automated performance monitoring are shown. These efforts have significant impact on improved yields, reduced energy and water consumption, and better product performance in the application of the products.

  4. SAMPO 90 - High resolution interactive gamma spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1991-01-01

    SAMPO 90 is a high performance gamma spectrum analysis program for personal computers. It uses high resolution color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control or by using macros for automated measurement and analysis sequences including the control of MCAs and sample changers. Semi-automated calibrations for peak shapes (Gaussian with exponential tails), detector efficiency, and energy are available with a possibility for user intervention through interactive graphics. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear, non-linear and mixed mode fitting, where the component energies and areas can be either frozen or allowed to float in arbitrary combinations. Nuclide identification is done using associated lines techniques which allow interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. Attenuation corrections can be taken into account in detector efficiency calculation. The most common PC-based MCA spectrum formats (Canberra S100, Ortec ACE, Nucleus PCA, ND AccuSpec) are supported as well as ASCII spectrum files. A gamma-line library is included together with an editor for user configurable libraries. The analysis reports and program parameters are fully customizable. Function key macros can be used to automate the most common analysis procedures. Small batch type modules are additionally available for routine work. SAMPO 90 is a result of over twenty man years of programming and contains 25,000 lines of Fortran, 10,000 lines of C, and 12,000 lines of assembler
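
    The peak model named above, a Gaussian with exponential tails, is the heart of such fitting. A simplified stand-in, sketched in Python rather than the program's Fortran, that fits a plain Gaussian to a single synthetic peak with SciPy; the tails and full multiplet handling are omitted, and all data are invented:

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, area, centroid, fwhm):
            """Gaussian peak parameterized by area, centroid and FWHM."""
            sigma = fwhm / 2.35482          # FWHM = 2*sqrt(2*ln 2)*sigma
            return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(
                -(x - centroid) ** 2 / (2 * sigma ** 2))

        channels = np.arange(100.0)
        counts = gaussian(channels, 5000, 50, 6) + np.random.poisson(20, 100)

        popt, _ = curve_fit(gaussian, channels, counts - np.median(counts),
                            p0=(counts.sum(), channels[np.argmax(counts)], 5.0))
        print(f"area={popt[0]:.0f}, centroid={popt[1]:.2f}, fwhm={popt[2]:.2f}")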

  5. Chemical considerations in severe accident analysis

    International Nuclear Information System (INIS)

    Malinauskas, A.P.; Kress, T.S.

    1988-01-01

    The Reactor Safety Study presented the first systematic attempt to include fission product physicochemical effects in the determination of expected consequences of hypothetical nuclear reactor power plant accidents. At the time, however, the data base was sparse, and the treatment of fission product behavior was not entirely consistent or accurate. Considerable research has since been performed to identify and understand chemical phenomena that can occur in the course of a nuclear reactor accident, and how these phenomena affect fission product behavior. In this report, the current status of our understanding of the chemistry of fission products in severe core damage accidents is summarized and contrasted with that of the Reactor Safety Study

  6. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.
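
    Comet measurements of the kind such tools report typically reduce to simple functions of the comet's intensity profile. A minimal sketch computing percent DNA in tail and an Olive-tail-moment-style statistic from a 1D profile, assuming the head/tail boundary has already been segmented; all values are illustrative and this is not OpenComet's code:

        import numpy as np

        def comet_metrics(profile, head_end):
            """Percent DNA in tail and Olive tail moment from a comet intensity
            profile. `profile` is background-subtracted intensity per pixel
            column; `head_end` is the first column belonging to the tail."""
            profile = np.asarray(profile, dtype=float)
            tail = profile[head_end:]
            pct_tail_dna = 100.0 * tail.sum() / profile.sum()
            x = np.arange(profile.size)
            head_center = np.average(x[:head_end], weights=profile[:head_end])
            tail_center = np.average(x[head_end:], weights=tail)
            olive_tail_moment = (tail_center - head_center) * pct_tail_dna / 100.0
            return pct_tail_dna, olive_tail_moment

        profile = np.concatenate([np.full(20, 50.0), np.linspace(30, 0, 40)])
        print(comet_metrics(profile, head_end=20))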

  7. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Bender, M.A.

    1976-01-01

    Though some, at least, of the goals of automation systems for analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or at least likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, await practical methods of automation. Important as the clinical diagnostic applications are, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed.

  8. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude; cell populations can have variances that depend on their mean fluorescence intensities and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization and reduce the variation in the locations of cell populations across samples.
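
    Of the transformations compared, the generalized hyperbolic arcsine is the most common for cytometry data; a single cofactor controls its behavior near zero. A minimal sketch of the transform plus a crude likelihood-based cofactor search; normality of the transformed data is the modelling assumption here, loosely analogous to the paper's approach rather than a reimplementation of it:

        import numpy as np
        from scipy import stats

        def arcsinh_transform(x, cofactor):
            """Hyperbolic arcsine transform commonly used for flow cytometry."""
            return np.arcsinh(np.asarray(x) / cofactor)

        def best_cofactor(x, candidates=(5, 50, 150, 500, 1000)):
            """Pick the cofactor whose transformed data look most Gaussian
            (maximum normal log-likelihood)."""
            def loglik(c):
                y = arcsinh_transform(x, c)
                return stats.norm.logpdf(y, y.mean(), y.std()).sum()
            return max(candidates, key=loglik)

        raw = np.random.lognormal(mean=6.0, sigma=1.0, size=5000)  # synthetic channel
        print(best_cofactor(raw))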

  9. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude; cell populations can have variances that depend on their mean fluorescence intensities and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization and reduce the variation in the locations of cell populations across samples.

  10. Analysis of chemical constituents in medicinal plants of selected ...

    African Journals Online (AJOL)

    Analysis of chemical constituents in medicinal plants of selected districts of Pakhtoonkhwa, Pakistan. I Hussain, R Ullah, J Khan, N Khan, M Zahoor, N Ullah, MuR Khattak, FA Khan, A Baseer, M Khurram ...

  11. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Mahnken, A.H.; Kohnen, M.; Steinberg, S.; Wein, B.B.; Guenther, R.W.

    2001-01-01

    Development of software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after the filtering of digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, which were sorted by image quality ranging from one (best) to three (worst), with 10 images in each quality grade. Results: Image recognition strongly depended on image quality. In group one 52 and in group two 51 out of 60 vertebral bodies, including the sacrum, were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; therefore, standardized image quality and enlargement of the training data set are required. (orig.)

  12. Automated kidney morphology measurements from ultrasound images using texture and edge analysis

    Science.gov (United States)

    Ravishankar, Hariharan; Annangi, Pavan; Washburn, Michael; Lanning, Justin

    2016-04-01

    In a typical ultrasound scan, a sonographer measures kidney morphology to assess renal abnormalities. Kidney morphology can also help to discriminate between chronic and acute kidney failure. The caliper placements and volume measurements are often time consuming, and an automated solution will help to improve accuracy, repeatability and throughput. In this work, we developed an automated kidney morphology measurement solution from long-axis ultrasound scans. Automated kidney segmentation is challenging due to wide variability in kidney shape and size, weak contrast of the kidney boundaries and the presence of strong edges such as the diaphragm and fat layers. To address these challenges and accurately localize and detect kidney regions, we present a two-step algorithm that makes use of edge and texture information in combination with anatomical cues. First, we use an edge analysis technique to localize the kidney region by matching the edge map with predefined templates. To accurately estimate the kidney morphology, we use textural information in a machine learning framework using Haar features and a gradient boosting classifier. We have tested the algorithm on 45 unseen cases; the performance against ground truth is measured by computing the Dice overlap and the percentage error in the major and minor axes of the kidney. The algorithm shows successful performance on 80% of cases.
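
    The Dice overlap used for validation is essentially a one-liner on binary masks. A minimal sketch (the masks here are synthetic rectangles, purely for illustration):

        import numpy as np

        def dice_overlap(mask_a, mask_b):
            """Dice coefficient: 2|A ∩ B| / (|A| + |B|) for boolean masks."""
            a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        auto = np.zeros((100, 100), bool); auto[20:60, 30:70] = True
        truth = np.zeros((100, 100), bool); truth[25:65, 30:70] = True
        print(round(dice_overlap(auto, truth), 3))   # 0.875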

  13. Large-scale automated analysis of news media: a novel computational method for obesity policy research.

    Science.gov (United States)

    Hamad, Rita; Pomeranz, Jennifer L; Siddiqi, Arjumand; Basu, Sanjay

    2015-02-01

    Analyzing news media allows obesity policy researchers to understand popular conceptions about obesity, which is important for targeting health education and policies. A persistent dilemma is that investigators have to read and manually classify thousands of individual news articles to identify how obesity and obesity-related policy proposals may be described to the public in the media. A machine learning method called "automated content analysis," which permits researchers to train computers to "read" and classify massive volumes of documents, was demonstrated. 14,302 newspaper articles that mentioned the word "obesity" during 2011-2012 were identified. Four states that vary in obesity prevalence and policy (Alabama, California, New Jersey, and North Carolina) were examined. The reliability of an automated program to categorize the media's framing of obesity as an individual-level problem (e.g., diet) and/or an environmental-level problem (e.g., obesogenic environment) was tested. The automated program performed similarly to human coders. The proportion of articles with individual-level framing (27.7-31.0%) was higher than the proportion with neutral (18.0-22.1%) or environmental-level framing (16.0-16.4%) across all states and over the entire study period. The feasibility of large-scale automated analysis of news media was demonstrated. © 2014 The Obesity Society.
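
    Automated content analysis of this kind is commonly built as supervised text classification: human coders label a training subset, a model learns the framing categories, and the remainder is classified automatically. A minimal sketch with scikit-learn; the articles and labels are placeholders, and this is a generic pipeline, not the authors' program:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical hand-coded training articles.
        texts = ["new soda tax aims to curb obesity in schools",
                 "experts blame poor personal diet choices for obesity",
                 "city zoning fuels food deserts and obesity rates",
                 "lack of exercise and willpower drives weight gain"]
        labels = ["environmental", "individual", "environmental", "individual"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
        clf.fit(texts, labels)

        print(clf.predict(["column urges residents to eat less and move more"]))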

  14. Approach to analysis of single nucleotide polymorphisms by automated constant denaturant capillary electrophoresis

    International Nuclear Information System (INIS)

    Bjoerheim, Jens; Abrahamsen, Torveig Weum; Kristensen, Annette Torgunrud; Gaudernack, Gustav; Ekstroem, Per O.

    2003-01-01

    Melting gel techniques have proven to be amenable and powerful tools in point mutation and single nucleotide polymorphism (SNP) analysis. With the introduction of commercially available capillary electrophoresis instruments, a partly automated platform for denaturant capillary electrophoresis with potential for routine screening of selected target sequences has been established. The aim of this article is to demonstrate the use of automated constant denaturant capillary electrophoresis (ACDCE) in single nucleotide polymorphism analysis of various target sequences. Optimal analysis conditions for different single nucleotide polymorphisms on ACDCE are evaluated with the Poland algorithm. Laboratory procedures include only PCR and electrophoresis. For direct genotyping of individual SNPs, the samples are analyzed with an internal standard and the alleles are identified by co-migration of sample and standard peaks. In conclusion, SNPs suitable for melting gel analysis based on theoretical thermodynamics were separated by ACDCE under appropriate conditions. With this instrumentation (ABI 310 Genetic Analyzer), 48 samples could be analyzed without any intervention. Several institutions have capillary instrumentation in-house, thus making this SNP analysis method accessible to large groups of researchers without any need for instrument modification

  15. An automated system designed for large scale NMR data deposition and annotation: application to over 600 assigned chemical shift data entries to the BioMagResBank from the Riken Structural Genomics/Proteomics Initiative internal database

    International Nuclear Information System (INIS)

    Kobayashi, Naohiro; Harano, Yoko; Tochio, Naoya; Nakatani, Eiichi; Kigawa, Takanori; Yokoyama, Shigeyuki; Mading, Steve; Ulrich, Eldon L.; Markley, John L.; Akutsu, Hideo; Fujiwara, Toshimichi

    2012-01-01

    Biomolecular NMR chemical shift data are key information for the functional analysis of biomolecules and the development of new techniques for NMR studies utilizing chemical shift statistical information. Structural genomics projects are major contributors to the accumulation of protein chemical shift information. The management of the large quantities of NMR data generated by each project in a local database and the transfer of the data to the public databases are still formidable tasks because of the complicated nature of NMR data. Here we report an automated and efficient system developed for the deposition and annotation of a large number of data sets including 1H, 13C and 15N resonance assignments used for the structure determination of proteins. We have demonstrated the feasibility of our system by applying it to over 600 entries from the internal database generated by the RIKEN Structural Genomics/Proteomics Initiative (RSGI) to the public database, BioMagResBank (BMRB). We have assessed the quality of the deposited chemical shifts by comparing them with those predicted from the PDB coordinate entry for the corresponding protein. The same comparison for other matched BMRB/PDB entries deposited from 2001–2011 has been carried out, and the results suggest that the RSGI entries greatly improved the quality of the BMRB database. Since the entries include chemical shifts acquired under strikingly similar experimental conditions, these NMR data can be expected to be a promising resource to improve current technologies as well as to develop new NMR methods for protein studies.

  16. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    Science.gov (United States)

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically-evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA, Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. Firstly, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification as two artefact components, a noise component and the sought ECAP, based on theoretical and empirical considerations. The automated procedure was tested using 8 cochlear implant (CI) users and one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure successfully extracted the correct ECAPs, compared to the standard clinical forward-masking paradigm, in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers as is the case with regular methods. It is an alternative that does not have the possible bias of traditional artefact rejection methods such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure bears clinical relevance, for example as the artefact rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software. Copyright © 2014. Published by Elsevier B.V.
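
    The separation step itself maps onto standard ICA tooling. A minimal sketch with scikit-learn's FastICA, restricted to four sources as in the procedure above; the multichannel recording here is synthetic, and the source-classification step is only indicated by a comment:

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 1000)

        # Synthetic stand-ins: an ECAP-like response, two artefact shapes, noise.
        sources = np.c_[np.exp(-((t - 0.3) / 0.02) ** 2),        # "ECAP"
                        np.sign(np.sin(40 * np.pi * t)),         # artefact 1
                        np.exp(-t / 0.05),                       # artefact 2 (decay)
                        0.3 * rng.standard_normal(t.size)]       # noise

        mixing = rng.standard_normal((8, 4))                     # 8 recording electrodes
        recordings = sources @ mixing.T

        ica = FastICA(n_components=4, random_state=0)
        estimated = ica.fit_transform(recordings)                # (1000, 4) sources
        # Next step (not shown): classify the 4 sources and keep only the ECAP one.
        print(estimated.shape)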

  17. SAMPO 90 high resolution interactive gamma-spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1992-01-01

    SAMPO 90 is a high performance gamma-spectrum analysis program for personal computers. It uses color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control, or macros and programmable function keys can be used for completely automated measurement and analysis sequences including the control of MCAs and sample changers. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear and mixed mode fitting. Nuclide identification is done using associated lines techniques allowing interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. The analysis reports and program parameters are fully customizable. (author) 13 refs.; 1 fig

  18. Chemical analysis of cyanide in cyanidation process: review of methods

    International Nuclear Information System (INIS)

    Nova-Alonso, F.; Elorza-Rodriguez, E.; Uribe-Salas, A.; Perez-Garibay, R.

    2007-01-01

    In cyanidation, the worldwide method for precious metals recovery, the chemical analysis of cyanide is a very important but complex operation. Cyanide can be present forming different species, each of them with different stability, toxicity, analysis method and elimination technique. For cyanide analysis there exists a wide selection of analytical methods, but most of them present difficulties because of the interference of species present in the solution. This paper presents the different available methods for the chemical analysis of cyanide: titration, specific electrode and distillation, with special emphasis on the interference problem, with the aim of helping in the interpretation of the results. (Author)

  19. A method for the automated detection of phishing websites through both site characteristics and image analysis

    Science.gov (United States)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as the number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results that demonstrate the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot, and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as the resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
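
    The visual comparison described above can be reproduced with a simple perceptual hash. A minimal sketch of an average hash plus Hamming distance using Pillow and NumPy; the 8×8 hash size, the distance threshold and the file names are assumptions for illustration, not details from the paper:

        import numpy as np
        from PIL import Image

        def average_hash(path, size=8):
            """64-bit perceptual hash: downscale, grayscale, threshold at the mean."""
            img = Image.open(path).convert("L").resize((size, size))
            pixels = np.asarray(img, dtype=float)
            return (pixels > pixels.mean()).flatten()

        def hamming(h1, h2):
            """Number of differing bits between two hashes."""
            return int(np.count_nonzero(h1 != h2))

        # Hypothetical screenshots of a legitimate page and a suspected spoof.
        d = hamming(average_hash("bank_login.png"), average_hash("suspect_page.png"))
        print("likely visual clone" if d <= 5 else "visually distinct", d)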

  20. Automated Classification and Analysis of Non-metallic Inclusion Data Sets

    Science.gov (United States)

    Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.

    2018-05-01

    The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced from automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel, and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions in two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce the inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
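
    The PCA-plus-clustering workflow is easy to sketch on a composition table whose rows are inclusions and columns are oxide fractions. A minimal example with scikit-learn; the compositions are made up, and the paper's exact preprocessing and cluster count are not reproduced:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        # Hypothetical inclusion compositions: columns MgO, Al2O3, CaO (wt. fractions).
        X = np.array([[0.25, 0.70, 0.05], [0.28, 0.68, 0.04],   # spinel-like
                      [0.05, 0.55, 0.40], [0.04, 0.50, 0.46],   # calcium aluminate-like
                      [0.15, 0.60, 0.25], [0.14, 0.62, 0.24]])  # mixed oxide

        Xs = StandardScaler().fit_transform(X)
        scores = PCA(n_components=2).fit_transform(Xs)   # coordinates for the 2D plot
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(Xs)

        for pt, lab in zip(scores.round(2), labels):
            print(pt, "-> cluster", lab)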

  1. Development and Applications of a Prototypic SCALE Control Module for Automated Burnup Credit Analysis

    International Nuclear Information System (INIS)

    Gauld, I.C.

    2001-01-01

    Consideration of the depletion phenomena and isotopic uncertainties in burnup-credit criticality analysis places an increasing reliance on computational tools and significantly increases the overall complexity of the calculations. An automated analysis and data management capability is essential for practical implementation of large-scale burnup credit analyses that can be performed in a reasonable amount of time. STARBUCS is a new prototypic analysis sequence being developed for the SCALE code system to perform automated criticality calculations of spent fuel systems employing burnup credit. STARBUCS is designed to help analyze the dominant burnup credit phenomena including spatial burnup gradients and isotopic uncertainties. A search capability also allows STARBUCS to iterate to determine the spent fuel parameters (e.g., enrichment and burnup combinations) that result in a desired k_eff for a storage configuration. Although STARBUCS was developed to address the analysis needs for spent fuel transport and storage systems, it provides sufficient flexibility to allow virtually any configuration of spent fuel to be analyzed, such as storage pools and reprocessing operations. STARBUCS has been used extensively at Oak Ridge National Laboratory (ORNL) to study burnup credit phenomena in support of the NRC Research program

  2. Chemical composition analysis and authentication of whisky.

    Science.gov (United States)

    Wiśniewska, Paulina; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek

    2015-08-30

    Whisky (whiskey) is one of the most popular spirit-based drinks, made from malted or saccharified grains, which should mature for at least 3 years in wooden barrels. The high popularity of a product usually carries a potential risk of adulteration; thus authenticity assessment is one of the key elements of food product marketing. Authentication of whisky is based on comparing the composition of this alcohol with that of other spirit drinks. The present review summarizes the available information on the comparison of whisky with other alcoholic beverages, the identification of whisky type, the assessment of whisky quality and, finally, the authentication of whisky. The article also presents the various techniques used for analyzing whisky, such as gas and liquid chromatography with different types of detectors (FID, AED, UV-Vis), electronic nose, atomic absorption spectroscopy and mass spectrometry. In some cases the application of chemometric methods is also described, namely PCA, DFA, LDA, ANOVA, SIMCA, PNN, k-NN and CA, as well as preparation techniques such as SPME or SPE. © 2014 Society of Chemical Industry.

  3. Analysis of blood spots for polyfluoroalkyl chemicals

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Kayoko; Wanigatunga, Amal A.; Needham, Larry L. [Division of Laboratory Sciences, National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, GA (United States); Calafat, Antonia M., E-mail: acalafat@cdc.gov [Division of Laboratory Sciences, National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, GA (United States)

    2009-12-10

    Polyfluoroalkyl chemicals (PFCs) have been detected in humans, in the environment, and in ecosystems around the world. The potential for developmental and reproductive toxicities of some PFCs is of concern especially to children's health. In the United States, a sample of a baby's blood, called a 'dried blood spot' (DBS), is obtained from a heel stick within 48 h of a child's birth. DBS could be useful for assessing prenatal exposure to PFCs. We developed a method based on online solid phase extraction coupled with high performance liquid chromatography-isotope dilution tandem mass spectrometry for measuring four PFCs in DBS: perfluorooctane sulfonate (PFOS), perfluorohexane sulfonate, perfluorooctanoate (PFOA), and perfluorononanoate. The analytical limits of detection using one whole DBS (≈75 µL of blood) were <0.5 ng mL⁻¹. To validate the method, we analyzed 98 DBS collected in May 2007 in the United States. PFOS and PFOA were detected in all DBS at concentrations in the low ng mL⁻¹ range. These data suggest that DBS may be a suitable matrix for assessing perinatal exposure to PFCs, but additional information related to sampling and specimen storage is needed to demonstrate the utility of these measures for assessing exposure.

  4. Droplet microfluidics in (bio) chemical analysis

    Czech Academy of Sciences Publication Activity Database

    Basova, E. Y.; Foret, František

    2015-01-01

    Vol. 140, No. 1 (2015), pp. 22-38 ISSN 0003-2654 R&D Projects: GA ČR(CZ) GBP206/12/G014 Institutional support: RVO:68081715 Keywords: droplet chemistry * bio analysis * microfluidics * protein Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.033, year: 2015

  5. Chemical aspects of nuclear methods of analysis

    International Nuclear Information System (INIS)

    1985-01-01

    This final report includes papers which fall into three general areas: development of practical pre-analysis separation techniques, uranium/thorium separation from other elements for analytical and processing operations, and theory and mechanism of separation techniques. A separate abstract was prepared for each of the 9 papers

  6. Chemical composition, antimicrobial activity, proximate analysis and ...

    African Journals Online (AJOL)

    Detarium senegalense JF Gmelin (Caesalpiniaceae), commonly known as tallow tree, is used traditionally for the treatment of bronchitis, pneumonia, internal complaints and skin diseases in Tropical Africa. The seed is used as a soup thickener in Eastern Nigeria. Analysis of the petroleum ether extract of the seeds with ...

  7. Performance analysis of automated evaluation of Crithidia luciliae-based indirect immunofluorescence tests in a routine setting - strengths and weaknesses.

    Science.gov (United States)

    Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C

    2017-11-27

    Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is one of the assays considered to be the best choice. To overcome the drawback of subjective result interpretation that is inherent in indirect immunofluorescence assays in general, automated systems have been introduced into the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday life of a diagnostic laboratory. Three hundred and twelve consecutive samples, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis, were collected over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed, and automated image classification gives a reliable recommendation for assay evaluation to the operator. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.
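
    The agreement figures above come from a standard 2×2 comparison against the visual read. A minimal sketch of the calculation; the counts below are invented purely to show the arithmetic:

        def diagnostic_performance(tp, fp, fn, tn):
            """Overall agreement, sensitivity and specificity from 2x2 counts,
            treating the visual read as the reference."""
            total = tp + fp + fn + tn
            return {"agreement":   (tp + tn) / total,
                    "sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp)}

        # Hypothetical counts for 312 samples.
        print(diagnostic_performance(tp=80, fp=16, fn=5, tn=211))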

  8. Bacterial growth on surfaces: Automated image analysis for quantification of growth rate-related parameters

    DEFF Research Database (Denmark)

    Møller, S.; Sternberg, Claus; Poulsen, L. K.

    1995-01-01

    species-specific hybridizations with fluorescence-labelled ribosomal probes to estimate the single-cell concentration of RNA. By automated analysis of digitized images of stained cells, we determined four independent growth rate-related parameters: cellular RNA and DNA contents, cell volume, and the frequency of dividing cells in a cell population. These parameters were used to compare physiological states of liquid-suspended and surface-growing Pseudomonas putida KT2442 in chemostat cultures. The major finding is that the correlation between substrate availability and cellular growth rate found...

  9. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...
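
    A core FMEA artifact is the worksheet row that scores each failure mode and ranks it by risk priority number (RPN = severity × occurrence × detection), a standard FMEA practice. A minimal sketch of that data structure and ranking; the hydraulic failure modes listed are generic illustrations, not output of the tool described:

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            component: str
            mode: str
            severity: int    # 1-10
            occurrence: int  # 1-10
            detection: int   # 1-10 (10 = hardest to detect)

            @property
            def rpn(self):
                """Risk priority number = severity x occurrence x detection."""
                return self.severity * self.occurrence * self.detection

        modes = [FailureMode("pump", "cavitation", 7, 4, 5),
                 FailureMode("relief valve", "stuck closed", 9, 2, 7),
                 FailureMode("hose", "external leak", 5, 6, 3)]

        for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
            print(f"{fm.component}: {fm.mode} (RPN={fm.rpn})")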

  10. Secure automated fabrication (SAF). Phase I interim report: a systems analysis

    International Nuclear Information System (INIS)

    1979-01-01

    An advanced Secure Automated Fabrication (SAF) System is being developed for mixed uranium and plutonium fuel fabrication. SAF System development will ultimately result in systems which maximize personnel radiation protection, restrict and control access to SNM material, improve containment and detection systems for nuclear materials, provide adequate SNM accountability and improve product uniformity and quality. A systems requirement analysis study was initiated to establish a consistent and objective set of requirements within which the choice among alternatives represents the balanced viewpoints of performance, achievability and risk.

  11. Chemical analysis developments for fusion materials studies

    International Nuclear Information System (INIS)

    McCown, J.J.; Baldwin, D.L.; Keough, R.F.; Van der Cook, B.P.

    1985-04-01

    Several projects at Hanford under the management of the Westinghouse Hanford Company have involved research and development (R and D) on fusion materials. They include work on the Fusion Materials Irradiation Test Facility and its associated Experimental Lithium System; testing of irradiated lithium compounds as breeding materials; and testing of Li and Li-Pb alloy reactions with various atmospheres, concrete, and other reactor materials for fusion safety studies. In the course of these projects, a number of interesting and challenging analytical chemistry problems were encountered. They include sampling and analysis of lithium while adding and removing elements of interest; sampling, assaying and compound identification efforts on filters, aerosol particles and fire residues; development of dissolution and analysis techniques for measuring tritium and helium in lithium ceramics including oxides, aluminates, silicates and zirconates. An overview of the analytical chemistry development problems plus equipment and procedures used will be presented

  12. Chemical analysis of rare earth elements

    International Nuclear Information System (INIS)

    Tsukahara, Ryoichi; Sakoh, Takefumi; Nagai, Iwao

    1994-01-01

    Recently attention has been paid to ICP-AES and ICP-MS, and the number of reports on the analysis of rare earth elements utilizing these methods continues to increase. Such reports have come to account for about 30% of the reports on rare earth analysis, because these methods are highly sensitive to rare earth elements and have spread widely. In ICP-AES and ICP-MS, mostly solution samples are measured; therefore, solids must be brought into solution. When quantitatively determining rare earth elements at low concentration, separation and preconcentration are necessary. Referring to the literature reported in part in 1990 and from 1991 to 1993, the progress of ICP-AES and ICP-MS is reviewed. Rare earth oxides and the alloys containing rare earth elements are easily decomposed with acids, but the decomposition of rocks is difficult, and methods for it are discussed. For the separation of rare earth elements from other elements in geochemical samples, cation exchange is frequently utilized, and solvent extraction has also been studied. For the mutual separation of the rare earth elements, chromatography is used. Spectral interference in spectral analysis was also studied, and the comparison of these methods with other methods is reported. (K.I)

  13. An Automated Approach to Syntax-based Analysis of Classical Latin

    Directory of Open Access Journals (Sweden)

    Anjalie Field

    2016-12-01

    Full Text Available The goal of this study is to present an automated method for analyzing the style of Latin authors. Many of the common automated methods in stylistic analysis are based on lexical measures, which do not work well with Latin because of the language’s high degree of inflection and free word order. In contrast, this study focuses on analysis at a syntax level by examining two constructions, the ablative absolute and the cum clause. These constructions are often interchangeable, which suggests an author’s choice of construction is typically more stylistic than functional. We first identified these constructions in hand-annotated texts. Next we developed a method for identifying the constructions in unannotated texts, using probabilistic morphological tagging. Our methods identified constructions with enough accuracy to distinguish among different genres and different authors. In particular, we were able to determine which book of Caesar’s Commentarii de Bello Gallico was not written by Caesar. Furthermore, the usage of ablative absolutes and cum clauses observed in this study is consistent with the usage scholars have observed when analyzing these texts by hand. The proposed methods for an automatic syntax-based analysis are shown to be valuable for the study of classical literature.

  14. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated Laser-Induced Breakdown Spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum analysis software, developed in C#. The platform provides flexible interfacing and automated control; it is compatible with different manufacturer component models and is constructed in modularized form for easy expandability. For peak identification, a more robust method with improved stability has been achieved by applying additional smoothing to the calculated slope before peaks are identified. For element identification, an improved main-lines analysis method, which checks all candidate elements for each spectral peak to avoid the omission of elements without strong spectral lines, is applied to the tested LIBS samples. This method also increases the identification speed. In this paper, actual applications have been carried out. According to tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control components to acquire experimental data and conduct filtering, peak identification and qualitative analysis, etc., on spectral data. (paper)
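
    The smoothing-before-peak-identification idea is easy to illustrate: smooth the first derivative of the spectrum, then mark peaks at positive-to-negative zero crossings. A minimal sketch in NumPy (kept in Python, the single example language used here, although the system described is written in C#); the window width, threshold and synthetic spectrum are arbitrary placeholder choices, not the program's settings:

        import numpy as np

        def find_peaks_smoothed_slope(spectrum, window=9, min_height=0.1):
            """Locate peaks as positive-to-negative zero crossings of the
            moving-average-smoothed first derivative."""
            slope = np.diff(spectrum)
            kernel = np.ones(window) / window
            slope = np.convolve(slope, kernel, mode="same")   # extra smoothing step
            crossings = np.where((slope[:-1] > 0) & (slope[1:] <= 0))[0] + 1
            return [i for i in crossings if spectrum[i] >= min_height]

        x = np.linspace(0, 10, 2000)
        spectrum = (np.exp(-((x - 3) / 0.05) ** 2)
                    + 0.6 * np.exp(-((x - 7) / 0.05) ** 2)
                    + 0.02 * np.random.randn(x.size))
        print(find_peaks_smoothed_slope(spectrum))   # indices near 600 and 1400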

  15. GapCoder automates the use of indel characters in phylogenetic analysis.

    Science.gov (United States)

    Young, Nelson D; Healy, John

    2003-02-19

    Several ways of incorporating indels into phylogenetic analysis have been suggested. Simple indel coding has two strengths: (1) biological realism and (2) efficiency of analysis. In this method, each indel with different start and/or end positions is considered to be a separate character. The presence/absence of these indel characters is then added to the data set. We have written a program, GapCoder, to automate this procedure. The program can input PIR-format aligned datasets, find the indels and add the indel-based characters. The output is a NEXUS-format file, which includes a table showing what region each indel character is based on. If regions are excluded from analysis, this table makes it easy to identify the corresponding indel characters for exclusion. Manual implementation of the simple indel coding method can be very time-consuming, especially in data sets where indels are numerous and/or overlapping. GapCoder automates this method and is therefore particularly useful during procedures where phylogenetic analyses need to be repeated many times, such as when different alignments are being explored or when various taxon or character sets are being explored. GapCoder is currently available for Windows from http://www.home.duq.edu/~youngnd/GapCoder.
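
    Simple indel coding reduces to finding each gap's start/end coordinates in the alignment and emitting one presence/absence character per distinct gap. A toy sketch of that idea in Python, a simplification of the published method: it treats every distinct start-end pair as a character and ignores the inapplicable-state rules for nested gaps:

        import re

        def simple_indel_characters(alignment):
            """Return sorted distinct gap coordinates and a 0/1 matrix
            (taxa x indel characters) marking which taxa share each gap."""
            gaps_per_taxon = {name: {m.span() for m in re.finditer(r"-+", seq)}
                              for name, seq in alignment.items()}
            characters = sorted(set().union(*gaps_per_taxon.values()))
            matrix = {name: [int(c in gaps) for c in characters]
                      for name, gaps in gaps_per_taxon.items()}
            return characters, matrix

        aln = {"taxonA": "ATG--CCGTA",
               "taxonB": "ATG--CC-TA",
               "taxonC": "ATGTTCCGTA"}
        chars, matrix = simple_indel_characters(aln)
        print(chars)    # [(3, 5), (7, 8)]
        print(matrix)   # {'taxonA': [1, 0], 'taxonB': [1, 1], 'taxonC': [0, 0]}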

  16. Automation process for morphometric analysis of volumetric CT data from pulmonary vasculature in rats.

    Science.gov (United States)

    Shingrani, Rahul; Krenz, Gary; Molthen, Robert

    2010-01-01

    With advances in medical imaging scanners, it has become commonplace to generate large multidimensional datasets. These datasets require tools for rapid, thorough analysis. To address this need, we have developed an automated algorithm for morphometric analysis incorporating A Visualization Workshop computational and image processing libraries for three-dimensional segmentation, vascular tree generation and structural hierarchical ordering, with a two-stage numeric optimization procedure for estimating vessel diameters. We combine this new technique with our mathematical models of pulmonary vascular morphology to quantify structural and functional attributes of lung arterial trees. Our physiological studies require repeated measurements of vascular structure to determine differences in vessel biomechanical properties between animal models of pulmonary disease. Automation provides many advantages, including significantly improved speed and minimized operator interaction and biasing. The results are validated by comparison with previously published rat pulmonary arterial micro-CT data analysis techniques, in which vessels were manually mapped and measured using intensive operator intervention. Published by Elsevier Ireland Ltd.

  17. Correlation of maple sap composition with bacterial and fungal communities determined by multiplex automated ribosomal intergenic spacer analysis (MARISA).

    Science.gov (United States)

    Filteau, Marie; Lagacé, Luc; LaPointe, Gisèle; Roy, Denis

    2011-08-01

    During collection, maple sap is contaminated by bacteria and fungi that subsequently colonize the tubing system. The bacterial microbiota has been more thoroughly characterized than the fungal microbiota, but the impact of both components on maple sap quality remains unclear. This study focused on identifying bacterial and fungal members of maple sap and correlating microbiota composition with maple sap properties. A multiplex automated ribosomal intergenic spacer analysis (MARISA) method was developed to presumptively identify bacterial and fungal members of maple sap samples collected from 19 production sites during the tapping period. Results indicate that the fungal community of maple sap is mainly composed of yeasts related to Mrakia sp., Mrakiella sp., Guehomyces pullulans, Cryptococcus victoriae and Williopsis saturnus. Mrakia, Mrakiella and Guehomyces peaks were identified in samples from all production sites and can be considered dominant and stable members of the fungal microbiota of maple sap. A multivariate analysis based on MARISA profiles and maple sap chemical composition data showed correlations between Candida sake, Janthinobacterium lividum, Williopsis sp., Leuconostoc mesenteroides, Mrakia sp., Rhodococcus sp., Pseudomonas tolaasii, G. pullulans and maple sap composition at different flow periods. This study provides new insights into the relationship between microbial community and maple sap quality. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods having an established position in the traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method has led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised on software level, which has aroused the urge to apply the FMEA methodology also on software based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on dynamic behaviour of the application. These facts set special requirements on the FMEA of software based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for the use of reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  19. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    Science.gov (United States)

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%. In the two swine manure samples and one soil sample, 21.6, 12.3, and 2.5% of the cells, respectively, were positive with an archaeal probe (S-D-Arch-0915-a-A-20). Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
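
    The clustering step lends itself to a compact illustration. Below is a minimal, self-contained fuzzy c-means implementation in Python applied to synthetic per-cell intensities; the two-cluster setup, the fuzzifier m = 2 and the 0.5 membership cut-off are illustrative assumptions. The study's actual pipeline ran in Visilog and applied a manual quality control to the resulting clusters.

    ```python
    import numpy as np

    def fuzzy_cmeans(x, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
        """Minimal fuzzy c-means for 1-D data (e.g. per-cell FISH intensities)."""
        rng = np.random.default_rng(seed)
        u = rng.random((c, x.size))
        u /= u.sum(axis=0)                      # membership columns sum to 1
        for _ in range(n_iter):
            um = u ** m
            centers = um @ x / um.sum(axis=1)   # membership-weighted centers
            d = np.abs(x[None, :] - centers[:, None]) + 1e-12
            u_new = d ** (-2.0 / (m - 1.0))     # standard FCM membership update
            u_new /= u_new.sum(axis=0)
            if np.abs(u_new - u).max() < tol:
                u = u_new
                break
            u = u_new
        return centers, u

    # Toy bimodal intensities: background (~50) vs hybridized cells (~200).
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(50, 10, 300), rng.normal(200, 30, 200)])
    centers, u = fuzzy_cmeans(x)
    positive = u[np.argmax(centers)] > 0.5      # cells assigned to the brighter cluster
    print(f"{positive.mean():.1%} classified as probe-positive")
    ```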

  20. Advanced chemical analysis service for elements, radionuclides and phases

    International Nuclear Information System (INIS)

    Sansoni, B.

    1986-01-01

    A review is given of the structure, organisation and performance of the chemical analysis service of the Central Department for Chemical Analysis at the Kernforschungsanlage Juelich GmbH. The research and development programmes, together with the infrastructure of the Centre, make it possible to analyse almost all stable elements of the periodic table in almost any material. The corresponding chemical analysis service has been organised according to a new modular system of analytical steps. According to this, the most complicated and therefore most general case of an analytical scheme for element and radionuclide analysis in any type of material can be differentiated into about 14 different steps, the modules. They are more or less independent of the specific problem. The laboratory is designed and organised according to these steps. (orig./PW) [de

  1. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Understanding Autonomy in Future Systems.

    Science.gov (United States)

    Schaefer, Kristin E; Chen, Jessie Y C; Szalma, James L; Hancock, P A

    2016-05-01

    We used meta-analysis to assess research concerning human trust in automation to understand the foundation upon which future autonomous systems can be built. Trust is increasingly important given the growing need for synergistic human-machine teaming. Thus, we expand on our previous meta-analytic foundation in the field of human-robot interaction to include all of automation interaction. We used meta-analysis to assess trust in automation. Thirty studies provided 164 pairwise effect sizes, and 16 studies provided 63 correlational effect sizes. The overall effect size of all factors on trust development was ḡ = +0.48, and the correlational effect was r̄ = +0.34, each of which represented a medium effect. Moderator effects were observed for the human-related (ḡ = +0.49; r̄ = +0.16) and automation-related (ḡ = +0.53; r̄ = +0.41) factors. Moderator effects specific to environmental factors proved insufficient in number to calculate at this time. Findings provide a quantitative representation of factors influencing the development of trust in automation as well as identify additional areas of needed empirical research. This work has important implications for the enhancement of current and future human-automation interaction, especially in high-risk or extreme performance environments. © 2016, Human Factors and Ergonomics Society.
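
    For reference, ḡ here denotes a mean standardized mean difference; the abstract does not spell out the estimator, but ḡ conventionally refers to the mean Hedges' g, which for a single two-group comparison is

    ```latex
    g \;=\; J \,\frac{\bar{X}_1 - \bar{X}_2}{s_p},
    \qquad
    s_p \;=\; \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}},
    \qquad
    J \;\approx\; 1 - \frac{3}{4(n_1 + n_2 - 2) - 1},
    ```

    where J is the small-sample bias correction applied to Cohen's d. The correlational effects (r̄) are mean Pearson correlations.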

  2. Chemical kinetic functional sensitivity analysis: Elementary sensitivities

    International Nuclear Information System (INIS)

    Demiralp, M.; Rabitz, H.

    1981-01-01

    Sensitivity analysis is considered for kinetics problems defined in the space-time domain. This extends an earlier temporal Green's function method to handle calculations of elementary functional sensitivities δu_i/δα_j, where u_i is the ith species concentration and α_j is the jth system parameter. The system parameters include rate constants, diffusion coefficients, initial conditions, boundary conditions, or any other well-defined variables in the kinetic equations. These parameters are generally considered to be functions of position and/or time. Derivations of the governing equations for the sensitivities and the Green's function are presented. The physical interpretation of the Green's function and sensitivities is given, along with a discussion of the relation of this work to earlier research
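
    The abstract does not display the defining relation, but in standard notation the elementary functional sensitivity, and a generic reaction-diffusion kinetic equation of the kind it applies to, can be written as follows (a sketch assuming the usual form of such models):

    ```latex
    s_{ij}(x,t;\,x',t') \;=\; \frac{\delta u_i(x,t)}{\delta \alpha_j(x',t')},
    \qquad
    \frac{\partial u_i}{\partial t} \;=\; \nabla \cdot \big( D_i \,\nabla u_i \big) \;+\; f_i(\mathbf{u};\,\boldsymbol{\alpha}).
    ```

    Functional differentiation of the kinetic equation with respect to α_j(x',t') yields a linear equation for s_ij, whose solution can be expressed through the Green's function of the linearized kinetic operator.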

  3. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver [Technical Univ. of Darmstadt (Germany)

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework and the visualization have been integrated with MATLAB, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to integrate their analysis directly with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle

  4. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    International Nuclear Information System (INIS)

    Ruebel, Oliver

    2009-01-01

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework and the visualization have been integrated with MATLAB, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to integrate their analysis directly with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle

  5. Handbook of Basic Tables for Chemical Analysis. Final report

    International Nuclear Information System (INIS)

    Bruno, T.J.; Svoronos, P.D.N.

    1988-04-01

    This work began as a slim booklet prepared by one of the authors (TJB) to accompany a course on chemical instrumentation presented at the National Bureau of Standards, Boulder Laboratories. The booklet contained tables on chromatography, spectroscopy, and chemical (wet) methods, and was intended to provide the students with enough basic data to design their own analytical methods and procedures. Shortly thereafter, with the co-authorship of Prof. Paris D. N. Svoronos, it was expanded into a more extensive compilation entitled Basic Tables for Chemical Analysis, published as National Bureau of Standards Technical Note 1096. That work has now been expanded and updated into the present body of tables. Although there have been considerable changes since the first version of these tables, the aim has remained essentially the same. The authors have tried to provide a single source of information for those practicing scientists and research students who must use various aspects of chemical analysis in their work. In this respect, it is geared less toward the researcher in analytical chemistry than toward those practitioners in other chemical disciplines who must make routine use of chemical analysis

  6. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to that of images with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis
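
    The detection step is a classical supervised pixel classifier. The following sketch shows the general pattern with scikit-learn; the four-dimensional feature vector, the random stand-in data and the 0.5 threshold are illustrative assumptions, not the paper's actual features or kNN settings.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    # Stand-in training set: one feature vector per pixel (e.g. intensity and
    # local texture statistics) with labels 1 = foreign object, 0 = background.
    X_train = rng.random((5000, 4))
    y_train = (X_train[:, 0] > 0.8).astype(int)

    knn = KNeighborsClassifier(n_neighbors=15)
    knn.fit(X_train, y_train)

    # Per-pixel probability of belonging to a projected foreign object;
    # thresholding and grouping these pixels yields candidate segments.
    X_pixels = rng.random((512 * 512, 4))
    p_object = knn.predict_proba(X_pixels)[:, 1].reshape(512, 512)
    mask = p_object > 0.5
    ```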

  7. Automated Analysis of Human Sperm Number and Concentration (Oligospermia) Using Otsu Threshold Method and Labelling

    Science.gov (United States)

    Susrama, I. G.; Purnama, K. E.; Purnomo, M. H.

    2016-01-01

    Oligospermia is a male fertility issue defined as a low sperm concentration in the ejaculate. Normally the sperm concentration is 20-120 million/ml, while oligospermia patients have a sperm concentration of less than 20 million/ml. Sperm tests are done in the fertility laboratory to determine oligospermia by checking fresh semen according to the 2010 WHO standards [9]. The sperm are viewed under a microscope using a Neubauer improved counting chamber and counted manually. To count them automatically, this research developed an automated system to analyse and count the sperm concentration, called Automated Analysis of Sperm Concentration Counters (A2SC2), using Otsu threshold segmentation and morphological processing. The sperm data used were fresh samples from 10 people, analysed directly in the laboratory. The test results using the A2SC2 method showed an accuracy of 91%. Thus, in this study, A2SC2 can be used to calculate the number and concentration of sperm automatically
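
    Otsu thresholding followed by connected-component labelling is easy to sketch with OpenCV. Everything below is a hypothetical reconstruction of that generic pipeline: the file name, the minimum-area filter and the imaged volume are assumptions, not values from the paper.

    ```python
    import cv2

    # Hypothetical micrograph of a counting-chamber field.
    img = cv2.imread("sperm_field.png", cv2.IMREAD_GRAYSCALE)

    # Otsu's method picks the global threshold automatically; sperm heads are
    # assumed darker than the background, hence THRESH_BINARY_INV.
    blur = cv2.GaussianBlur(img, (5, 5), 0)
    _, binary = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Morphological opening removes small debris before labelling.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    clean = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

    # Connected-component labelling; each surviving component counts as one cell.
    n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(clean)
    min_area = 30                                   # assumed minimum head area, px
    count = sum(1 for i in range(1, n_labels)
                if stats[i, cv2.CC_STAT_AREA] >= min_area)

    # Concentration follows from the counting-chamber geometry (volume per field).
    volume_ml = 1e-7                                # hypothetical imaged volume
    print(f"{count} sperm, about {count / volume_ml / 1e6:.1f} million/ml")
    ```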

  8. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on the reduction of benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than increasing the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods best serves CAIS accuracy. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.

  9. Quantification of sterol-specific response in human macrophages using automated image-based analysis.

    Science.gov (United States)

    Gater, Deborah L; Widatalla, Namareq; Islam, Kinza; AlRaeesi, Maryam; Teo, Jeremy C M; Pearson, Yanthe E

    2017-12-13

    The transformation of normal macrophage cells into lipid-laden foam cells is an important step in the progression of atherosclerosis. One major contributor to foam cell formation in vivo is the intracellular accumulation of cholesterol. Here, we report the effects of various combinations of low-density lipoprotein, sterols, lipids and other factors on human macrophages, using an automated image analysis program to quantitatively compare single cell properties, such as cell size and lipid content, in different conditions. We observed that the addition of cholesterol caused an increase in average cell lipid content across a range of conditions. All of the sterol-lipid mixtures examined were capable of inducing increases in average cell lipid content, with variations in the distribution of the response, in cytotoxicity and in how the sterol-lipid combination interacted with other activating factors. For example, cholesterol and lipopolysaccharide acted synergistically to increase cell lipid content while also increasing cell survival compared with the addition of lipopolysaccharide alone. Additionally, ergosterol and cholesteryl hemisuccinate caused similar increases in lipid content but also exhibited considerably greater cytotoxicity than cholesterol. The use of automated image analysis enables us to assess not only changes in average cell size and content, but also to rapidly and automatically compare population distributions based on simple fluorescence images. Our observations add to increasing understanding of the complex and multifactorial nature of foam-cell formation and provide a novel approach to assessing the heterogeneity of macrophage response to a variety of factors.
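
    Per-cell measurements of size and stain intensity, as used here, follow a standard label-and-measure pattern. The sketch below uses scikit-image on synthetic arrays; the two-channel setup, the blob segmentation and the specific measurements are assumptions standing in for the authors' program, not a reconstruction of it.

    ```python
    import numpy as np
    from skimage import data, measure

    rng = np.random.default_rng(0)

    # Synthetic blobs stand in for segmented macrophages; a second channel
    # stands in for the lipid stain whose intensity is measured per cell.
    mask = data.binary_blobs(length=512, blob_size_fraction=0.05,
                             volume_fraction=0.1, seed=0)
    lipid_img = rng.random((512, 512))

    labels = measure.label(mask)
    props = measure.regionprops(labels, intensity_image=lipid_img)

    sizes = np.array([p.area for p in props])
    lipid = np.array([p.mean_intensity for p in props])

    # Whole distributions, not just means, can be compared across conditions.
    print(f"{len(props)} cells; median area {np.median(sizes):.0f} px, "
          f"median lipid intensity {np.median(lipid):.3f}")
    ```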

  10. Scaling up Ecological Measurements of Coral Reefs Using Semi-Automated Field Image Collection and Analysis

    Directory of Open Access Journals (Sweden)

    Manuel González-Rivero

    2016-01-01

    Ecological measurements in marine settings are often constrained in space and time, with spatial heterogeneity obscuring broader generalisations. While advances in remote sensing, integrative modelling and meta-analysis enable generalisations from field observations, there is an underlying need for high-resolution, standardised and geo-referenced field data. Here, we evaluate a new approach aimed at optimising data collection and analysis to assess broad-scale patterns of coral reef community composition using automatically annotated underwater imagery, captured along 2 km transects. We validate this approach by investigating its ability to detect spatial (e.g., across regions) and temporal (e.g., over years) change, and by comparing automated annotation errors to those of multiple human annotators. Our results indicate that change of coral reef benthos can be captured at high resolution both spatially and temporally, with an average error below 5% among key benthic groups. Cover estimation errors using automated annotation varied between 2% and 12%, slightly larger than human errors (which varied between 1% and 7%), but small enough to detect significant changes among dominant groups. Overall, this approach allows rapid collection of in-situ observations at larger spatial scales (km) than previously possible, and provides a pathway to link, calibrate, and validate broader analyses across even larger spatial scales (10–10,000 km²).

  11. Automated flow cytometric analysis across large numbers of samples and cell types.

    Science.gov (United States)

    Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno

    2015-04-01

    Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
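
    FlowGM's core steps (fit Gaussian mixtures, pick the cluster count by BIC, then assign events) can be sketched with scikit-learn. The marker count, the 2-14 search range and the random data below are assumptions for illustration; FlowGM itself adds meta-clustering against a reference donor, which is not shown.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Hypothetical compensated flow cytometry matrix: events x markers.
    X = rng.random((5000, 8))

    # Fit GMMs over a range of cluster counts and keep the one with the
    # lowest Bayesian Information Criterion.
    models = {k: GaussianMixture(n_components=k, covariance_type="full",
                                 random_state=0).fit(X)
              for k in range(2, 15)}
    best_k = min(models, key=lambda k: models[k].bic(X))
    labels = models[best_k].predict(X)
    print(f"BIC selects {best_k} clusters")
    ```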

  12. Airborne chemistry: acoustic levitation in chemical analysis.

    Science.gov (United States)

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitating small volumes of sample avoids solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 μL volume range and additions to the levitated drop can be made in the pL volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right-angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  13. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not sufficient to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase the fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients (denoted % S here) were obtained on Dowex 50W as a function of HCl concentration by a batch method. These % S data are utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl⁻ on Dowex 50W was examined as a function of HCl concentration and found to be decreasing while the % S of the rare earths increases. It is interpreted that Cl⁻ and rare earth ions move into the resin phase separately and that the charge and the charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations

  14. Automated retinofugal visual pathway reconstruction with multi-shell HARDI and FOD-based analysis.

    Science.gov (United States)

    Kammen, Alexandra; Law, Meng; Tjan, Bosco S; Toga, Arthur W; Shi, Yonggang

    2016-01-15

    Diffusion MRI tractography provides a non-invasive modality to examine the human retinofugal projection, which consists of the optic nerves, optic chiasm, optic tracts, the lateral geniculate nuclei (LGN) and the optic radiations. However, the pathway has several anatomic features that make it particularly challenging to study with tractography, including its location near blood vessels and the bone-air interface at the base of the cerebrum, crossing fibers at the chiasm, a somewhat tortuous course around the temporal horn via Meyer's Loop, and multiple closely neighboring fiber bundles. To date, these unique complexities of the visual pathway have impeded the development of a robust and automated reconstruction method using tractography. To overcome these challenges, we develop a novel, fully automated system to reconstruct the retinofugal visual pathway from high-resolution diffusion imaging data. Using multi-shell, high angular resolution diffusion imaging (HARDI) data, we reconstruct precise fiber orientation distributions (FODs) with high order spherical harmonics (SPHARM) to resolve fiber crossings, which allows the tractography algorithm to successfully navigate the complicated anatomy surrounding the retinofugal pathway. We also develop automated algorithms for the identification of ROIs used for fiber bundle reconstruction. In particular, we develop a novel approach to extract the LGN region of interest (ROI) based on intrinsic shape analysis of a fiber bundle computed from a seed region at the optic chiasm to a target at the primary visual cortex. By combining automatically identified ROIs and FOD-based tractography, we obtain a fully automated system to compute the main components of the retinofugal pathway, including the optic tract and the optic radiation. We apply our method to the multi-shell HARDI data of 215 subjects from the Human Connectome Project (HCP). Through comparisons with post-mortem dissection measurements, we demonstrate the retinotopic

  15. Automation of peak-tracking analysis of stepwise perturbed NMR spectra

    Energy Technology Data Exchange (ETDEWEB)

    Banelli, Tommaso; Vuano, Marco [Università di Udine, Dipartimento di Area Medica (Italy); Fogolari, Federico [INBB (Italy); Fusiello, Andrea [Università di Udine, Dipartimento Politecnico di Ingegneria e Architettura (Italy); Esposito, Gennaro [INBB (Italy); Corazza, Alessandra, E-mail: alessandra.corazza@uniud.it [Università di Udine, Dipartimento di Area Medica (Italy)

    2017-02-15

    We describe a new algorithmic approach able to automatically pick and track the NMR resonances of a large number of 2D NMR spectra acquired during stepwise variation of a physical parameter. The method has been named Trace in Track (TinT), referring to the idea that a Gaussian decomposition traces peaks within the tracks recognised through 3D mathematical morphology. It is capable of determining the evolution of the chemical shifts, intensities and linewidths of each tracked peak. The performance obtained in terms of track reconstruction and correct assignment on realistic synthetic spectra was above 90% when a noise level similar to that of experimental data was considered. TinT was applied successfully to several protein systems during a temperature ramp in isotope exchange experiments. A comparison with a state-of-the-art algorithm showed promising results for large numbers of spectra and low signal-to-noise ratios, when the graduality of the perturbation is appropriate. TinT can be applied to different kinds of high-throughput chemical shift mapping experiments, with quasi-continuous variations, in which a quantitative automated recognition is crucial.
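
    The Gaussian decomposition at the heart of TinT can be illustrated with a one-peak fit. The sketch below uses SciPy on a synthetic trace; the peak position, noise level and the nearest-fit linking note are illustrative assumptions rather than the published algorithm.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, amp, x0, width):
        return amp * np.exp(-0.5 * ((x - x0) / width) ** 2)

    # Hypothetical 1-D slice through a 2D NMR peak at one perturbation step.
    rng = np.random.default_rng(0)
    ppm = np.linspace(7.0, 7.5, 200)
    trace = gaussian(ppm, 1.0, 7.24, 0.02) + rng.normal(0, 0.02, ppm.size)

    # The fit recovers chemical shift (x0), intensity (amp) and linewidth (width);
    # repeating per step and linking nearest fits would yield a peak track.
    popt, _ = curve_fit(gaussian, ppm, trace,
                        p0=[trace.max(), ppm[trace.argmax()], 0.05])
    amp, shift, width = popt
    print(f"shift {shift:.3f} ppm, intensity {amp:.2f}, "
          f"linewidth {width * 2.355:.3f} ppm FWHM")   # sigma -> FWHM
    ```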

  16. Design and development of a phantom for tomosynthesis with potential for automated analysis via the cloud.

    Science.gov (United States)

    Goodenough, David; Levy, Josh; Olafsdottir, Hildur; Olafsson, Ingvi

    2018-03-06

    This paper describes the development of a phantom for tomosynthesis with potential for automated analysis via the cloud. Several studies are underway to investigate the effectiveness of tomosynthesis mammographic image screening, including the large TMIST project funded by the National Cancer Institute (https://www.cancer.gov/about-cancer/treatment/clinical-trials/nci-supported/tmist). The development of the phantom described in this paper follows initiatives from the FDA, the AAPM TG245 task group, and the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services (EUREF) committee report, noting that no formal endorsement or recommendation for use has been sought, or granted, by any of these groups. This paper reports on the possibility of using this newly developed tomosynthesis phantom for quality assurance and field testing of image performance, including remote monitoring of DBT system performance, e.g., via transmission over the cloud. The phantom includes tests for: phantom positioning and alignment (important for remote analysis), scan geometry (x and y), chest wall offset, scan slice width and Slice Sensitivity Profile (SSP(z)), scan slice incrementation (z), z-axis geometry bead, low contrast detectability using low contrast spheres, spatial resolution via the Point Spread Function (PSF), image uniformity, Signal-to-Noise Ratio (SNR), and Contrast-to-Noise Ratio (CNR) via readings over an aluminum square. The phantom is designed for use with automated analysis via transmission of images over the cloud, and the analysis package includes tests of positioning accuracy (roll, pitch, and yaw). Data are shown from several commercial tomosynthesis scanners including Fuji, GE, Hologic, IMS-Giotti, and Siemens; however, the focus of this paper is on phantom design rather than direct commercial comparisons, and wherever possible the identity of the data is anonymized. Results of automated analysis of
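
    Of the listed metrics, SNR and CNR are simple region-of-interest statistics and easy to sketch. The NumPy example below is a generic illustration; the ROI positions, the simulated slice and the exact formulas (mean/σ of a uniform region for SNR, |ΔM|/σ for CNR) are common conventions assumed here, not the package's published definitions.

    ```python
    import numpy as np

    def snr_cnr(image, signal_roi, background_roi):
        """Conventional SNR/CNR from two rectangular ROIs, e.g. a reading over
        the aluminum square versus an adjacent uniform region."""
        sig = image[signal_roi]
        bkg = image[background_roi]
        snr = bkg.mean() / bkg.std()                      # uniformity-region SNR
        cnr = abs(sig.mean() - bkg.mean()) / bkg.std()    # contrast-to-noise ratio
        return snr, cnr

    # Hypothetical reconstructed tomosynthesis slice with a brighter insert.
    rng = np.random.default_rng(0)
    img = rng.normal(100, 5, (256, 256))
    img[100:140, 100:140] += 30

    snr, cnr = snr_cnr(img,
                       (slice(100, 140), slice(100, 140)),
                       (slice(10, 50), slice(10, 50)))
    print(f"SNR={snr:.1f}, CNR={cnr:.1f}")
    ```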

  17. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
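
    The quoted rate constant comes from fitting exponential release curves to the extraction-MS time series. A minimal SciPy sketch of such a fit on synthetic data follows; the saturating first-order form a·(1 − e^(−kt)) is an assumption about the fitted function, which the abstract does not specify.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # First-order release model: the MS signal approaches a plateau as the
    # tablet dissolves; k is the release rate constant.
    def release(t, a, k):
        return a * (1.0 - np.exp(-k * t))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 36)                           # hours, one point per cycle
    y = release(t, 1.0, 0.43) + rng.normal(0, 0.02, t.size)  # synthetic profile

    popt, pcov = curve_fit(release, t, y, p0=[1.0, 0.1])
    a, k = popt
    print(f"k = {k:.2f} ± {np.sqrt(pcov[1, 1]):.2f} 1/h")
    ```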

  18. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    International Nuclear Information System (INIS)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L.

    2015-01-01

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates the

  19. A Study of Automation for Examination Analysis of Inservice Inspection for Nuclear Power Plant (I)

    International Nuclear Information System (INIS)

    Kim, W.

    1985-01-01

    Korea is a developing country that does not possess natural resources for traditional energy, such as oil and gas, so nuclear energy is the single most reliable source available for closing the energy gap. For this reason, it is unavoidable to construct nuclear power plants and to develop the related nuclear technology. The operating rate of large nuclear power facilities depends upon the performance of the work system through design and construction, and also upon the applied technology. Safety and reliability are the most important elements in the operation of a nuclear power plant. In view of this, nuclear power plants are subjected to severe examinations during preservice and inservice inspection. This study provides an automation of the analysis for the volumetric examination that is required for nuclear power plant components. It is composed as follows: I. Introduction. II. Inservice Inspection of Nuclear Power Plants: general requirements; principles and methods of ultrasonic testing; study of flaw evaluation and design of a classifying formula for flaws. III. Design of Automation for Flaw Evaluation. IV. An Example. V. Conclusion. The main points are the classification of flaws, the formula for classifying flaws, and the design of the automation. Owing to such automated design, more time can be allocated to practical testing than to the evaluation of defects, protecting against subjective bias of the tester and against miscalculation in the various steps of computation. Furthermore, adopting this method allows more test data to be retained and comparatively evaluated during successive inspection intervals. In spite of the limitations of the testing method and its required application to test components, it provides a useful approach to flaw evaluation for volumetric examination. Owing to the characteristics of the nuclear power plant, which is a highly skill-intensive industry with huge systems, the

  20. Automated high-speed video analysis of the bubble dynamics in subcooled flow boiling

    Energy Technology Data Exchange (ETDEWEB)

    Maurus, Reinhold; Ilchenko, Volodymyr; Sattelmayer, Thomas [Technische Univ. Muenchen, Lehrstuhl fuer Thermodynamik, Garching (Germany)

    2004-04-01

    Subcooled flow boiling is a commonly applied technique for achieving efficient heat transfer. In the study, an experimental investigation in the nucleate boiling regime was performed for water circulating in a closed loop at atmospheric pressure. The test section consists of a rectangular channel with a one-side heated copper strip and very good optical access. For the optical observation of the bubble behaviour, high-speed cinematography is used. Automated image processing and analysis algorithms developed by the authors were applied for a wide range of mass flow rates and heat fluxes in order to extract characteristic length and time scales of the bubbly layer during the boiling process. Using this methodology, a huge number of bubble cycles could be analysed. The structure of the developed algorithms for the detection of the bubble diameter, the bubble lifetime, the lifetime after the detachment process and the waiting time between two bubble cycles is described. Subsequently, the results from using these automated procedures are presented. A remarkable novelty is the presentation of all results as distribution functions. This is of physical importance because the commonly applied spatial and temporal averaging leads to a loss of information and, moreover, to an unjustified deterministic view of the boiling process, which exhibits in reality a very wide spread of bubble sizes and characteristic times. The results show that the mass flux dominates the temporal bubble behaviour. An increase of the liquid mass flux reveals a strong decrease of the bubble lifetime and waiting time. In contrast, the variation of the heat flux has a much smaller impact. It is shown in addition that the investigation of the bubble history using automated algorithms delivers novel information with respect to the bubble lift-off probability. (Author)

  1. Automated high-speed video analysis of the bubble dynamics in subcooled flow boiling

    International Nuclear Information System (INIS)

    Maurus, Reinhold; Ilchenko, Volodymyr; Sattelmayer, Thomas

    2004-01-01

    Subcooled flow boiling is a commonly applied technique for achieving efficient heat transfer. In the study, an experimental investigation in the nucleate boiling regime was performed for water circulating in a closed loop at atmospheric pressure. The test section consists of a rectangular channel with a one-side heated copper strip and very good optical access. For the optical observation of the bubble behaviour, high-speed cinematography is used. Automated image processing and analysis algorithms developed by the authors were applied for a wide range of mass flow rates and heat fluxes in order to extract characteristic length and time scales of the bubbly layer during the boiling process. Using this methodology, a huge number of bubble cycles could be analysed. The structure of the developed algorithms for the detection of the bubble diameter, the bubble lifetime, the lifetime after the detachment process and the waiting time between two bubble cycles is described. Subsequently, the results from using these automated procedures are presented. A remarkable novelty is the presentation of all results as distribution functions. This is of physical importance because the commonly applied spatial and temporal averaging leads to a loss of information and, moreover, to an unjustified deterministic view of the boiling process, which exhibits in reality a very wide spread of bubble sizes and characteristic times. The results show that the mass flux dominates the temporal bubble behaviour. An increase of the liquid mass flux reveals a strong decrease of the bubble lifetime and waiting time. In contrast, the variation of the heat flux has a much smaller impact. It is shown in addition that the investigation of the bubble history using automated algorithms delivers novel information with respect to the bubble lift-off probability

  2. Technical study for the automation and control of processes of the chemical processing plant for liquid radioactive waste at Racso Nuclear Center

    International Nuclear Information System (INIS)

    Quevedo D, M.; Ayala S, A.

    1997-01-01

    The purpose of this study is to introduce the development of an automation and control system in a chemical processing plant for liquid radioactive waste of low and medium activity. The control system established for the chemical processing plant at the RACSO Nuclear Center is described. It is an on-off, sequential-type system with feedback. This type of control has been chosen according to the volumes to be treated at the plant, as processing is carried out in batches. The system will be governed by a modular programmable logic controller (PLC) with a minimum of 24 digital inputs, 1 analog input, 16 digital outputs and 1 analog output. The digital inputs and outputs serve the level sensors of the tanks and the control of the solenoid-type electrovalves; the analog input and output have been assigned to pH control. The overall system has been divided into three control loops. The loops considered for the operation of the plant are described; the plant has storage, conditioning, processing and clarifying tanks. National Instruments' Lookout software has been used for simulation, constituting an important tool not only for the design phase but also for the operational one, since this software will be used as the SCADA system. Finally, the advantages and benefits of this automation system are analyzed: radiation doses received by occupationally exposed workers are reduced and the reliability of the system's operation is increased. (authors)
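
    The on-off (bang-bang) control behaviour described above is simple to sketch in software. The toy loop below is purely illustrative: the level thresholds, tank dynamics and hysteresis band are invented, and a real implementation would live in PLC ladder logic or function blocks rather than Python.

    ```python
    # Minimal sketch of on-off level control with hysteresis.

    def onoff_valve(level, low, high, valve_open):
        """Open the feed valve below `low`, close it above `high`; the
        hysteresis band prevents chattering around a single setpoint."""
        if level <= low:
            return True
        if level >= high:
            return False
        return valve_open            # inside the band: keep the current state

    level, valve = 0.2, False
    for step in range(10):
        valve = onoff_valve(level, low=0.3, high=0.9, valve_open=valve)
        level += 0.15 if valve else -0.05        # crude fill/drain dynamics
        level = max(0.0, min(1.0, level))
        print(f"step {step}: level={level:.2f} valve={'open' if valve else 'closed'}")
    ```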

  3. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  4. CEST ANALYSIS: AUTOMATED CHANGE DETECTION FROM VERY-HIGH-RESOLUTION REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    M. Ehlers

    2012-08-01

    A fast detection, visualization and assessment of change in areas of crisis or catastrophe are important requirements for the coordination and planning of help. Through the availability of new satellite and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for better detection, delineation and visualization of change. For automated change detection, a large number of algorithms has been proposed and developed. From previous studies, however, it is evident that to date no single algorithm has the potential for being a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the new generation of satellites with very high spatial resolution. The method incorporates frequency-based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band-pass filters can be applied to identify the relevant frequency information for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band-pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain, and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse difference moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location. CEST
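
    The FFT band-pass step can be sketched generically. The NumPy example below filters a co-registered image pair and differences the results; the radial pass-band, the synthetic images and the 3σ change threshold are assumptions for illustration, not CEST's published settings.

    ```python
    import numpy as np

    def bandpass_fft(image, r_low, r_high):
        """Keep only spatial frequencies whose radius (in FFT index units)
        lies in [r_low, r_high]; a generic frequency-domain band-pass."""
        F = np.fft.fftshift(np.fft.fft2(image))
        h, w = image.shape
        yy, xx = np.ogrid[:h, :w]
        r = np.hypot(yy - h / 2, xx - w / 2)
        F[(r < r_low) | (r > r_high)] = 0.0
        return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

    # Hypothetical co-registered image pair; differencing the filtered images
    # highlights structures that appeared or vanished between the two dates.
    rng = np.random.default_rng(0)
    img_t1 = rng.random((256, 256))
    img_t2 = img_t1.copy()
    img_t2[80:120, 80:120] += 0.5                  # simulated new structure

    change = np.abs(bandpass_fft(img_t2, 5, 60) - bandpass_fft(img_t1, 5, 60))
    mask = change > change.mean() + 3 * change.std()
    ```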

  5. Automated CO2 extraction from air for clumped isotope analysis in the atmo- and biosphere

    Science.gov (United States)

    Hofmann, Magdalena; Ziegler, Martin; Pons, Thijs; Lourens, Lucas; Röckmann, Thomas

    2015-04-01

    The conventional stable isotope ratios 13C/12C and 18O/16O in atmospheric CO2 are a powerful tool for unraveling the global carbon cycle. In recent years, it has been suggested that the abundance of the very rare isotopologue 13C18O16O at m/z 47 might be a promising tracer to complement conventional stable isotope analysis of atmospheric CO2 [Affek and Eiler, 2006; Affek et al. 2007; Eiler and Schauble, 2004; Yeung et al., 2009]. Here we present an automated analytical system that is designed for clumped isotope analysis of atmo- and biospheric CO2. The carbon dioxide gas is quantitatively extracted from about 1.5 L of air (ATP). The automated stainless steel extraction and purification line consists of three main components: (i) a drying unit (a magnesium perchlorate unit and a cryogenic water trap), (ii) two CO2 traps cooled with liquid nitrogen [Werner et al., 2001] and (iii) a GC column packed with Porapak Q that can be cooled with liquid nitrogen to -30°C during purification and heated up to 230°C between two extraction runs. After CO2 extraction and purification, the CO2 is automatically transferred to the mass spectrometer. Mass spectrometric analysis of the 13C18O16O abundance is carried out in dual inlet mode on a MAT 253 mass spectrometer. Each analysis generally consists of 80 change-over cycles. Three additional Faraday cups were added to the mass spectrometer for simultaneous analysis of the mass-to-charge ratios 44, 45, 46, 47, 48 and 49. The reproducibility of δ13C, δ18O and Δ47 for repeated CO2 extractions from air is in the range of 0.11‰ (SD), 0.18‰ (SD) and 0.02‰ (SD), respectively. This automated CO2 extraction and purification system will be used to analyse the clumped isotopic signature in atmospheric CO2 (tall tower, Cabauw, Netherlands) and to study the clumped isotopic fractionation during photosynthesis (leaf chamber experiments) and soil respiration. References Affek, H. P., Xu, X. & Eiler, J. M., Geochim. Cosmochim. Acta 71, 5033

  6. Automated impedance-manometry analysis detects esophageal motor dysfunction in patients who have non-obstructive dysphagia with normal manometry

    NARCIS (Netherlands)

    Nguyen, N. Q.; Holloway, R. H.; Smout, A. J.; Omari, T. I.

    2013-01-01

    Background: Automated integrated analysis of impedance and pressure signals has been reported to identify patients at risk of developing dysphagia post fundoplication. This study aimed to investigate this analysis in the evaluation of patients with non-obstructive dysphagia (NOD) and normal

  7. Mapping of brain activity by automated volume analysis of immediate early genes

    Science.gov (United States)

    Renier, Nicolas; Adams, Eliza L.; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E.; Kadiri, Lolahon; Venkataraju, Kannan Umadevi; Zhou, Yu; Wang, Victoria X.; Tang, Cheuk Y.; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-01-01

    Summary: Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization and quantification of the activity of all neurons across the entire brain, which has to date not been achieved in the mammalian brain. We introduce a pipeline for high-speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Lastly, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. PMID:27238021

  8. Optimisation of automated ribosomal intergenic spacer analysis for the estimation of microbial diversity in fynbos soil

    Directory of Open Access Journals (Sweden)

    Karin Jacobs

    2010-07-01

    Automated ribosomal intergenic spacer analysis (ARISA) has become a commonly used molecular technique for the study of microbial populations in environmental samples. The reproducibility and accuracy of ARISA, with and without the polymerase chain reaction (PCR), are important aspects that influence the results and effectiveness of these techniques. We used the primer set ITS4/ITS5 for ARISA to assess the fungal community composition of two sites situated in the Sand Fynbos. The primer set proved to deliver reproducible ARISA profiles of the fungal community composition, with little variation observed between ARISA-PCRs. Variation occurring in a sample due to repeated DNA extraction is expected for ecological studies. This reproducibility makes ARISA a useful tool for the assessment and comparison of diversity in ecological samples. In this paper, we also offer specific suggestions concerning the binning strategy for the analysis of ARISA profiles.
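
    Binning turns continuous fragment-length calls into comparable presence/absence profiles. The NumPy sketch below shows the general idea with invented fragment lengths and a 2 bp fixed-width window; both are assumptions, and the paper's own binning recommendations should be consulted for real analyses.

    ```python
    import numpy as np

    # Hypothetical ARISA fragment lengths (bp) from two profiles; fixed-width
    # binning absorbs small run-to-run drift in fragment-size calling.
    profile_a = np.array([312.4, 355.1, 410.8, 512.3])
    profile_b = np.array([312.9, 355.6, 410.1, 611.7])

    edges = np.arange(50, 1200, 2.0)               # 2 bp bins across the size range
    hist_a, _ = np.histogram(profile_a, bins=edges)
    hist_b, _ = np.histogram(profile_b, bins=edges)

    # Presence/absence comparison of the binned profiles.
    shared = np.sum((hist_a > 0) & (hist_b > 0))
    total = np.sum((hist_a > 0) | (hist_b > 0))
    print(f"{shared}/{total} occupied bins shared between the two profiles")
    ```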

  9. ANALYSIS OF SAMPLES FROM TANK 5F CHEMICAL CLEANING

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M.; Fink, S.

    2011-03-07

    The Savannah River Site (SRS) is preparing Tank 5F for closure. The first step in preparing the tank for closure is mechanical sludge removal. Following mechanical sludge removal, SRS performed chemical cleaning with oxalic acid to remove the sludge heel. Personnel are currently assessing the effectiveness of the chemical cleaning. SRS personnel collected liquid samples during chemical cleaning and submitted them to Savannah River National Laboratory (SRNL) for analysis. Following chemical cleaning, they collected a solid sample (also known as a 'process sample') and submitted it to SRNL for analysis. The authors analyzed these samples to assess the effectiveness of the chemical cleaning process. The conclusions from this work are: (1) With the exception of iron, the dissolution of sludge components from Tank 5F agreed with results from the actual waste demonstration performed in 2007. The fraction of iron removed from Tank 5F by chemical cleaning was significantly less than the fraction removed in the SRNL demonstrations. The likely cause of this difference is the high pH following the first oxalic acid strike. (2) Most of the sludge mass remaining in the tank is iron and nickel. (3) The remaining sludge contains approximately 26 kg of barium, 37 kg of chromium, and 37 kg of mercury. (4) Most of the radioactivity remaining in the residual material is from beta emitters and 90Sr. (5) The chemical cleaning removed more than ~90% of the uranium isotopes and 137Cs. (6) The chemical cleaning removed ~70% of the neptunium, ~83% of the 90Sr, and ~21% of the 60Co. (7) The chemical cleaning removed less than 10% of the plutonium, americium, and curium isotopes. (8) The chemical cleaning removed more than 90% of the aluminium, calcium, and sodium from the tank. (9) The cleaning operations removed 61% of lithium, 88% of non-radioactive strontium, and 65% of zirconium. The 90Sr and non-radioactive strontium were

  10. Automated Analysis and Classification of Histological Tissue Features by Multi-Dimensional Microscopic Molecular Profiling.

    Directory of Open Access Journals (Sweden)

    Daniel P Riordan

    Characterization of the molecular attributes and spatial arrangements of cells and features within complex human tissues provides a critical basis for understanding processes involved in development and disease. Moreover, the ability to automate steps in the analysis and interpretation of histological images that currently require manual inspection by pathologists could revolutionize medical diagnostics. Toward this end, we developed a new imaging approach called multidimensional microscopic molecular profiling (MMMP) that can measure several independent molecular properties in situ at subcellular resolution for the same tissue specimen. MMMP involves repeated cycles of antibody or histochemical staining, imaging, and signal removal, which ultimately can generate information analogous to a multidimensional flow cytometry analysis on intact tissue sections. We performed a MMMP analysis on a tissue microarray containing a diverse set of 102 human tissues using a panel of 15 informative antibody and 5 histochemical stains plus DAPI. Large-scale unsupervised analysis of MMMP data, and visualization of the resulting classifications, identified molecular profiles that were associated with functional tissue features. We then directly annotated H&E images from this MMMP series such that canonical histological features of interest (e.g., blood vessels, epithelium, red blood cells) were individually labeled. By integrating image annotation data, we identified molecular signatures that were associated with specific histological annotations and we developed statistical models for automatically classifying these features. The classification accuracy for automated histology labeling was objectively evaluated using a cross-validation strategy, and significant accuracy (with a median per-pixel rate of 77% per feature from 15 annotated samples) was obtained for de novo feature prediction. These results suggest that high-dimensional profiling may advance the

  11. The Development Of Mathematical Model For Automated Fingerprint Identification Systems Analysis

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2001-01-01

    A fingerprint has a strongly oriented and periodic structure composed of dark lines of raised skin (ridges) and clear lines of lowered skin (furrows) that twist to form a distinct pattern. Although the manner in which the ridges flow is distinctive, other characteristics of the fingerprint, called minutiae, are what are most unique to the individual. These features are particular patterns consisting of terminations or bifurcations of the ridges. To assert whether or not two fingerprints are from the same finger, experts detect those minutiae. AFIS (Automated Fingerprint Identification Systems) extract and compare these features to determine a match. The classic methods of fingerprint recognition are not suitable for direct implementation in the form of computer algorithms. The creation of a model of the finger was, however, a necessary step in the development of new and better analysis algorithms. This paper presents a new numerical method for fingerprint simulation based on a mathematical model of the arrangement of dermatoglyphics and the creation of minutiae. The paper also describes the design and implementation of an automated fingerprint identification system that operates in two stages: minutiae extraction and minutiae matching.
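
    The matching stage can be illustrated with a toy sketch. The following is a minimal, hypothetical minutiae matcher, assuming the two prints are already registered (no rotation or translation compensation) and that each minutia is reduced to a position and a ridge angle; it is not the algorithm of the paper.

```python
import math

def match_minutiae(set_a, set_b, dist_tol=10.0, angle_tol=math.pi / 12):
    """Greedily pair minutiae (x, y, angle) from two registered prints.

    Returns matched pairs divided by the size of the smaller set.
    """
    unused = list(set_b)
    matched = 0
    for (xa, ya, ta) in set_a:
        best, best_d = None, dist_tol
        for m in unused:
            xb, yb, tb = m
            d = math.hypot(xa - xb, ya - yb)
            # Angular difference wrapped into [-pi, pi].
            dt = abs((ta - tb + math.pi) % (2 * math.pi) - math.pi)
            if d <= best_d and dt <= angle_tol:
                best, best_d = m, d
        if best is not None:
            unused.remove(best)
            matched += 1
    return matched / min(len(set_a), len(set_b))

# Toy example: the second print is the first with small positional noise.
a = [(10, 20, 0.3), (40, 80, 1.2), (65, 30, 2.0)]
b = [(12, 21, 0.33), (41, 78, 1.25), (120, 5, 0.9)]
print(match_minutiae(a, b))  # ~0.67: two of three minutiae pair up
```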

  12. Automated Microfluidic Platform for Serial Polymerase Chain Reaction and High-Resolution Melting Analysis.

    Science.gov (United States)

    Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T

    2016-06-01

    We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. © 2015 Society for Laboratory Automation and Screening.
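
    To illustrate the HRMA step, here is a minimal sketch of melt-curve analysis on synthetic data: the melting temperature is estimated as the peak of the negative derivative -dF/dT of a smoothed fluorescence curve. The curve shape, noise level, and smoothing window are all invented for the example.

```python
import numpy as np

# Synthetic melt curve: fluorescence drops sigmoidally around Tm = 84 °C,
# standing in for the camera-derived melt data described above.
rng = np.random.default_rng(0)
temps = np.linspace(70.0, 95.0, 251)
fluor = 1.0 / (1.0 + np.exp((temps - 84.0) / 0.4))
fluor += rng.normal(0.0, 0.002, temps.size)

# Smooth lightly, then take -dF/dT; Tm is the location of its maximum.
kernel = np.ones(7) / 7.0
smoothed = np.convolve(fluor, kernel, mode="same")
neg_deriv = -np.gradient(smoothed, temps)

# Ignore convolution edge effects when locating the peak.
interior = slice(5, -5)
tm = temps[interior][np.argmax(neg_deriv[interior])]
print(f"Estimated Tm: {tm:.1f} °C")
```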

  13. A computer based, automated analysis of process and outcomes of diabetic care in 23 GP practices.

    LENUS (Irish Health Repository)

    Hill, F

    2012-02-01

    The predicted prevalence of diabetes in Ireland by 2015 is 190,000. Structured diabetes care in general practice has outcomes equivalent to secondary care, and good diabetes care has been shown to be associated with the use of electronic healthcare records (EHRs). This automated analysis of EHRs in 23 practices took 10 minutes per practice compared with 15 hours per practice for manual searches. Data were extracted for 1901 type II diabetics. There was valid data for >80% of patients for 6 of the 9 key indicators in the previous year. 543 (34%) had an HbA1c > 7.5%, 142 (9%) had a total cholesterol > 6 mmol/l, 83 (6%) had an LDL cholesterol > 4 mmol/l, 367 (22%) had triglycerides > 2.2 mmol/l and 162 (10%) had blood pressure > 160/100 mmHg. Data quality and key indicators of care compare well with manual audits in Ireland and the U.K. Electronic healthcare records and automated audits should be a feature of all chronic disease management programs.
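
    A query of this kind is straightforward to automate once the EHR data are in tabular form. Below is a minimal, hypothetical sketch in Python/pandas; the column names, the handful of rows, and the reading of "BP > 160/100" as both limits exceeded are invented for illustration.

```python
import pandas as pd

# Hypothetical extract of an EHR audit table; column names are invented.
df = pd.DataFrame({
    "hba1c_pct":   [6.9, 8.1, 7.2, 9.0],
    "chol_mmol_l": [4.2, 6.5, 5.1, 5.9],
    "sbp": [130, 150, 170, 128],
    "dbp": [80, 95, 102, 78],
})

# Flag patients outside the key-indicator targets used in the audit.
flags = pd.DataFrame({
    "hba1c_gt_7.5":  df["hba1c_pct"] > 7.5,
    "chol_gt_6":     df["chol_mmol_l"] > 6.0,
    # Interpreted here as both limits exceeded (an assumption).
    "bp_gt_160_100": (df["sbp"] > 160) & (df["dbp"] > 100),
})
summary = flags.mean().mul(100).round(1)  # percentage of patients per flag
print(summary)
```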

  14. Automated rapid analysis for dioxins and PCBs in food, feedingstuff and environmental matrices

    Energy Technology Data Exchange (ETDEWEB)

    Hoelscher, K.; Maulshagen, A.; Behnisch, P.A. [eurofins-GfA, Muenster (Germany); Shirkhan, H. [Fluid Management Systems Inc., Waltham, MA (United States); Lieck, G. [University of Applied Science, Steinfurt (Germany)

    2004-09-15

    Today there is a need to develop high-throughput, specific and sensitive methods for the determination of dioxins, dioxin-like PCBs and indicator-PCBs to ensure their rapid and reliable quantification in several kinds of food and feedingstuffs. Ideally, one method would fit several matrices with the highest quality standards and allow cost- and time-effective sample handling. In practice, however, the numerous different PCDD/Fs, dioxin-like PCBs and indicator-PCBs, as well as the large concentration range to be covered, make this quite difficult to fulfill. The implementation of an automated sample-treatment flow process ('dioxin street'), which contains an accelerated solvent extraction (ASE), a Power-Prep workstation (Fluid Management Systems, FMS) for automated clean-up, a Syncore Polyvap (Buechi, Switzerland) for solvent evaporation and HRGC/HRMS (VG AutoSpec) analysis as detection method for several kinds of different matrices is described here. The aim of the present study is to confirm the high quality, low limits of quantification (LOQ), low PCB background levels and reliability of the Power-Prep system in combination with ASE extraction for dioxins, dioxin-like PCBs and indicator-PCBs.

  15. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO{sub 2} data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.
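
    The integration step, reducing many per-point mineral identifications to a volumetric modal analysis, amounts to point counting. A minimal sketch follows, with an invented label list standing in for the SEM/EDS point data.

```python
from collections import Counter

# Hypothetical per-point mineral identifications from an automated
# SEM scan of one polished grain mount (one label per analysis point).
points = ["quartz", "kaolinite", "quartz", "pyrite", "quartz",
          "illite", "calcite", "quartz", "pyrite", "kaolinite"]

# With a uniform point grid, the share of points per phase estimates the
# volumetric modal percentage of that phase in the mineral matter.
counts = Counter(points)
total = sum(counts.values())
for mineral, n in counts.most_common():
    print(f"{mineral:10s} {100.0 * n / total:5.1f} vol%")
```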

  16. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    Science.gov (United States)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum, deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro- or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.
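
    The core diagnostic behind CSP-type methods can be sketched on a deterministic toy problem: eigenvalues of the reaction Jacobian separate fast, quickly exhausted modes from slow ones. The two-species model and rate constants below are invented, and the sketch does not implement the stochastic SCSP framework itself.

```python
import numpy as np

# A minimal stiff two-species reaction model (hypothetical rates):
# fast relaxation of y[0], slow evolution of y[1].
k_fast, k_slow = 1.0e3, 1.0

def jacobian(y):
    # Constant here; in a real kinetic model it depends on the state y.
    return np.array([[-k_fast, k_slow],
                     [0.1,     -k_slow]])

eigvals = np.linalg.eigvals(jacobian(np.array([1.0, 1.0])))
timescales = np.sort(1.0 / np.abs(eigvals.real))
print("mode timescales:", timescales)

# A ratio >> 1 signals exhausted fast modes that CSP-type methods
# project out before integrating the remaining slow dynamics.
print("stiffness ratio:", timescales[-1] / timescales[0])
```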

  17. Predicting blood transfusion using automated analysis of pulse oximetry signals and laboratory values.

    Science.gov (United States)

    Shackelford, Stacy; Yang, Shiming; Hu, Peter; Miller, Catriona; Anazodo, Amechi; Galvagno, Samuel; Wang, Yulei; Hartsky, Lauren; Fang, Raymond; Mackenzie, Colin

    2015-10-01

    Identification of hemorrhaging trauma patients and prediction of blood transfusion needs in near real time will expedite care of the critically injured. We hypothesized that automated analysis of pulse oximetry signals in combination with laboratory values and vital signs obtained at the time of triage would predict the need for blood transfusion with accuracy greater than that of triage vital signs or pulse oximetry analysis alone. Continuous pulse oximetry signals were recorded for directly admitted trauma patients with an abnormal prehospital shock index (heart rate [HR] / systolic blood pressure) of 0.62 or greater. Predictions of blood transfusion within 24 hours were compared using DeLong's method for areas under the receiver operating characteristic (AUROC) curves to determine the optimal combination of triage vital signs (prehospital HR + systolic blood pressure), pulse oximetry features (40 waveform features, O2 saturation, HR), and laboratory values (hematocrit, electrolytes, bicarbonate, prothrombin time, international normalization ratio, lactate) in multivariate logistic regression models. We enrolled 1,191 patients; 339 were excluded because of incomplete data; 40 received blood within 3 hours; and 14 received massive transfusion. Triage vital signs predicted the need for transfusion within 3 hours (AUROC, 0.59) and massive transfusion (AUROC, 0.70). Pulse oximetry for 15 minutes predicted transfusion more accurately than triage vital signs for both time frames (3-hour AUROC, 0.74; p = 0.004; massive-transfusion AUROC, 0.88). Adding laboratory values further improved the transfusion prediction (3-hour AUROC, 0.84; massive-transfusion AUROC, 0.91). The combination of pulse oximetry features and laboratory values predicted blood transfusion during trauma resuscitation more accurately than triage vital signs or pulse oximetry analysis alone. Results suggest automated calculations from a noninvasive vital sign monitor interfaced with a point-of-care laboratory device may support clinical decisions by recognizing patients with hemorrhage sufficient to need transfusion. Epidemiologic
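
    The model-comparison step can be sketched as follows: fit logistic regressions on nested feature sets and compare AUROCs. The data below are simulated, and the features are crude stand-ins for shock index and a pulse oximetry waveform feature; a real analysis would score held-out data and use DeLong's test rather than in-sample AUROCs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
transfused = rng.random(n) < 0.1                               # outcome
shock_index = 0.7 + 0.3 * transfused + rng.normal(0, 0.15, n)  # triage-like
oximetry_feat = 1.0 * transfused + rng.normal(0, 0.8, n)       # waveform-like

def auroc(features):
    # In-sample AUROC for a logistic model on the given feature columns.
    X = np.column_stack(features)
    model = LogisticRegression().fit(X, transfused)
    return roc_auc_score(transfused, model.predict_proba(X)[:, 1])

print("triage only:      ", round(auroc([shock_index]), 2))
print("triage + pulse ox:", round(auroc([shock_index, oximetry_feat]), 2))
```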

  18. RoboSCell: An automated single cell arraying and analysis instrument

    KAUST Repository

    Sakaki, Kelly

    2009-09-09

    Single cell research has the potential to revolutionize experimental methods in biomedical sciences and contribute to clinical practices. Recent studies suggest analysis of single cells reveals novel features of intracellular processes, cell-to-cell interactions and cell structure. The methods of single cell analysis require mechanical resolution and accuracy that is not possible using conventional techniques. Robotic instruments and novel microdevices can achieve higher throughput and repeatability; however, the development of such instrumentation is a formidable task. A void exists in the state-of-the-art for automated analysis of single cells. With the increase in interest in single cell analyses in stem cell and cancer research the ability to facilitate higher throughput and repeatable procedures is necessary. In this paper, a high-throughput, single cell microarray-based robotic instrument, called the RoboSCell, is described. The proposed instrument employs a partially transparent single cell microarray (SCM) integrated with a robotic biomanipulator for in vitro analyses of live single cells trapped at the array sites. Cells, labeled with immunomagnetic particles, are captured at the array sites by channeling magnetic fields through encapsulated permalloy channels in the SCM. The RoboSCell is capable of systematically scanning the captured cells temporarily immobilized at the array sites and using optical methods to repeatedly measure extracellular and intracellular characteristics over time. The instrument's capabilities are demonstrated by arraying human T lymphocytes and measuring the uptake dynamics of calcein acetoxymethylester, all in a fully automated fashion. © 2009 Springer Science+Business Media, LLC.

  19. Assessment of paclitaxel induced sensory polyneuropathy with "Catwalk" automated gait analysis in mice.

    Directory of Open Access Journals (Sweden)

    Petra Huehnchen

    Full Text Available Neuropathic pain as a symptom of sensory nerve damage is a frequent side effect of chemotherapy. The most common behavioral observation in animal models of chemotherapy induced polyneuropathy is the development of mechanical allodynia, which is quantified with von Frey filaments. The data from one study, however, cannot be easily compared with other studies owing to influences of environmental factors, inter-rater variability and differences in test paradigms. To overcome these limitations, automated quantitative gait analysis was proposed as an alternative, but its usefulness for assessing animals suffering from polyneuropathy has remained unclear. In the present study, we used a novel mouse model of paclitaxel induced polyneuropathy to compare results from electrophysiology and the von Frey method to gait alterations measured with the Catwalk test. To mimic recently improved clinical treatment strategies of gynecological malignancies, we established a mouse model of dose-dense paclitaxel therapy on the common C57Bl/6 background. In this model paclitaxel treated animals developed mechanical allodynia as well as reduced caudal sensory nerve action potential amplitudes indicative of a sensory polyneuropathy. Gait analysis with the Catwalk method detected distinct alterations of gait parameters in animals suffering from sensory neuropathy, revealing a minimized contact of the hind paws with the floor. Treatment of mechanical allodynia with gabapentin improved altered dynamic gait parameters. This study establishes a novel mouse model for investigating the side effects of dose-dense paclitaxel therapy and underlines the usefulness of automated gait analysis as an additional easy-to-use objective test for evaluating painful sensory polyneuropathy.

  20. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-01-01

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples needs to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples as well as mass loading, flow rate and resin performance is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of possible synergistic effects in the presence of iron.

  1. Long-term live cell imaging and automated 4D analysis of drosophila neuroblast lineages.

    Directory of Open Access Journals (Sweden)

    Catarina C F Homem

    Full Text Available The developing Drosophila brain is a well-studied model system for neurogenesis and stem cell biology. In the Drosophila central brain, around 200 neural stem cells called neuroblasts undergo repeated rounds of asymmetric cell division. These divisions typically generate a larger self-renewing neuroblast and a smaller ganglion mother cell that undergoes one terminal division to create two differentiating neurons. Although single mitotic divisions of neuroblasts can easily be imaged in real time, the lack of long term imaging procedures has limited the use of neuroblast live imaging for lineage analysis. Here we describe a method that allows live imaging of cultured Drosophila neuroblasts over multiple cell cycles for up to 24 hours. We describe a 4D image analysis protocol that can be used to extract cell cycle times and growth rates from the resulting movies in an automated manner. We use it to perform lineage analysis in type II neuroblasts where clonal analysis has indicated the presence of a transit-amplifying population that potentiates the number of neurons. Indeed, our experiments verify type II lineages and provide quantitative parameters for all cell types in those lineages. As defects in type II neuroblast lineages can result in brain tumor formation, our lineage analysis method will allow more detailed and quantitative analysis of tumorigenesis and asymmetric cell division in the Drosophila brain.

  2. Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.

    Science.gov (United States)

    Zauber, Henrik; Schulze, Waltraud X

    2012-11-02

    The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in the past years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions on the protein level. Postprocessing and combining peptide intensities of a proteomic data set requires expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries of different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t tests for comparison between treatments. Results are presented in editable graphic formats and in list files.

  3. Chemical analysis of steel by optical emission spectrometry

    International Nuclear Information System (INIS)

    Hayakawa, M.O.; Kajita, T.; Jeszensky, G.

    1981-01-01

    The development of chemical analysis of special steels by the optical emission spectrometry direct-reading method with a computer at Siderurgica N.S. Aparecida S.A. is presented. Results are presented for low-alloy steels and high-speed steel. The contribution of this method to special steel production is also discussed. (Author) [pt

  4. Physico-chemical analysis and sensory evaluation of bread ...

    African Journals Online (AJOL)

    This study carried out the physico-chemical analysis and sensory evaluation of bread produced using different indigenous yeast isolates in order to offer an insight into the overall quality of the bread. Four (4) different yeast species were isolated from sweet orange, pineapple and palm wine. The yeasts were characterized ...

  5. Chemical and antimicrobial analysis of husk fiber aqueous extract ...

    African Journals Online (AJOL)

    Chemical and antimicrobial analysis of husk fiber aqueous extract from Cocos nucifera L. Davi Oliveira e Silva, Gabriel Rocha Martins, Antônio Jorge Ribeiro da Silva, Daniela Sales Alviano, Rodrigo Pires Nascimento, Maria Auxiliadora Coelho Kaplan, Celuta Sales Alviano ...

  6. Bark chemical analysis explains selective bark damage by rodents

    Czech Academy of Sciences Publication Activity Database

    Heroldová, Marta; Jánová, Eva; Suchomel, J.; Purchart, L.; Homolka, Miloslav

    2009-01-01

    Vol. 2, No. 2 (2009), p. 137-140 ISSN 1803-2451 R&D Projects: GA MZe QH72075 Institutional research plan: CEZ:AV0Z60930519 Keywords: bark damage * bark selection * bark chemical analysis * rowan * beech * spruce * mountain forest regeneration Subject RIV: GK - Forestry

  7. High-throughput peptide mass fingerprinting and protein macroarray analysis using chemical printing strategies

    International Nuclear Information System (INIS)

    Sloane, A.J.; Duff, J.L.; Hopwood, F.G.; Wilson, N.L.; Smith, P.E.; Hill, C.J.; Packer, N.H.; Williams, K.L.; Gooley, A.A.; Cole, R.A.; Cooley, P.W.; Wallace, D.B.

    2001-01-01

    We describe a 'chemical printer' that uses piezoelectric pulsing for rapid and accurate microdispensing of picolitre volumes of fluid for proteomic analysis of 'protein macroarrays'. Unlike positive transfer and pin transfer systems, our printer dispenses fluid in a non-contact process that ensures that the fluid source cannot be contaminated by substrate during a printing event. We demonstrate automated delivery of enzyme and matrix solutions for on-membrane protein digestion and subsequent peptide mass fingerprinting (pmf) analysis directly from the membrane surface using matrix-assisted laser-desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). This approach bypasses the more commonly used multi-step procedures, thereby permitting a more rapid procedure for protein identification. We also highlight the advantage of printing different chemistries onto an individual protein spot for multiple microscale analyses. This ability is particularly useful when detailed characterisation of rare and valuable samples is required. Using a combination of PNGase F and trypsin we have mapped sites of N-glycosylation using on-membrane digestion strategies. We also demonstrate the ability to print multiple serum samples in a micro-ELISA format and rapidly screen a protein macroarray of human blood plasma for pathogen-derived antigens. We anticipate that the 'chemical printer' will be a major component of proteomic platforms for high-throughput protein identification and characterisation with widespread applications in biomedical and diagnostic discovery.

  8. Geena 2, improved automated analysis of MALDI/TOF mass spectra.

    Science.gov (United States)

    Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo

    2016-03-02

    Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows a complete control over all parameters, the Bright Search Interface, which leaves to the user the possibility to tune parameters for alignment of spectra, and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other for the identification of a predictor of
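
    The normalization and averaging steps can be sketched compactly. The following assumes toy peak lists with a hypothetical internal standard at m/z 1570.6 and identical peak ordering across replicates (real spectra would first need the alignment step described above); it is not Geena 2's actual algorithm.

```python
import numpy as np

# Toy peak lists: (m/z, intensity) per replicate spectrum of one sample.
replicates = [
    np.array([(1046.5, 800.0), (1570.6, 2000.0), (2465.2, 400.0)]),
    np.array([(1046.6, 900.0), (1570.7, 2500.0), (2465.1, 450.0)]),
]
STANDARD_MZ, MZ_TOL = 1570.6, 0.5  # hypothetical internal standard

def normalize(spectrum):
    """Scale intensities so the internal-standard peak equals 1."""
    mz, inten = spectrum[:, 0], spectrum[:, 1]
    ref = inten[np.abs(mz - STANDARD_MZ) < MZ_TOL]
    return np.column_stack([mz, inten / ref[0]])

# Normalize each replicate, then average peak-by-peak (same peak order
# assumed here; real spectra need an m/z alignment step first).
norm = [normalize(s) for s in replicates]
avg_mz = np.mean([s[:, 0] for s in norm], axis=0)
avg_int = np.mean([s[:, 1] for s in norm], axis=0)
for mz, inten in zip(avg_mz, avg_int):
    print(f"m/z {mz:8.2f}  rel. intensity {inten:.3f}")
```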

  9. Cost-Effectiveness Analysis of an Automated Medication System Implemented in a Danish Hospital Setting

    DEFF Research Database (Denmark)

    Risoer, Bettina Wulff; Lisby, Marianne; Soerensen, Jan

    2017-01-01

    Objectives To evaluate the cost-effectiveness of an automated medication system (AMS) implemented in a Danish hospital setting. Methods An economic evaluation was performed alongside a controlled before-and-after effectiveness study with one control ward and one intervention ward. The primary outcome measure was the number of errors in the medication administration process observed prospectively before and after implementation. To determine the difference in proportion of errors after implementation of the AMS, logistic regression was applied with the presence of error(s) as the dependent variable. Time, group, and interaction between time and group were the independent variables. The cost analysis used the hospital perspective with a short-term incremental costing approach. The total 6-month costs with and without the AMS were calculated as well as the incremental costs. The number
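
    The regression described above can be sketched with simulated data: the coefficient on the time-by-group interaction captures the before/after change on the intervention ward relative to the control ward. Variable names and data below are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "error": rng.integers(0, 2, n),   # 1 = error observed in a dose
    "time":  rng.integers(0, 2, n),   # 0 = before, 1 = after
    "group": rng.integers(0, 2, n),   # 0 = control, 1 = intervention ward
})

# Difference-in-differences style model: the time:group interaction term
# estimates the change attributable to the automated medication system.
model = smf.logit("error ~ time + group + time:group", data=df).fit(disp=0)
print(model.summary().tables[1])
```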

  10. Automated sensitivity analysis of the radionuclide migration code UCB-NE-10.2

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1985-01-01

    The Salt Repository Project (SRP) of the U.S. Department of Energy is performing ongoing performance assessment analyses for the eventual licensing of an underground high-level nuclear waste repository in salt. As part of these studies, sensitivity and uncertainty analyses play a major role in the identification of important parameters and of specific data needs for site characterization. Oak Ridge National Laboratory has supported the SRP in this effort, resulting in the development of an automated procedure for performing large-scale sensitivity analysis using computer calculus. GRESS, the GRadient Enhanced Software System, is a pre-compiler that can process FORTRAN computer codes and add derivative-taking capabilities to the normal calculated results. The GRESS code is described and applied to the code UCB-NE-10.2, which simulates the migration through a sorption medium of the radionuclide members of a decay chain.
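
    The principle behind a computer-calculus tool like GRESS, augmenting each arithmetic operation so derivatives propagate alongside values, can be sketched with a small forward-mode dual-number class. This illustrates the idea only, not the GRESS pre-compiler's FORTRAN mechanics; the model function is invented.

```python
class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together,
    mimicking how a GRESS-style tool augments arithmetic."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (fg)' = f'g + fg'.
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    # Hypothetical stand-in for a code output depending on a parameter k.
    return 3.0 * k * k + 2.0 * k + 1.0

k = Dual(2.0, 1.0)        # seed derivative d(k)/d(k) = 1
out = model(k)
print(out.val, out.der)   # value 17.0 and sensitivity d(out)/dk = 14.0
```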

  11. Automated Text Analysis Based on Skip-Gram Model for Food Evaluation in Predicting Consumer Acceptance.

    Science.gov (United States)

    Kim, Augustine Yongwhi; Ha, Jin Gwan; Choi, Hoduk; Moon, Hyeonjoon

    2018-01-01

    The purpose of this paper is to evaluate food taste, smell, and characteristics from consumers' online reviews. Several studies in food sensory evaluation have been presented for consumer acceptance. However, these studies need a taste descriptive word lexicon, and they are not suitable for analyzing large numbers of evaluators to predict consumer acceptance. In this paper, an automated text analysis method for food evaluation is presented to analyze and compare two recently introduced jjampong ramen types (mixed seafood noodles). To avoid building a sensory word lexicon, consumers' reviews are collected from SNS. Then, by training a word embedding model with the acquired reviews, words in the large amount of review text are converted into vectors. Based on these words represented as vectors, inference is performed to evaluate the taste and smell of the two jjampong ramen types. Finally, the reliability and merits of the proposed food evaluation method are confirmed by a comparison with the results from an actual consumer preference taste evaluation.
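
    A minimal sketch of the embedding step using gensim's Word2Vec, where sg=1 selects the skip-gram architecture named in the title: the review sentences are invented, and the paper's downstream taste/smell inference is not reproduced here.

```python
from gensim.models import Word2Vec

# A few invented review sentences standing in for SNS posts.
reviews = [
    "the broth tastes spicy and smells smoky".split(),
    "noodles are chewy and the soup is spicy".split(),
    "mild seafood flavor with a sweet finish".split(),
    "too salty but the smoky aroma is great".split(),
]

# sg=1 selects the skip-gram architecture.
model = Word2Vec(reviews, vector_size=32, window=3,
                 min_count=1, sg=1, seed=7)

# Words used in similar contexts acquire similar vectors, which is the
# basis for lexicon-free inference about taste and smell.
print(model.wv.most_similar("spicy", topn=3))
```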

  12. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that have been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s were performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient, especially for the sample counting and measurement process. The sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) that consists of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  13. Automation of C-terminal sequence analysis of 2D-PAGE separated proteins

    Directory of Open Access Journals (Sweden)

    P.P. Moerman

    2014-06-01

    Full Text Available Experimental assignment of the protein termini remains essential to define the functional protein structure. Here, we report on the improvement of a proteomic C-terminal sequence analysis method. The approach aims to discriminate the C-terminal peptide in a CNBr digest, where Met-Xxx peptide bonds are cleaved, resulting in internal peptides ending in a homoserine lactone (hsl) derivative. pH-dependent partial opening of the lactone ring results in the formation of doublets for all internal peptides. C-terminal peptides are distinguished as singlet peaks by MALDI-TOF MS and MS/MS is then used for their identification. We present a fully automated protocol established on a robotic liquid-handling station.

  14. Mathematical support for automated geometry analysis of lathe machining of oblique peakless round-nose tools

    Science.gov (United States)

    Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.

    2017-01-01

    Automation of engineering processes requires the development of relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper focuses on developing a procedure for determining the geometry of lathe machining with oblique peakless round-nose tools using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in contrast to the traditional analytic description, which is a very promising advantage for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.

  15. Selection of Filtration Methods in the Analysis of Motion of Automated Guided Vehicle

    Directory of Open Access Journals (Sweden)

    Dobrzańska Magdalena

    2016-08-01

    Full Text Available In this article the issues related to mapping the route and error correction in automated guided vehicle (AGV) movement have been discussed. The nature and size of the disturbances have been determined using runs registered in experimental studies. On the basis of this analysis a number of numerical runs have been generated, which reproduce the runs obtainable during real vehicle movement. The obtained data set has been used for further research. The aim of this paper was to test the selected methods of digital filtering on the same data set and determine their effectiveness. The results of the simulation studies are presented, the effectiveness of the various methods is determined, and conclusions are drawn on this basis.
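
    Comparing filters on a common data set, as done above, can be sketched as follows; the trajectory, noise model, window length, and choice of moving-average and moving-median filters are invented for the example, with RMSE against the known true path as the effectiveness measure.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 200)
true_path = np.sin(0.5 * t)                            # idealized position
measured = true_path + rng.normal(0.0, 0.08, t.size)   # simulated noise

def moving_average(x, w=9):
    return np.convolve(x, np.ones(w) / w, mode="same")

def moving_median(x, w=9):
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([np.median(xp[i:i + w]) for i in range(x.size)])

# Same data set for every filter, effectiveness measured as RMSE.
for name, filt in [("mean", moving_average), ("median", moving_median)]:
    rmse = np.sqrt(np.mean((filt(measured) - true_path) ** 2))
    print(f"{name:6s} filter RMSE: {rmse:.4f}")
```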

  16. Results of Automated Retinal Image Analysis for Detection of Diabetic Retinopathy from the Nakuru Study, Kenya

    DEFF Research Database (Denmark)

    Juul Bøgelund Hansen, Morten; Abramoff, M. D.; Folk, J. C.

    2015-01-01

    Objective Digital retinal imaging is an established method of screening for diabetic retinopathy (DR). It has been established that currently about 1% of the world's blind or visually impaired is due to DR. However, the increasing prevalence of diabetes mellitus and DR is creating an increased workload on those with expertise in grading retinal images. Safe and reliable automated analysis of retinal images may support screening services worldwide. This study aimed to compare the Iowa Detection Program (IDP) ability to detect diabetic eye diseases (DED) to human grading carried out at Moorfields ... predictive value of IDP versus the human grader as reference standard. Results Altogether 3,460 participants were included. 113 had DED, giving a prevalence of 3.3% (95% CI, 2.7-3.9%). Sensitivity of the IDP to detect DED as by the human grading was 91.0% (95% CI, 88.0-93.4%). The IDP ability to detect DED...

  17. Extraction of the number of peroxisomes in yeast cells by automated image analysis.

    Science.gov (United States)

    Niemistö, Antti; Selinummi, Jyrki; Saleem, Ramsey; Shmulevich, Ilya; Aitchison, John; Yli-Harja, Olli

    2006-01-01

    An automated image analysis method for extracting the number of peroxisomes in yeast cells is presented. Two images of the cell population are required for the method: a bright field microscope image from which the yeast cells are detected and the respective fluorescent image from which the number of peroxisomes in each cell is found. The segmentation of the cells is based on clustering the local mean-variance space. The watershed transformation is thereafter employed to separate cells that are clustered together. The peroxisomes are detected by thresholding the fluorescent image. The method is tested with several images of a budding yeast Saccharomyces cerevisiae population, and the results are compared with manually obtained results.
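
    The counting step reduces to thresholding plus connected-component labeling. A minimal sketch on a synthetic frame follows; the spot positions, noise level, and threshold are invented, and cell segmentation is omitted by treating the whole frame as one cell.

```python
import numpy as np
from scipy import ndimage

# Synthetic "fluorescent image": bright spots stand in for peroxisomes.
img = np.zeros((64, 64))
for (r, c) in [(10, 12), (30, 40), (50, 20)]:
    img[r - 1:r + 2, c - 1:c + 2] = 1.0
img += np.random.default_rng(0).normal(0.0, 0.05, img.shape)

# Threshold the fluorescence, then count connected components per cell
# (here the whole frame plays the role of one segmented cell).
mask = img > 0.5
labels, n_spots = ndimage.label(mask)
print("peroxisomes detected:", n_spots)   # -> 3
```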

  18. Semi-automated digital measurement as the method of choice for beta cell mass analysis.

    Directory of Open Access Journals (Sweden)

    Violette Coppens

    Full Text Available Pancreas injury by partial duct ligation (PDL) activates beta cell differentiation and proliferation in adult mouse pancreas but remains controversial regarding the anticipated increase in beta cell volume. Several reports unable to show beta cell volume augmentation in PDL pancreas used automated digital image analysis software. We hypothesized that fully automatic beta cell morphometry without manual micrograph artifact remediation introduces bias and therefore might be responsible for reported discrepancies and controversy. However, our present results prove that standard digital image processing with automatic thresholding is sufficiently robust, albeit less sensitive and less adequate, to demonstrate a significant increase in beta cell volume in PDL versus sham-operated pancreas. We therefore conclude that other confounding factors, such as quality of surgery, selection of samples based on relative abundance of the transcription factor Neurogenin 3 (Ngn3), and tissue processing, give rise to inter-laboratory inconsistencies in beta cell volume quantification in PDL pancreas.

  19. Automated Text Analysis Based on Skip-Gram Model for Food Evaluation in Predicting Consumer Acceptance

    Directory of Open Access Journals (Sweden)

    Augustine Yongwhi Kim

    2018-01-01

    Full Text Available The purpose of this paper is to evaluate food taste, smell, and characteristics from consumers’ online reviews. Several studies in food sensory evaluation have been presented for consumer acceptance. However, these studies need a taste descriptive word lexicon, and they are not suitable for analyzing large numbers of evaluators to predict consumer acceptance. In this paper, an automated text analysis method for food evaluation is presented to analyze and compare two recently introduced jjampong ramen types (mixed seafood noodles). To avoid building a sensory word lexicon, consumers’ reviews are collected from SNS. Then, by training a word embedding model with the acquired reviews, words in the large amount of review text are converted into vectors. Based on these words represented as vectors, inference is performed to evaluate the taste and smell of the two jjampong ramen types. Finally, the reliability and merits of the proposed food evaluation method are confirmed by a comparison with the results from an actual consumer preference taste evaluation.

  20. A novel method for automated grid generation of ice shapes for local-flow analysis

    Science.gov (United States)

    Ogretim, Egemen; Huebsch, Wade W.

    2004-02-01

    Modelling a complex geometry, such as ice roughness, plays a key role in computational flow analysis over rough surfaces. This paper presents two enhancement ideas in modelling roughness geometry for local flow analysis over an aerodynamic surface. The first enhancement is the use of the leading-edge region of an airfoil as a perturbation to a parabola surface. The reasons for using a parabola as the base geometry are that it resembles the airfoil leading edge in the vicinity of its apex and that it allows the use of a lower apparent Reynolds number. The second enhancement makes use of Fourier analysis for modelling complex ice roughness on the leading edge of airfoils. This method of modelling provides an analytical expression which describes the roughness geometry and the corresponding derivatives. The factors affecting the performance of the Fourier analysis were also investigated; it was shown that the number of sine-cosine terms and the number of control points are of importance. Finally, these enhancements are incorporated into an automated grid generation method over the airfoil ice accretion surface. The validations of both enhancements demonstrate that they can improve the current capability of grid generation and computational flow-field analysis around airfoils with ice roughness.
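
    The Fourier-based roughness model can be sketched as a truncated-spectrum fit: keep the first few modes of a sampled height profile to obtain an analytic, differentiable description. The profile, noise, and mode count below are invented.

```python
import numpy as np

# Sampled roughness heights along the leading edge (synthetic stand-in).
n = 256
s = np.linspace(0.0, 1.0, n, endpoint=False)
rng = np.random.default_rng(5)
roughness = (0.05 * np.sin(2 * np.pi * 3 * s)
             + 0.02 * np.cos(2 * np.pi * 7 * s)
             + rng.normal(0.0, 0.005, n))

# Keep the first few Fourier modes: an analytic, differentiable surrogate
# for the ice-roughness geometry, in the spirit of the approach above.
n_modes = 8
coeffs = np.fft.rfft(roughness)
coeffs[n_modes:] = 0.0
smooth = np.fft.irfft(coeffs, n)

print("max abs deviation from samples:",
      float(np.max(np.abs(smooth - roughness))))
```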

  1. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    Science.gov (United States)

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  2. Network meta-analysis using R: a review of currently available automated packages.

    Directory of Open Access Journals (Sweden)

    Binod Neupane

    Full Text Available Network meta-analysis (NMA), a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously, has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined they provide users with nearly all functionality that might be desired when conducting a NMA.

  3. Safety analysis code input automation using the Nuclear Plant Data Bank

    International Nuclear Information System (INIS)

    Kopp, H.; Leung, J.; Tajbakhsh, A.; Viles, F.

    1985-01-01

    The Nuclear Plant Data Bank (NPDB) is a computer-based system that organizes a nuclear power plant's technical data, providing mechanisms for data storage, retrieval, and computer-aided engineering analysis. Its specific objective is to describe thermohydraulic systems in order to support rapid information retrieval and display as well as thermohydraulic analysis modeling. The NPDB system fully automates the storage and analysis based on these data. The system combines the benefits of a structured data base system and computer-aided modeling with links to large-scale codes for engineering analysis. Emphasis on a friendly and very graphically oriented user interface facilitates both initial use and longer-term efficiency. Specific features are: organization and storage of thermohydraulic data items, ease in locating specific data items, graphical and tabular display capabilities, interactive model construction, organization and display of model input parameters, input deck construction for TRAC and RELAP analysis programs, and traceability of plant data, user model assumptions, and codes used in the input deck construction process. The major accomplishments of this past year were the development of a RELAP model generation capability and the development of a CRAY version of the code.

  4. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    Science.gov (United States)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analysis are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled in a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid over-design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.

  5. Neutron activation analysis of high-purity iron in comparison with chemical analysis

    International Nuclear Information System (INIS)

    Kinomura, Atsushi; Horino, Yuji; Takaki, Seiichi; Abiko, Kenji

    2000-01-01

    Neutron activation analysis of iron samples of three different purity levels has been performed and compared with chemical analysis for 30 metallic and metalloid impurity elements. The concentration of As, Cl, Cu, Sb and V detected by neutron activation analysis was mostly in agreement with that obtained by chemical analysis. The sensitivity limits of neutron activation analysis of three kinds of iron samples were calculated and found to be reasonable compared with measured values or detection limits of chemical analysis; however, most of them were above the detection limits of chemical analysis. Graphite-shielded irradiation to suppress fast neutron reactions was effective for Mn analysis without decreasing sensitivity to the other impurity elements. (author)

  6. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    Science.gov (United States)

    Eide, Ingvar; Westad, Frank

    2018-01-01

    A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory located at 258 m depth 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time-trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three different depth intervals (5-50, 50-120, 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 from the two current speed sensors. The time-resolution varied, thus the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The data on current speed from two sensors were subject to two separate PCA models and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected on the calibration model consecutively one at a time and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relatively short time period used in the pilot study. Methods for statistical validation, and warning and alarm limits are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.
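
    The calibrate-then-project workflow can be sketched with scikit-learn: fit PCA on the calibration window, project incoming observations, and flag scores outside simple limits. The data, component count, and three-sigma limit below are invented, and the multi-block step is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
calibration = rng.normal(0.0, 1.0, (500, 10))   # e.g. May-July sensor matrix
new_obs = rng.normal(0.0, 1.0, (50, 10))        # e.g. August observations
new_obs[-5:] += 4.0                             # simulated regime change

# Fit the calibration model, then project new observations onto it.
pca = PCA(n_components=2).fit(calibration)
scores = pca.transform(new_obs)

# A simple warning limit: flag scores far outside the calibration cloud.
cal_scores = pca.transform(calibration)
limit = 3.0 * cal_scores.std(axis=0)
alarms = np.any(np.abs(scores) > limit, axis=1)
print("flagged observations:", np.flatnonzero(alarms))
```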

  7. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    Directory of Open Access Journals (Sweden)

    Ingvar Eide

    Full Text Available A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory located at 258 m depth 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time-trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three different depth intervals (5-50, 50-120, 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 from the two current speed sensors. The time-resolution varied, thus the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The data on current speed from two sensors were subject to two separate PCA models and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected on the calibration model consecutively one at a time and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relatively short time period used in the pilot study. Methods for statistical validation, and warning and alarm limits are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.

  8. Automated analysis of information processing, kinetic independence and modular architecture in biochemical networks using MIDIA.

    Science.gov (United States)

    Bowsher, Clive G

    2011-02-15

    Understanding the encoding and propagation of information by biochemical reaction networks and the relationship of such information processing properties to modular network structure is of fundamental importance in the study of cell signalling and regulation. However, a rigorous, automated approach for general biochemical networks has not been available, and high-throughput analysis has therefore been out of reach. Modularization Identification by Dynamic Independence Algorithms (MIDIA) is a user-friendly, extensible R package that performs automated analysis of how information is processed by biochemical networks. An important component is the algorithm's ability to identify exact network decompositions based on both the mass action kinetics and informational properties of the network. These modularizations are visualized using a tree structure from which important dynamic conditional independence properties can be directly read. Only partial stoichiometric information needs to be used as input to MIDIA, and neither simulations nor knowledge of rate parameters are required. When applied to a signalling network, for example, the method identifies the routes and species involved in the sequential propagation of information between its multiple inputs and outputs. These routes correspond to the relevant paths in the tree structure and may be further visualized using the Input-Output Path Matrix tool. MIDIA remains computationally feasible for the largest network reconstructions currently available and is straightforward to use with models written in Systems Biology Markup Language (SBML). The package is distributed under the GNU General Public License and is available, together with a link to browsable Supplementary Material, at http://code.google.com/p/midia. Further information is at www.maths.bris.ac.uk/~macgb/Software.html.

  9. Application of sodium carbonate prevents sulphur poisoning of catalysts in automated total mercury analysis

    Science.gov (United States)

    McLagan, David S.; Huang, Haiyong; Lei, Ying D.; Wania, Frank; Mitchell, Carl P. J.

    2017-07-01

    Analysis of high sulphur-containing samples for total mercury content using automated thermal decomposition, amalgamation, and atomic absorption spectroscopy instruments (USEPA Method 7473) leads to rapid and costly SO2 poisoning of catalysts. In an effort to overcome this issue, we tested whether the addition of powdered sodium carbonate (Na2CO3) to the catalyst and/or directly on top of the sample material increases the throughput of sulphur-impregnated (8-15 wt%) activated carbon samples per catalyst tube. Adding 5 g of Na2CO3 to the catalyst alone only marginally increases the functional lifetime of the catalyst (31 ± 4 g of activated carbon analyzed per catalyst tube) relative to the unaltered catalyst of the AMA254 total mercury analyzer (17 ± 4 g of activated carbon). Adding ≈ 0.2 g of Na2CO3 to the samples substantially increases catalyst life (81 ± 17 g of activated carbon) over the unaltered catalyst. The greatest improvement is achieved by adding Na2CO3 to both catalyst and samples (200 ± 70 g of activated carbon), which significantly increases catalyst performance over all other treatments and enables an order of magnitude greater sample throughput than the unaltered samples and catalyst. It is likely that Na2CO3 efficiently sequesters SO2 even at high furnace temperatures, producing Na2SO4 and CO2 and largely negating the poisonous impact of SO2 on the catalyst material. Increased corrosion of nickel sampling boats resulting from this methodological variation is easily resolved by substituting quartz boats. Overall, this variation enables an efficient and significantly more affordable means of employing automated atomic absorption spectrometry instruments for total mercury analysis of high-sulphur matrices.
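
    A stoichiometry consistent with the products named above can be written as direct sulfite formation followed by oxidation in the oxygen-rich furnace; the two-step route is an assumption, since the abstract reports only the overall products Na2SO4 and CO2.

```latex
\mathrm{Na_2CO_3 + SO_2 \longrightarrow Na_2SO_3 + CO_2}
\qquad
\mathrm{2\,Na_2SO_3 + O_2 \longrightarrow 2\,Na_2SO_4}
```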

  10. SU-E-T-139: Automated Daily EPID Exit Dose Analysis Uncovers Treatment Variations

    Energy Technology Data Exchange (ETDEWEB)

    Olch, A [University of Southern California, Los Angeles, CA (United States)

    2015-06-15

    Purpose: To evaluate a fully automated EPID exit dose system for its ability to detect daily treatment deviations including patient setup, delivery, and anatomy changes. Methods: PerFRACTION (Sun Nuclear Corporation) is a software system that uses integrated EPID images taken during patient treatment, automatically pulled from the Aria database and analyzed based on user-defined comparisons. It was used to monitor 20 plans consisting of a total of 859 fields for 18 patients, for a total of 251 fractions. Nine VMAT, 5 IMRT, and 6 3D plans were monitored. The Gamma analysis was performed for each field within a plan, comparing the first fraction against each of the other fractions in each treatment course. A 2% dose difference, 1 mm distance-to-agreement, and 10% dose threshold were used. These tight tolerances were chosen to achieve a high sensitivity to treatment variations. A field passed if 93% of the pixels had a Gamma of 1 or less. Results: Twenty-nine percent of the fields failed. The average plan passing rate was 92.5%. The average 3D plan passing rate was lower than for VMAT or IMRT: 84% vs. an average of 96.2%. When fields failed, investigation revealed changes in patient anatomy or setup variations, often also leading to variations of transmission through immobilization devices. Conclusion: PerFRACTION is a fully automated system for determining daily changes in dose transmission through the patient that requires no effort other than deploying the imager panel during treatment. A surprising number of fields failed the analysis, attributable to important treatment variations that would otherwise not be appreciated. Further study of inter-fraction treatment variations is possible and warranted. Sun Nuclear Corporation provided a license to the software described.
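
    The per-field comparison uses the gamma index. Below is a minimal one-dimensional global-gamma sketch under the 2%/1 mm criteria quoted above; real EPID analysis is two-dimensional and vendor-implemented, and the profiles here are synthetic.

```python
import numpy as np

def gamma_1d(ref, evl, dx, dose_tol=0.02, dist_tol=1.0):
    """Global 1-D gamma index per reference point (sketch, not clinical).

    dose_tol is a fraction of the reference maximum; dist_tol, dx in mm.
    """
    norm = dose_tol * ref.max()
    x = np.arange(ref.size) * dx
    gammas = np.empty(ref.size)
    for i, (xi, di) in enumerate(zip(x, ref)):
        dose_term = (evl - di) / norm
        dist_term = (x - xi) / dist_tol
        gammas[i] = np.sqrt(dist_term ** 2 + dose_term ** 2).min()
    return gammas

ref = np.exp(-((np.arange(100) - 50.0) ** 2) / 200.0)   # reference profile
shifted = np.roll(ref, 1)                               # evaluated profile
g = gamma_1d(ref, shifted, dx=0.5)
print("pass rate (gamma <= 1):",
      round(100.0 * np.mean(g <= 1.0), 1), "%")
```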

  11. Automated detection of regions of interest for tissue microarray experiments: an image texture analysis

    International Nuclear Information System (INIS)

    Karaçali, Bilge; Tözeren, Aydin

    2007-01-01

    Recent research with tissue microarrays has led to rapid progress toward quantifying the expression of large sets of biomarkers in normal and diseased tissue. However, standard procedures for sampling tissue for molecular profiling have not yet been established. This study presents a high-throughput analysis of texture heterogeneity in breast tissue images for the purpose of identifying regions of interest in the tissue for molecular profiling via tissue microarray technology. The image texture of breast histology slides was described in terms of three parameters: the percentage of area occupied in an image block by chromatin (B), the percentage occupied by stroma-like regions (P), and a statistical heterogeneity index H commonly used in image analysis. Texture parameters were defined and computed for each of the thousands of image blocks in our dataset using both gray-scale and color segmentation. The image blocks were then classified into three categories using the texture feature parameters in a novel statistical learning algorithm: image blocks specific to normal breast tissue, blocks specific to cancerous tissue, and blocks non-specific to normal and disease states. Gray-scale and color segmentation techniques identified the same regions in the histology slides as cancer-specific. Moreover, the image blocks identified as cancer-specific belonged to the cell-crowded regions in whole-section image slides that were marked by two pathologists as regions of interest for further histological studies. These results indicate the high efficiency of our automated method for identifying pathologic regions of interest on histology slides. Automation of critical region identification will help minimize the inter-rater variability among different raters (pathologists), as the hundreds of tumors used to develop an array have typically been evaluated (graded) by different pathologists. The region of interest
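
    As a concrete illustration of the three texture parameters, here is a minimal numpy sketch that computes B, P, and H for one gray-scale image block; the intensity thresholds and the use of gray-level Shannon entropy for the heterogeneity index H are assumptions made for illustration, as the abstract does not define them exactly.

        import numpy as np

        def block_texture(block, chromatin_max=80, stroma_min=180):
            """block: 2-D uint8 gray-scale image block -> (B, P, H)."""
            n = block.size
            b = np.count_nonzero(block <= chromatin_max) / n   # dark, chromatin-like area
            p = np.count_nonzero(block >= stroma_min) / n      # bright, stroma-like area
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            probs = hist[hist > 0] / n
            h = -np.sum(probs * np.log2(probs))                # Shannon entropy as H
            return b, p, h

        # Blocks would then be classified (normal-specific, cancer-specific,
        # non-specific) from their (B, P, H) feature vectors.
        rng = np.random.default_rng(0)
        print(block_texture(rng.integers(0, 256, (64, 64), dtype=np.uint8)))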

  12. Widely applicable MATLAB routines for automated analysis of saccadic reaction times.

    Science.gov (United States)

    Leppänen, Jukka M; Forssman, Linda; Kaatiala, Jussi; Yrttiaho, Santeri; Wass, Sam

    2015-06-01

    Saccadic reaction time (SRT) is a widely used dependent variable in eye-tracking studies of human cognition and its disorders. SRTs are also frequently measured in studies with special populations, such as infants and young children, who are limited in their ability to follow verbal instructions and remain in a stable position over time. In this article, we describe a library of MATLAB routines (MathWorks, Natick, MA) that are designed to (1) enable completely automated implementation of SRT analysis for multiple data sets and (2) cope with the unique challenges of analyzing SRTs from eye-tracking data collected from poorly cooperating participants. The library includes preprocessing and SRT analysis routines. The preprocessing routines (i.e., a moving median filter and interpolation) are designed to remove technical artifacts and missing samples from raw eye-tracking data. SRTs are detected by a simple algorithm that identifies the last point of gaze in the area of interest; critically, the extracted SRTs are further subjected to a number of post-analysis verification checks to exclude values contaminated by artifacts. Example analyses of data from 5- to 11-month-old infants demonstrated that SRTs extracted with the proposed routines were in high agreement with SRTs obtained manually from video records, robust against potential sources of artifact, and exhibited moderate to high test-retest stability. We propose that the present library has wide utility in standardizing and automating SRT-based cognitive testing in various populations. The MATLAB routines are open source and can be downloaded from http://www.uta.fi/med/icl/methods.html.
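
    A minimal Python rendering of the described pipeline (the published library itself is MATLAB): a moving median filter for raw gaze traces and SRT extraction as the time of the last gaze sample inside the area of interest (AOI). The window size, AOI bounds, validity check, and example trace are illustrative assumptions, not the library's defaults.

        import numpy as np

        def moving_median(x, window=7):
            """Moving median filter; NaNs (missing samples) are ignored per window."""
            half = window // 2
            padded = np.pad(np.asarray(x, dtype=float), half, mode="edge")
            return np.array([np.nanmedian(padded[i:i + window]) for i in range(len(x))])

        def extract_srt(gaze_x, t_ms, aoi=(-1.0, 1.0), min_valid=0.5):
            # One example of a post-analysis verification check: reject trials
            # with too many missing samples.
            if np.mean(~np.isnan(gaze_x)) < min_valid:
                return np.nan
            x = moving_median(gaze_x)
            in_aoi = (x >= aoi[0]) & (x <= aoi[1])        # NaN compares as False
            if not in_aoi[0] or in_aoi.all():
                return np.nan       # gaze did not start in, or never left, the AOI
            first_out = int(np.argmax(~in_aoi))           # first sample outside the AOI
            return t_ms[first_out - 1]                    # last sample inside the AOI

        t = np.arange(0, 1000, 10.0)                      # 100 Hz time base, in ms
        gaze = np.where(t < 400, 0.0, 5.0)                # gaze leaves the AOI at 400 ms
        print(extract_srt(gaze, t))                       # -> 390.0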

  13. Activation and chemical analysis of drinking water from shallow aquifers

    International Nuclear Information System (INIS)

    Sharma, H.K.; Mittal, V.K.; Sahota, H.S.

    1991-01-01

    In most Indian cities, drinking water is drawn from shallow aquifers with the help of hand pumps, and these shallow aquifers are easily polluted. In the present work we measured 20 trace elements, using neutron activation analysis (NAA), and 8 chemical parameters, using standard chemical methods, in drinking water drawn from Rajpura city. It was found that almost all water samples are highly polluted. We attribute this to the unplanned disposal of industrial and domestic waste over a period of many decades. (author) 11 refs.; 1 fig.; 1 tab

  14. Activation analysis. A basis for chemical similarity and classification

    Energy Technology Data Exchange (ETDEWEB)

    Beeck, J OP de [Ghent Rijksuniversiteit (Belgium). Instituut voor Kernwetenschappen

    1977-01-01

    It is shown that activation analysis is especially suited to serve as a basis for determining the chemical similarity between samples defined by their trace-element concentration patterns. The general problem of classification and identification is discussed. The nature of possible classification structures and their appropriate clustering strategies is considered. A practical computer method is suggested, and its application, as well as the graphical representation of classification results, is given. The possibility of classification using information theory is mentioned. Classification of the chemical elements themselves is discussed and practically realized after Hadamard transformation of the concentration variation patterns in a series of samples.
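
    A minimal sketch of grouping samples by the similarity of their trace-element concentration patterns, in the spirit of the clustering discussed above. The hierarchical average-linkage strategy, the correlation distance, and the synthetic element "fingerprints" are illustrative assumptions; the paper's actual clustering method is not reproduced here.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(1)
        # Three hypothetical trace-element "fingerprints" over 8 elements;
        # five noisy samples are drawn from each.
        base = np.array([
            [1, 0, 0, 1, 0, 1, 0, 0],
            [0, 1, 0, 0, 1, 0, 1, 0],
            [0, 0, 1, 0, 0, 1, 0, 1],
        ], dtype=float)
        samples = np.vstack([b + rng.normal(0.0, 0.05, size=(5, 8)) for b in base])

        dist = pdist(samples, metric="correlation")   # pattern-shape similarity
        tree = linkage(dist, method="average")
        labels = fcluster(tree, t=3, criterion="maxclust")
        print(labels)                                 # three groups of similar samples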

  15. Chemical analysis of plasma-assisted antimicrobial treatment on cotton

    International Nuclear Information System (INIS)

    Kan, C W; Lam, Y L; Yuen, C W M; Luximon, A; Lau, K W; Chen, K S

    2013-01-01

    This paper explores the use of plasma treatment as a pretreatment process to assist the application of an antimicrobial finish on cotton fabric with good functional effect. An antimicrobial finishing agent, Microfresh Liquid Formulation 9200-200 (MF), and a binder (a polyurethane dispersion, Microban Liquid Formulation R10800-0, MB) will be used to treat the cotton fabric and improve its antimicrobial properties, while pre-treatment of the cotton fabric by plasma under atmospheric pressure will be employed to improve the loading of the chemical agents. Chemical analysis of the treated cotton fabric will be conducted by Fourier transform infrared spectroscopy.

  16. Analysis of an Automated Vehicle Routing Problem in Logistics considering Path Interruption

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-01-01

    The application of automated vehicles in logistics can efficiently reduce the cost of logistics and reduce the potential risks of the last mile. Considering the path restrictions in the initial stage of the application of automated vehicles in logistics, the conventional model for the vehicle routing problem (VRP) is modified, and an automated vehicle routing problem with time windows (AVRPTW) model considering path interruption is established. Additionally, an improved particle swarm optimisation (PSO) algorithm is designed to solve this problem. Finally, a case study is undertaken to test the validity of the model and the algorithm: four automated vehicles are designated to execute all delivery tasks required by 25 stores, and the capacities of all of the automated vehicles are almost fully utilised. Developing such research into the real problems arising in this initial period is of considerable significance for the promotion of automated vehicles in last-mile logistics.
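
    A minimal sketch of the PSO mechanics behind such a model, applied to a toy routing objective through a random-key encoding (sorting each particle's continuous keys yields a visiting order). The depot location, store coordinates, and PSO coefficients are illustrative assumptions; the paper's improved PSO, time windows, and path-interruption constraints are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(2)
        stores = rng.random((25, 2))                  # hypothetical store coordinates

        def route_length(keys):
            order = np.argsort(keys)                  # random-key decoding
            pts = np.vstack([[0.5, 0.5], stores[order], [0.5, 0.5]])  # depot at centre
            return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

        n_particles, n_dims, iters = 30, len(stores), 200
        x = rng.random((n_particles, n_dims))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_cost = np.array([route_length(p) for p in x])
        g = pbest[pbest_cost.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, n_dims))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)  # PSO update
            x = x + v
            cost = np.array([route_length(p) for p in x])
            improved = cost < pbest_cost
            pbest[improved] = x[improved]
            pbest_cost[improved] = cost[improved]
            g = pbest[pbest_cost.argmin()].copy()

        print("best route length:", pbest_cost.min())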

  17. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  18. Positron annihilation spectroscopy for chemical analysis (PASCA). Chapter 9

    International Nuclear Information System (INIS)

    Cheng, K.L.; Jean, Y.C.

    1988-01-01

    This chapter gives an up-to-date overview of positron annihilation spectroscopy for chemical analysis (PASCA). As an in situ technique, PASCA is especially suitable for studying processes occurring at surfaces, and its in situ characteristics are treated. The principles of positron annihilation lifetime spectroscopy (PAL) are discussed, along with important analytical applications: determining total surface areas and cavity volumes in chemical reactions, studying chemisorption and catalytic reactions on porous surfaces, analysing bulk materials, determining molecular association constants in biological systems, and applications in proton and neutron activation analysis, thin-layer chromatography, and tracer technology. 28 refs.; 15 figs.; 8 tabs

  19. The performance of fully automated urine analysis results for predicting the need of urine culture test

    Directory of Open Access Journals (Sweden)

    Hatice Yüksel

    2014-06-01

    Objectives: Urinalysis and urine culture are the most common tests for the diagnosis of urinary tract infections. The aim of our study was to examine the diagnostic performance of urine analysis and its role in determining the need for urine culture. Methods: Urine culture and urine analysis results of 362 patients were retrospectively analyzed. Culture results were taken as the reference for the chemical and microscopic examination of urine, and the diagnostic accuracy of test parameters that may mark urinary tract infection, together with the performance of urine analysis in predicting the need for urine culture, was calculated. Results: A total of 362 urine culture results were evaluated, and 67% of them were negative. The results for leukocyte esterase and nitrite in chemical analysis, and for leukocytes and bacteria in microscopic analysis, were normal in 50.4% of culture-negative urines. In the diagnostic accuracy calculations, leukocyte esterase (86.1%) and microscopic leukocytes (88.0%) showed high sensitivity, while nitrite (95.4%) and bacteria (86.6%) showed high specificity. The area under the curve was 0.852 in the ROC analysis of microscopic leukocytes. Conclusion: Fully automated urine analyzers can provide sufficient diagnostic accuracy for urine analysis. Evaluating urine analysis results effectively can predict the necessity of urine culture requests and may contribute to a reduction in workload and cost. J Clin Exp Invest 2014; 5(2): 286-289
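
    A minimal sketch of the diagnostic-accuracy calculations reported above (sensitivity, specificity, and ROC AUC for a urinalysis parameter, with culture as the reference). The data arrays and the decision cut-off are hypothetical placeholders, not the study's data.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        culture = np.array([1, 1, 0, 0, 1, 0, 0, 1])        # reference: culture positive?
        leukocytes = np.array([40, 15, 2, 8, 60, 1, 3, 25])  # microscopy count per field

        positive = leukocytes >= 10                          # illustrative cut-off
        tp = np.sum(positive & (culture == 1))
        tn = np.sum(~positive & (culture == 0))
        fp = np.sum(positive & (culture == 0))
        fn = np.sum(~positive & (culture == 1))

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        auc = roc_auc_score(culture, leukocytes)             # area under the ROC curve
        print(sensitivity, specificity, auc)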

  20. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Bennett, P.C.; Stringer, J.B.

    1994-01-01

    Robotic automation is examined as a possible alternative to manual handling of spent nuclear fuel, transport casks, and Multi-Purpose Canisters (MPC) at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times, and equipment costs. The economic and radiation dose impacts of this automation are compared with those of manual handling methods.
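
    A minimal sketch of the kind of throughput/cost/dose trade-off such an analysis involves, comparing manual and robotic handling under a simple parametric model. Every function name and number below is a hypothetical placeholder, not data or methodology from the study.

        # Hypothetical parametric comparison of manual vs. robotic cask handling.
        def scenario(casks_per_year, hours_per_cask, capital, labor_rate,
                     crew_size, crew_dose_per_hour):
            hours = casks_per_year * hours_per_cask
            cost = capital + hours * labor_rate * crew_size   # simplified annual cost
            dose = hours * crew_dose_per_hour * crew_size     # collective worker dose
            return cost, dose

        # Placeholder inputs: cost in dollars, dose rate in mSv/h.
        manual = scenario(200, 6.0, capital=0.5e6, labor_rate=80,
                          crew_size=4, crew_dose_per_hour=0.02)
        robotic = scenario(200, 4.0, capital=5.0e6, labor_rate=80,
                           crew_size=1, crew_dose_per_hour=0.002)
        print("manual  (cost, dose):", manual)
        print("robotic (cost, dose):", robotic)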