WorldWideScience

Sample records for metrology software tool

  1. 5th Conference on Advanced Mathematical and Computational Tools in Metrology

    CERN Document Server

    Cox, M G; Filipe, E; Pavese, F; Richter, D

    2001-01-01

    Advances in metrology depend on improvements in scientific and technical knowledge and in instrumentation quality, as well as on better use of advanced mathematical tools and development of new ones. In this volume, scientists from both the mathematical and the metrological fields exchange their experiences. Industrial sectors, such as instrumentation and software, will benefit from this exchange, since metrology has a high impact on the overall quality of industrial products, and applied mathematics is becoming more and more important in industrial processes. This book is of interest to people

  2. World wide matching of registration metrology tools of various generations

    Science.gov (United States)

    Laske, F.; Pudnos, A.; Mackey, L.; Tran, P.; Higuchi, M.; Enkrich, C.; Roeth, K.-D.; Schmidt, K.-H.; Adam, D.; Bender, J.

    2008-10-01

    Turnaround time/cycle time is a key success criterion in the semiconductor photomask business. Global mask suppliers therefore typically allocate workloads based on fab capability and utilization capacity. From a logistical point of view, the manufacturing location of a photomask should be transparent to the customer (mask user). Matching capability of production equipment, and especially of metrology tools, is considered a key enabler of cross-site manufacturing flexibility. Toppan, with manufacturing sites in eight countries worldwide, has an ongoing program to match the registration metrology systems of all its production sites. This allows for manufacturing flexibility and risk mitigation. In cooperation with Vistec Semiconductor Systems, Toppan has recently completed a program to match the Vistec LMS IPRO systems at all production sites worldwide. Vistec has developed a new software feature which allows for significantly improved matching of LMS IPRO(x) registration metrology tools of various generations. We report on the results of the global matching campaign across several of the leading Toppan sites.
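
    The Vistec matching software itself is proprietary, but the generic shape of a tool-matching computation can be sketched: measure the same set of mask marks on a reference tool and on the tool to be matched, fit a low-order coordinate transform between the two sets of readings, and report the residuals as the matching performance. The sketch below is a minimal illustration of that least-squares step; the affine model, synthetic plate and all numbers are assumptions for illustration, not the LMS IPRO algorithm.

```python
import numpy as np

# Hypothetical sketch: match two registration tools by fitting a linear
# coordinate transform (offset + scale + rotation terms) between their
# readings of the same set of mask marks.

def fit_matching(ref_xy, tool_xy):
    """Least-squares affine fit mapping tool readings onto the reference."""
    n = len(ref_xy)
    A = np.hstack([tool_xy, np.ones((n, 1))])      # columns [x, y, 1]
    cx, *_ = np.linalg.lstsq(A, ref_xy[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, ref_xy[:, 1], rcond=None)
    return cx, cy

def apply_matching(cx, cy, tool_xy):
    A = np.hstack([tool_xy, np.ones((len(tool_xy), 1))])
    return np.column_stack([A @ cx, A @ cy])

# Synthetic example: 1 ppm scale error and a 20 nm x-offset between tools.
rng = np.random.default_rng(0)
ref = rng.uniform(0, 150e6, size=(100, 2))          # nm, 150 mm plate
tool = ref * (1 + 1e-6) + np.array([20.0, 0.0]) + rng.normal(0, 2, (100, 2))

cx, cy = fit_matching(ref, tool)
resid = apply_matching(cx, cy, tool) - ref
print("3-sigma matching residual [nm]:", 3 * resid.std(axis=0))
```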

  3. Analysis of key technologies for virtual instruments metrology

    Science.gov (United States)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs covers two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on the metrological testing of VIs. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, with the support of the powerful computing capability of the PC. Another concern is the evaluation of software features such as the correctness, reliability, stability, security and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed framework.
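
    The abstract's first concern, evaluating the uncertainty contributed by a software algorithm via simulation and statistics, can be illustrated with a Monte Carlo sketch in the spirit of GUM Supplement 1. The "algorithm under test" below is a hypothetical RMS routine and all noise levels are invented; in practice the VI routine being verified would be substituted.

```python
import numpy as np

# Monte Carlo evaluation of the measurement uncertainty contributed by a
# VI processing algorithm: feed many noisy realizations of a known input
# through the algorithm and study the distribution of its outputs.

def vi_algorithm(samples):
    return np.sqrt(np.mean(samples ** 2))   # hypothetical algorithm under test

rng = np.random.default_rng(42)
n_trials, n_samples = 10_000, 1024
true_rms = 1.0 / np.sqrt(2)                 # RMS of a unit-amplitude sine

t = np.linspace(0, 1, n_samples, endpoint=False)
results = np.empty(n_trials)
for k in range(n_trials):
    noisy = np.sin(2 * np.pi * 50 * t) + rng.normal(0, 0.01, n_samples)
    results[k] = vi_algorithm(noisy)

print(f"bias      : {results.mean() - true_rms:.2e}")
print(f"std. unc. : {results.std(ddof=1):.2e}")
print("95% coverage interval:", np.percentile(results, [2.5, 97.5]))
```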

  4. Machine tool metrology an industrial handbook

    CERN Document Server

    Smith, Graham T

    2016-01-01

    Maximizing reader insights into the key scientific disciplines of Machine Tool Metrology, this text will prove useful for the industrial-practitioner and those interested in the operation of machine tools. Within this current level of industrial-content, this book incorporates significant usage of the existing published literature and valid information obtained from a wide-spectrum of manufacturers of plant, equipment and instrumentation before putting forward novel ideas and methodologies. Providing easy to understand bullet points and lucid descriptions of metrological and calibration subjects, this book aids reader understanding of the topics discussed whilst adding a voluminous-amount of footnotes utilised throughout all of the chapters, which adds some additional detail to the subject. Featuring an extensive amount of photographic-support, this book will serve as a key reference text for all those involved in the field. .

  5. In-cell overlay metrology by using optical metrology tool

    Science.gov (United States)

    Lee, Honggoo; Han, Sangjun; Hong, Minhyung; Kim, Seungyoung; Lee, Jieun; Lee, DongYoung; Oh, Eungryong; Choi, Ahlin; Park, Hyowon; Liang, Waley; Choi, DongSub; Kim, Nakyoon; Lee, Jeongpyo; Pandev, Stilian; Jeon, Sanghuck; Robinson, John C.

    2018-03-01

    Overlay is one of the most critical process control steps in semiconductor manufacturing technology. A typical advanced scheme includes an overlay feedback loop based on after-litho optical imaging overlay metrology on scribeline targets. The after-litho control loop typically involves high-frequency sampling: every lot or nearly every lot. An after-etch overlay metrology step is often included, at a lower sampling frequency, in order to characterize and compensate for bias. The after-etch metrology step often involves CD-SEM metrology, in this case in-cell and on-device. This work explores an alternative approach using spectroscopic ellipsometry (SE) metrology and a machine learning analysis technique. Advanced 1x nm DRAM wafers were prepared, including nominal (POR) wafers with mean overlay offsets as well as DOE wafers with intentional across-wafer overlay modulation. After-litho metrology was performed using optical imaging metrology, and after-etch metrology using both SE and CD-SEM for comparison. We investigate two types of machine learning techniques with SE data, model-less and model-based, showing excellent performance for after-etch in-cell on-device overlay metrology.
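
    The "model-less" idea, learning a direct regression from measured SE spectra to overlay using reference metrology as training labels, can be sketched with a simple inverse-variance-regularized linear model. The data below are synthetic and the actual analysis behind the paper is proprietary and far more sophisticated; ridge regression merely stands in for the general approach.

```python
import numpy as np

# Train a direct regression from SE spectra to overlay, using reference
# overlay values (e.g. from CD-SEM) as labels; evaluate on held-out sites.

rng = np.random.default_rng(1)
n_sites, n_wavelengths = 200, 150

overlay_ref = rng.uniform(-5, 5, n_sites)              # nm, reference overlay
basis = rng.normal(size=n_wavelengths)                 # spectral sensitivity
spectra = (np.outer(overlay_ref, basis)                # overlay-dependent part
           + rng.normal(0, 0.05, (n_sites, n_wavelengths)))  # measurement noise

# Ridge regression: w = (X^T X + alpha*I)^-1 X^T y
train, test = slice(0, 150), slice(150, None)
X, y = spectra[train], overlay_ref[train]
alpha = 1e-3
w = np.linalg.solve(X.T @ X + alpha * np.eye(n_wavelengths), X.T @ y)

pred = spectra[test] @ w
rmse = np.sqrt(np.mean((pred - overlay_ref[test]) ** 2))
print(f"test RMSE vs reference: {rmse:.3f} nm")
```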

  6. The coming of age of the first hybrid metrology software platform dedicated to nanotechnologies (Conference Presentation)

    Science.gov (United States)

    Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian

    2017-03-01

    The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of the application. Among these properties are physical properties such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure) and solubility, and chemical properties such as structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential), and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization instruments must be used in synergy for accurate process tuning and high production yield. This synergy between instruments, so-called hybrid metrology, consists of using the strengths of each technique in order to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use a data fusion methodology. In this paper, we introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part is dedicated to process flow modeling related to a fleet of metrology tools. The second part introduces the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic or semi-automated analysis). The final part introduces two ways of doing data fusion on real data coming from imaging (SEM, TEM, AFM
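
    The abstract is truncated, but the core motivation it names, reducing global uncertainty by fusing complementary tools, can be illustrated in its simplest scalar form: inverse-variance weighting of two measurements of the same parameter. This is a minimal sketch with invented numbers, not the platform's actual fusion engine, which operates on full models rather than single scalars.

```python
import numpy as np

# Simplest data fusion: combine two tools' measurements of one parameter,
# weighted by inverse variance. The fused uncertainty is smaller than
# either input, which is the motivation for hybrid metrology.

def fuse(values, uncertainties):
    w = 1.0 / np.asarray(uncertainties) ** 2
    x = np.sum(w * np.asarray(values)) / np.sum(w)
    u = 1.0 / np.sqrt(np.sum(w))
    return x, u

# e.g. a CD measured by CD-SEM and by scatterometry (invented numbers):
x, u = fuse([24.8, 25.3], [0.4, 0.3])
print(f"fused CD: {x:.2f} +/- {u:.2f} nm")
```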

  7. Development of a free software for laboratory of metrology

    International Nuclear Information System (INIS)

    Silveira, Renata R. da; Benevides, Clayton A.

    2014-01-01

    The Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE) has a metrology laboratory that performs radioactive assays and calibrations in X and gamma radiation. This work, previously carried out manually, relied on paper records alone and made data retrieval laborious. The objective of this project was to develop an application using free software to manage the laboratory's activities, such as service recording, traceability control and environmental-conditions monitoring, as well as to automate the generation of certificates and reports. As a result, we have optimized the routine and the management of the laboratory. (author)

  8. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    This paper presents a MATLAB-based user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software is capable of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four input variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used for determining the optimal sizing of three types of PV systems: standalone PV, hybrid PV/wind, and hybrid PV/diesel generator. The loss of load probability (LLP) technique is used for optimization, in which the energy source capacities are the variables to be optimized subject to a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used in optimizing the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
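
    The LLP-based sizing loop the abstract describes can be sketched as follows: simulate the hourly energy balance of a standalone PV/battery system over a year and search for the smallest source capacities whose LLP stays below a target. This is a simplified illustration with an invented synthetic climate; PV.MY's actual models (ANN forecasts, Liu-Jordan tilt optimization, hybrid sources) are not reproduced.

```python
import numpy as np

# Loss-of-load probability: fraction of demand left unmet after PV
# generation and battery storage are exhausted, over a simulated year.

def llp(pv_kw, batt_kwh, sun_kwh_per_kw, load_kwh, eff=0.9):
    soc, unmet = batt_kwh, 0.0
    for sun, load in zip(sun_kwh_per_kw, load_kwh):
        soc += pv_kw * sun * eff - load         # hourly surplus or deficit
        if soc < 0.0:
            unmet += -soc                       # deficit left after battery
            soc = 0.0
        soc = min(soc, batt_kwh)
    return unmet / load_kwh.sum()

rng = np.random.default_rng(7)
hours = 8760
sun = 0.8 * np.clip(np.sin(np.linspace(0, 365 * 2 * np.pi, hours)), 0, None)
load = 1.0 + 0.3 * rng.random(hours)            # kWh consumed per hour

target = 0.01                                   # allow 1% loss of load
for pv in np.arange(1.0, 20.0, 0.5):            # kW
    for batt in np.arange(5.0, 100.0, 5.0):     # kWh
        if llp(pv, batt, sun, load) <= target:
            print(f"smallest sizing found: PV = {pv} kW, battery = {batt} kWh")
            break
    else:
        continue
    break
```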

  9. Realizing "value-added" metrology

    Science.gov (United States)

    Bunday, Benjamin; Lipscomb, Pete; Allgair, John; Patel, Dilip; Caldwell, Mark; Solecky, Eric; Archie, Chas; Morningstar, Jennifer; Rice, Bryan J.; Singh, Bhanwar; Cain, Jason; Emami, Iraj; Banke, Bill, Jr.; Herrera, Alfredo; Ukraintsev, Vladamir; Schlessinger, Jerry; Ritchison, Jeff

    2007-03-01

    The conventional premise that metrology is a "non-value-added necessary evil" is a misleading and dangerous assertion, which must be viewed as obsolete thinking. Many metrology applications are key enablers of traditionally labeled "value-added" processing steps in lithography and etch, such that they can be considered integral parts of the processes. Various key trends in modern, state-of-the-art processing, such as optical proximity correction (OPC), design for manufacturability (DFM), and advanced process control (APC), are based, at their hearts, on the assumption of fine-tuned metrology, in terms of uncertainty and accuracy. These trends give metrology large opportunities to create value through the engineering of tight and targetable process distributions. Such distributions make possible predictability in speed-sorts and in other parameters, which results in high-end product. Additionally, significant reliance has also been placed on defect metrology to predict, improve, and reduce yield variability. The necessary metrology quality is strongly influenced not only by the choice of equipment, but also by the quality application of these tools in a production environment. The ultimate value added by metrology is the result of quality tools run by a quality metrology team using quality practices. This paper will explore the relationships among present and future trends and challenges in metrology, including equipment, key applications, and metrology deployment in the manufacturing flow. Of key importance are metrology personnel, with their expertise, practices, and metrics for achieving and maintaining the required level of metrology performance, including where precision, matching, and accuracy fit into these considerations. The value of metrology will be demonstrated to have shifted to "key enabler of large revenues," debunking the out-of-date premise that metrology is "non-value-added." Examples used will be from critical dimension (CD

  10. DABAM: an open-source database of X-ray mirrors metrology

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez del Rio, Manuel, E-mail: srio@esrf.eu [ESRF - The European Synchrotron, 71 Avenue des Martyrs, 38000 Grenoble (France); Bianchi, Davide [AC2T Research GmbH, Viktor-Kaplan-Strasse 2-C, 2700 Wiener Neustadt (Austria); Cocco, Daniele [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Glass, Mark [ESRF - The European Synchrotron, 71 Avenue des Martyrs, 38000 Grenoble (France); Idir, Mourad [NSLS II, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Metz, Jim [InSync Inc., 2511C Broadbent Parkway, Albuquerque, NM 87107 (United States); Raimondi, Lorenzo; Rebuffi, Luca [Elettra-Sincrotrone Trieste SCpA, Basovizza (TS) (Italy); Reininger, Ruben; Shi, Xianbo [Advanced Photon Source, Argonne National Laboratory, Argonne, IL 60439 (United States); Siewert, Frank [BESSY II, Helmholtz Zentrum Berlin, Institute for Nanometre Optics and Technology, Albert-Einstein-Strasse 15, 12489 Berlin (Germany); Spielmann-Jaeggi, Sibylle [Swiss Light Source at Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Takacs, Peter [Instrumentation Division, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Tomasset, Muriel [Synchrotron Soleil (France); Tonnessen, Tom [InSync Inc., 2511C Broadbent Parkway, Albuquerque, NM 87107 (United States); Vivo, Amparo [ESRF - The European Synchrotron, 71 Avenue des Martyrs, 38000 Grenoble (France); Yashchuk, Valeriy [Advanced Light Source, Lawrence Berkeley National Laboratory, MS 15-R0317, 1 Cyclotron Road, Berkeley, CA 94720-8199 (United States)

    2016-04-20

    DABAM is an open-source database of X-ray mirror metrology intended for use with ray-tracing and wave-propagation codes for simulating the effect of surface errors on the performance of a synchrotron radiation beamline. The database makes available metrology data (mirror height and slope profiles) that can be used with simulation tools for calculating the effects of optical surface errors on the performance of an optical instrument, such as a synchrotron beamline. A typical case is the degradation of the intensity profile at the focal position of a beamline due to mirror surface errors. DABAM aims to provide users of simulation tools with data for real mirrors. The data included in the database are described in this paper, with details of how the mirror parameters are stored. Accompanying software is provided to allow simple access and processing of these data, to calculate the most common statistical parameters, and to create input files for the most widely used simulation codes. Some optics simulations are presented and discussed to illustrate practical use of the profiles from the database.
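
    The kind of statistics the accompanying software computes from a stored profile can be sketched in a few lines. A real run would load a DABAM entry (height or slope profile plus metadata); here a synthetic random-walk figure error on an assumed 0.3 m mirror stands in so the sketch is self-contained, and the focal-blur estimate at the end uses an assumed focal distance.

```python
import numpy as np

# Profile statistics of a mirror figure error: height RMS, slope RMS,
# and a rough slope-error-limited focal blur.

rng = np.random.default_rng(9)
x = np.linspace(-0.15, 0.15, 601)                      # m, mirror abscissa
height = 1e-10 * np.cumsum(rng.normal(size=x.size))   # m, synthetic figure error
height -= np.polyval(np.polyfit(x, height, 1), x)     # remove piston/tilt

slope = np.gradient(height, x)                         # rad

print(f"mirror length    : {x[-1] - x[0]:.2f} m")
print(f"height error RMS : {height.std() * 1e9:.2f} nm")
print(f"slope error RMS  : {slope.std() * 1e6:.3f} urad")

# Common figure of merit: slope errors broaden the focal spot by roughly
# 2 * f * slope_rms at focal distance f (assumed value below).
f = 10.0                                               # m
print(f"slope-limited blur at f = {f} m: {2 * f * slope.std() * 1e6:.2f} um")
```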

  11. Industrial Photogrammetry - Accepted Metrology Tool or Exotic Niche

    Science.gov (United States)

    Bösemann, Werner

    2016-06-01

    New production technologies like 3D printing and other additive manufacturing technologies have changed the industrial manufacturing process, a change often referred to as the next industrial revolution or, in short, Industry 4.0. Such cyber-physical production systems combine the virtual and real worlds through digitization, model building, process simulation and optimization. It is commonly understood that measurement technologies are the key to combining the real and virtual worlds (e.g. [Schmitt 2014]). This change from measurement as a quality control tool to a fully integrated step in the production process has also changed the requirements for 3D metrology solutions. Keywords like MAA (Measurement Assisted Assembly) illustrate the new position of metrology in the industrial production process. At the same time it is obvious that these processes not only require more measurements but also systems that deliver the required information at high density in a short time. Here optical solutions, including photogrammetry for 3D measurements, have big advantages over traditional mechanical CMMs. The paper describes the relevance of different photogrammetric solutions, including the state of the art, industry requirements and application examples.

  13. Alignment of KB mirrors with at-wavelength metrology tool simulated using SRW

    Science.gov (United States)

    Idir, Mourad; Rakitin, Maksim; Gao, Bo; Xue, Junpeng; Huang, Lei; Chubar, Oleg

    2017-08-01

    Synchrotron Radiation Workshop (SRW) is a powerful synchrotron radiation simulation tool that has been widely used at synchrotron facilities all over the world. During the last decade, many types of X-ray wavefront sensors have been developed and used. In this work, we present our recent effort on the development of at-wavelength metrology simulation based on SRW, focused mainly on the Hartmann Wavefront Sensor (HWS). Various conditions have been studied to verify that the simulated HWS performs as expected in terms of accuracy. This at-wavelength metrology simulation tool is then used to align KB mirrors by minimizing the wavefront aberrations. We present our optimization process for performing an 'in situ' alignment using conditions as close as possible to real experiments (KB mirrors with different levels of figure error or different misalignment geometries).
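
    The basic Hartmann reconstruction step the abstract relies on is simple to sketch: spot displacements on the detector, divided by the grid-to-detector distance, give local wavefront slopes, which are integrated to obtain the wavefront. The 1D example below uses invented geometry and a known defocus term as the "measured" data; it is not the SRW simulation itself.

```python
import numpy as np

# Hartmann wavefront sensing in 1D: slopes from spot displacements,
# wavefront from trapezoidal integration of the slopes.

L = 0.2                                     # m, grid-to-detector distance
pitch = 100e-6                              # m, Hartmann hole pitch
x = np.arange(64) * pitch                   # hole positions

# Hypothetical measured spot displacements: pure defocus plus noise.
R = 50.0                                    # m, wavefront radius of curvature
true_slope = x / R
dx = true_slope * L + np.random.default_rng(3).normal(0, 1e-8, x.size)

slope = dx / L                              # rad, recovered local slopes
wavefront = np.concatenate([[0.0],
    np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))])  # trapezoid rule

pv = wavefront.max() - wavefront.min()
print(f"peak-to-valley wavefront: {pv * 1e9:.1f} nm")
```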

  14. Metrology of image placement

    International Nuclear Information System (INIS)

    Starikov, Alexander

    1998-01-01

    The metrology of registration, overlay and alignment offset in microlithography is discussed. Requirements and limitations are traced to the device ground rules and the definitions of edge, linewidth and centerline. Precision, accuracy, system performance and metrology in applications are discussed. The impact of image acquisition and data handling on performance is elucidated. Much attention is given to the manufacturing environment and the effects of processing. General new methods of metrology error diagnostics and technology characterization are introduced and illustrated. Applications of these diagnostics to tests of tool performance, error diagnostics and culling, as well as to process integration in manufacturing, are described. Realistic overlay reference materials and results of accuracy evaluations are discussed. Requirements for primary standards and alternative metrology are explained. The role and capability of SEM-based overlay metrology are described, along with applications to device overlay metrology.

  15. Improvement of the software Bernese for SLR data processing in the Main Metrological Centre of the State Time and Frequency Service

    Science.gov (United States)

    Tsyba, E.; Kaufman, M.

    2015-08-01

    Preparatory work for resuming operational calculation of the Earth rotation parameters from satellite laser ranging data (LAGEOS 1, LAGEOS 2) was to be completed at the Main Metrological Centre of the State Time and Frequency Service (VNIIFTRI) in 2014. For this purpose, BERNESE 5.2 (Dach & Walser, 2014) was chosen as the base software; it has been used for many years at the Main Metrological Centre to process phase observations of GLONASS and GPS satellites. Although SLR data processing is a declared capability of BERNESE 5.2, it has not been fully implemented: in particular, there is no support for such an essential element as a correction (as an input or resulting parameter) in the local time scale ("time bias"). Therefore, additional program blocks have been developed and integrated into the BERNESE 5.2 environment. The program blocks are written in the Perl and Matlab programming languages and can be used under both Windows and Linux, on 32-bit and 64-bit platforms.
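
    The "time bias" idea can be illustrated with a toy estimator: if the station clock is offset by dt, the range residuals correlate with the instantaneous range rate, resid ≈ rdot·dt, so a least-squares fit of residuals against range rate recovers dt. All numbers below are synthetic assumptions; this is not the VNIIFTRI/Bernese implementation.

```python
import numpy as np

# Estimate a station time bias from SLR range residuals via the
# range-rate correlation model resid = range_rate * dt + noise.

rng = np.random.default_rng(5)
n = 400
range_rate = rng.uniform(-4000, 4000, n)          # m/s over the pass
true_dt = 25e-6                                   # s, injected clock offset
resid = range_rate * true_dt + rng.normal(0, 0.01, n)   # m, 1 cm noise

dt_hat = np.sum(range_rate * resid) / np.sum(range_rate ** 2)
sigma = resid.std() / np.sqrt(np.sum(range_rate ** 2))
print(f"estimated time bias: {dt_hat * 1e6:.2f} +/- {sigma * 1e6:.2f} us")
```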

  16. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    International Nuclear Information System (INIS)

    Eichstädt, S; Wilkens, V

    2016-01-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the Guide to the Expression of Uncertainty in Measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in DFT-based analyses and enable metrologists to include uncertainty evaluations in their routine work.
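
    Why closed formulas exist here is easy to see: the DFT is linear in the time-domain signal, so Re(X_k) and Im(X_k) are inner products with fixed cosine/sine vectors, and u² = aᵀUa propagates their uncertainty exactly. The sketch below assumes uncorrelated time-domain uncertainties (diagonal U) and cross-checks against Monte Carlo; it illustrates the principle, not GUM2DFT's actual code, which also handles autocorrelation.

```python
import numpy as np

# GUM-style uncertainty propagation through one DFT bin, diagonal-U case.

N = 256
n = np.arange(N)
x = np.sin(2 * np.pi * 10 * n / N)          # example time-domain signal
ux = np.full(N, 0.02)                       # standard uncertainty per sample

k = 10                                      # frequency bin of interest
c = np.cos(2 * np.pi * k * n / N)           # sensitivity of Re(X_k) to x
s = -np.sin(2 * np.pi * k * n / N)          # sensitivity of Im(X_k) to x

re, im = c @ x, s @ x
u_re = np.sqrt(np.sum((c * ux) ** 2))       # u^2 = a^T U a, U diagonal
u_im = np.sqrt(np.sum((s * ux) ** 2))
print(f"X[{k}] = {re:.2f} {im:+.2f}j, u(Re) = {u_re:.4f}, u(Im) = {u_im:.4f}")

# Monte Carlo cross-check of the closed-form result.
rng = np.random.default_rng(0)
draws = np.fft.fft(x + rng.normal(0, 0.02, (5000, N)), axis=1)[:, k]
print(f"Monte Carlo: u(Re) = {draws.real.std():.4f}, "
      f"u(Im) = {draws.imag.std():.4f}")
```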

  18. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software packages that provide individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises in which they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters, but also the meaning of transformation parameters, rotation matrices, and calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and a direct view of the resulting effect in image or object space.

  19. Coherent double-color interference microscope for traceable optical surface metrology

    Science.gov (United States)

    Malinovski, I.; França, R. S.; Bessa, M. S.; Silva, C. R.; Couceiro, I. B.

    2016-06-01

    Interference microscopy is an important field of dimensional surface metrology because it provides direct traceability of the measurements to the SI base unit definition of the metre. With a typical measurement range from micrometres to nanometres, interference microscopy (IM) covers the gap between classic metrology and nanometrology, providing continuous transfer of dimensional metrology into new areas of nanoscience and nanotechnology. IM is therefore considered an indispensable tool for traceable transfer of the metre unit to different instruments. We report here a metrological study of an absolute Linnik interference microscope based on two frequency-stabilized lasers. The design permits flexible use of both lasers for measurements, depending on the demands of the concrete measurement task. By its principle of operation, the IM is a combination of imaging and phase-shifting interferometry (PSI). Traceability is provided by the wavelength reference, a He-Ne 633 nm stabilized laser. The second laser source, a blue-green 488 nm grating-stabilized laser diode, is used to improve resolution and to resolve integer fringe discontinuities at sharp features of the surface. The IM was optimized for surface height metrology. We have studied the systematic effects of the measurements; this study allowed us to improve the hardware and software of the IM and to find corrections for the main systematic errors. The IM is intended for 1D to 3D height metrology and surface topography in an extended range from nanometres to micrometres. The advantages and disadvantages of the design and the developed methods are discussed.
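
    The fringe-order trick behind the second color can be sketched numerically: phases at 633 nm and 488 nm are each ambiguous modulo half a wavelength (in reflection), but their difference is ambiguous only modulo the much larger synthetic wavelength Λ = λ₁λ₂/|λ₁−λ₂| ≈ 2.13 µm, which resolves the integer fringe jump at a tall step. This is a simplified reflection-geometry illustration with an invented step height, not the instrument's actual software.

```python
import numpy as np

# Two-wavelength fringe-order resolution via the synthetic wavelength.

l1, l2 = 632.8e-9, 488.0e-9
synthetic = l1 * l2 / abs(l1 - l2)
print(f"synthetic wavelength: {synthetic * 1e9:.0f} nm")

true_height = 0.95e-6                        # m, step too tall for one color

def wrapped_phase(h, lam):                   # reflection: path is 2h
    return np.angle(np.exp(1j * 4 * np.pi * h / lam))

phi1 = wrapped_phase(true_height, l1)
phi2 = wrapped_phase(true_height, l2)

# Coarse height from the synthetic wavelength...
h_coarse = np.mod(phi2 - phi1, 2 * np.pi) * synthetic / (4 * np.pi)
# ...then refine with the single-color phase, using the coarse value
# to pick the correct fringe order m.
m = np.round((h_coarse - phi1 * l1 / (4 * np.pi)) / (l1 / 2))
h_fine = m * l1 / 2 + phi1 * l1 / (4 * np.pi)
print(f"coarse: {h_coarse * 1e9:.0f} nm, refined: {h_fine * 1e9:.1f} nm")
```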

  1. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  2. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and key figures from each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  3. La Metrología Óptica y sus Aplicaciones (Optical Metrology and its Applications)

    OpenAIRE

    Daniel Malacara Hernández

    2012-01-01

    This work presents an introduction to the field of optical metrology and to its main tool, interferometry. It also presents a survey of the different methods employed in optical metrology, describing in special detail the most recent advances in this field.

  4. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  5. Efficiency improvements of offline metrology job creation

    Science.gov (United States)

    Zuniga, Victor J.; Carlson, Alan; Podlesny, John C.; Knutrud, Paul C.

    1999-06-01

    Progress of the first lot of a new design through the production line is watched very closely. All performance metrics, cycle time, in-line measurement results and final electrical performance, are critical. Rapid movement of this lot through the line has serious time-to-market implications. Having this material waiting at a metrology operation for an engineer to create a measurement job plan wastes valuable turnaround time. Further, efficient use of a metrology system is compromised by the time required to create and maintain these measurement job plans. Thus, having a method to develop metrology job plans prior to the actual running of the material through the manufacturing area can significantly improve both cycle time and overall equipment efficiency. Motorola and Schlumberger have worked together to develop and test such a system. The Remote Job Generator (RJG) creates job plans for new devices in a manufacturing process from an NT host or workstation, offline. This increases the system time available for making production measurements, decreases turnaround time on job plan creation and editing, and improves consistency across job plans. Most importantly, this allows job plans for new devices to be available before the first wafers of the device arrive at the tool for measurement. The software also includes a database manager which allows updates of existing job plans to incorporate measurement changes required by process changes or measurement optimization. This paper will review the productivity enhancements resulting from the increased metrology utilization and decreased cycle time associated with the use of RJG. Finally, improvements in process control through better control of job plans across different devices and layers will be discussed.
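
    The offline job-plan idea can be sketched as generating a machine-readable measurement plan from a device description before wafers exist. Everything below, the schema, field names, file name and algorithm label, is invented for illustration; RJG's actual format is proprietary.

```python
import json

# Hypothetical offline generation of a metrology job plan for a new device,
# in the spirit of RJG: the plan is created from layout data on a host
# machine rather than taught at the metrology tool.

def make_job_plan(device, layer, sites):
    return {
        "device": device,
        "layer": layer,
        "measurements": [
            {"site": s["name"],
             "x_um": s["x"], "y_um": s["y"],
             "feature": s["feature"],          # e.g. "line_cd", "overlay"
             "algorithm": "edge_threshold_50pct"}   # invented algorithm name
            for s in sites
        ],
        "revision": 1,
    }

sites = [
    {"name": "center", "x": 0, "y": 0, "feature": "line_cd"},
    {"name": "edge_r", "x": 95000, "y": 0, "feature": "line_cd"},
]

plan = make_job_plan("NEWDEV01", "poly_gate", sites)
with open("NEWDEV01_poly_gate.job.json", "w") as f:
    json.dump(plan, f, indent=2)
print(json.dumps(plan, indent=2))
```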

  6. International symposium on in situ nuclear metrology as a tool for radioecology INSINUME

    International Nuclear Information System (INIS)

    2008-01-01

    This symposium, the natural continuation of the previous INSINUME conferences held in Fleurus (Belgium), Albena (Bulgaria) and Kusadasi (Turkey), has a dual purpose. First, it brings together radioecologists, regulatory authorities and radiological monitoring system operators to allow a wide exchange of information regarding practical experience and difficulties encountered in the daily radiological monitoring of the environment. Second, the symposium focuses on the modern nuclear metrology tools that can nowadays be used to ease the direct remote surveillance of the radiological status of seas, rivers, lakes and the earth's surface. In the past, these tools suffered from a lack of sensitivity and reliability and were for that reason mainly used for health physics control, which did not require such high accuracy. New systems are now at the disposal of mathematical model users and radioecologists for investigating the dispersion of radioactive contaminants in normal conditions as well as in case of incidents. On the basis of acquired experience and metrology progress, the final objective of the symposium is to help the environmental radioprotection community harmonise its rules, and thus to perform realistic and useful radiological monitoring in the future.

  7. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    (OCR-damaged abstract; only fragments of the scanned report are legible.) The report describes an Automated Software Tool Monitoring System and the Automated Software Tool Monitoring Program (Appendix 1). Output features provide links from the tool to both the human user and the target machine (where applicable); they describe the types of information that are returned from the tools to the human user, and the forms in which these outputs are presented.

  8. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  9. UPWIND 1A2 Metrology. Final Report

    DEFF Research Database (Denmark)

    Eecen, P.J.; Wagenaar, J.W.; Stefanatos, N.

    Since this problem covers many areas of wind energy, the work package is defined as a crosscutting activity. The objectives of the metrology work package are to develop metrology tools in wind energy to significantly enhance the quality of measurement and testing techniques. The first deliverable was to perform a state-of-the-art assessment to identify all relevant measurands. The required accuracies and required sampling frequencies have been identified from the perspective of the users of the data (the other work packages in UpWind). This work led to the definition of the Metrology Database, which is a valuable tool for the further assessment; interest has been shown from other work packages, such as Training. This report describes the activities that have been carried out in the Work Package 1A2 Metrology of the UpWind project. Activities from Risø are described in a separate report: T.F. Pedersen...

  10. Metrology Techniques for the Assembly of NCSX

    International Nuclear Information System (INIS)

    Priniski, C.; Dodson, T.; Duco, M.; Raftopoulos, S.; Ellis, R.; Brooks, A.

    2009-01-01

    In support of the National Compact Stellarator Experiment (NCSX), stellarator assembly activities continued this past year at the Princeton Plasma Physics Laboratory (PPPL) in partnership with the Oak Ridge National Laboratory (ORNL). The construction program saw the completion of the first two Half Field-Period Assemblies (HPA), each consisting of three modular coils. The full machine includes six such sub-assemblies. A single HPA consists of three of the NCSX modular coils wound and assembled at PPPL. These geometrically complex three-dimensional coils were wound using computer-aided metrology and CAD models to tolerances within +/- 0.5 mm. The assembly of these coils required similar accuracy on a larger scale, with the added complexity of more individual parts and fewer degrees of freedom for correction. Several new positioning issues arose, for which measurement and control techniques were developed. To accomplish this, CAD coordinate-based computer metrology equipment and software similar to the solutions employed for winding the modular coils were used. Given the size of the assemblies, the primary tools were both interferometer-aided and Absolute Distance Measurement (ADM)-only based laser trackers. In addition, portable Coordinate Measurement Machine (CMM) arms and some novel indirect measurement techniques were employed. This paper will detail both the use of CAD coordinate-based metrology technology and the techniques developed and employed for dimensional control of NCSX subassemblies. The results achieved and possible improvements to the techniques will be discussed.
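
    The computational core of CAD coordinate-based assembly metrology is a best-fit rigid transform mapping tracker measurements onto CAD nominal fiducials, after which the residuals show how far each fiducial sits from its design position. The sketch below uses the standard Kabsch/SVD method; the fiducial layout, rotation and noise level are invented for illustration and are not NCSX data.

```python
import numpy as np

# Best-fit rigid transform (rotation R, translation t) from measured
# points to CAD nominals, via the Kabsch/SVD method.

def best_fit_rigid(measured, nominal):
    cm, cn = measured.mean(axis=0), nominal.mean(axis=0)
    H = (measured - cm).T @ (nominal - cn)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                     # proper rotation, det(R) = +1
    t = cn - R @ cm
    return R, t

# Synthetic check: rotate/translate the nominals and add 0.1 mm noise.
rng = np.random.default_rng(2)
nominal = rng.uniform(-2000.0, 2000.0, (12, 3))        # mm, fiducials
ang = np.deg2rad(5.0)
Rz = np.array([[np.cos(ang), -np.sin(ang), 0.0],
               [np.sin(ang),  np.cos(ang), 0.0],
               [0.0, 0.0, 1.0]])
measured = (nominal - np.array([3.0, -1.0, 0.5])) @ Rz.T
measured += rng.normal(0.0, 0.1, measured.shape)

R, t = best_fit_rigid(measured, nominal)
resid = measured @ R.T + t - nominal
print(f"mean fiducial residual: {np.linalg.norm(resid, axis=1).mean():.3f} mm")
```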

  11. Tools intended for nuclear metrology

    International Nuclear Information System (INIS)

    Munayco Tasayco, A.F.

    1980-08-01

    The study undertaken in the metrology laboratory of the C.E.N. Saclay Electronics Services is intended to improve measurement methods in two fields of nuclear instrumentation: the measurement of currents in the range 1 pA to 0.01 pA, and the study of a measurement system for the linear circuits used in semiconductor gamma-ray spectrometry. Two systems are now working. They permit improved measurement precision, automation of the measurement process, and many possibilities in the choice of parameters and the presentation of results.

  12. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review of the latest achievements of digital holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progress and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.
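
    The numerical core that makes digital holography quantitative is propagation of a recorded complex field back to the object plane. The sketch below uses the standard angular-spectrum method on a synthetic phase object; real holograms would first need off-axis filtering or phase shifting to obtain the complex field, and all geometry values are assumptions.

```python
import numpy as np

# Angular-spectrum propagation: numerically refocus a complex field to
# recover a quantitative phase map.

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex field by distance z in free space."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent cutoff
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

wavelength, dx, n = 532e-9, 2e-6, 512                # m, m/pixel, pixels
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)

# Synthetic phase object (a transparent disc), recorded 5 mm downstream.
phase = np.where(X**2 + Y**2 < (100e-6) ** 2, 1.0, 0.0)
obj = np.exp(1j * phase)
hologram_plane = angular_spectrum(obj, wavelength, dx, 5e-3)

# Numerical refocusing back to the object plane recovers the phase map.
recon = angular_spectrum(hologram_plane, wavelength, dx, -5e-3)
print("max phase error [rad]:", np.abs(np.angle(recon) - phase).max())
```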

  13. Emerging technology for astronomical optics metrology

    Science.gov (United States)

    Trumper, Isaac; Jannuzi, Buell T.; Kim, Dae Wook

    2018-05-01

    Next generation astronomical optics will enable science discoveries across all fields and impact the way we perceive the Universe in which we live. To build these systems, optical metrology tools have been developed that push the boundary of what is possible. We present a summary of a few key metrology technologies that we believe are critical for the coming generation of optical surfaces.

  14. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov (United States)

    Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to support battery design. Knowledge of the interplay of multi-physics at varied scales is imperative

  15. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity-modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) read the MLC file and the PDIP from the TPS; (ii) calculate the fraction of beam-on time for which each point in the IMRT beam is shielded by MLC leaves; (iii) interpolate correction factors from look-up tables; (iv) create a corrected PDIP image from the product of the original PDIP and the correction factors, and write the corrected image to file; (v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
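
    Steps (ii)-(iv) of the pipeline above can be sketched in reduced form: compute each point's shielded fraction from the per-segment apertures, look up a correction factor, and scale the predicted image. The geometry is collapsed to 1D leaf travel, the file I/O is replaced by arrays, and the lookup-table numbers are invented; this illustrates the structure of the correction, not the paper's actual factors.

```python
import numpy as np

# 1D sketch of the EPID correction pipeline.
x = np.linspace(-10, 10, 201)                      # cm, leaf-travel axis

# Hypothetical IMRT delivery: 3 segments with equal monitor units, each
# defined by the (left, right) edges of the open field along this axis.
segments = [(-8.0, -1.0), (-4.0, 4.0), (1.0, 8.0)]
mu = np.array([1.0, 1.0, 1.0]); mu /= mu.sum()

shielded_fraction = np.zeros_like(x)
for (lo, hi), w in zip(segments, mu):
    shielded_fraction += w * ~((x >= lo) & (x <= hi))

# Correction factor vs shielded fraction: invented lookup table standing
# in for measured data; MLC-transmitted radiation reads differently on
# the EPID than open beam, hence CF != 1 where shielded.
lut_fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
lut_cf = np.array([1.00, 0.99, 0.97, 0.94, 0.90])
cf = np.interp(shielded_fraction, lut_fraction, lut_cf)

pdip = np.exp(-0.5 * (x / 6.0) ** 2)               # stand-in predicted image
corrected = pdip * cf
print("CF range applied:", cf.min(), "to", cf.max())
```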

  16. Towards E-CASE Tools for Software Engineering

    OpenAIRE

    Nabil Arman

    2013-01-01

    CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including cost-effectiveness, rapid software application development, and improved possibilities for software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional...

  17. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  18. A three-fingered, touch-sensitive, metrological micro-robotic assembly tool

    International Nuclear Information System (INIS)

    Torralba, Marta; Hastings, D J; Thousand, Jeffery D; Nowakowski, Bartosz K; Smith, Stuart T

    2015-01-01

    This article describes a metrological, robotic hand to manipulate and measure micrometer-size objects. The presented work demonstrates not only assembly operations, but also positioning control and metrology capability. Sample motion is achieved by a commercial positioning stage, which provides XYZ displacements for assembly of components. A designed and manufactured gripper tool that incorporates 21 degrees of freedom for independent alignment of the actuators, sensors, and three fingers of this hand is presented. These fingers can be opened and closed by piezoelectric actuators through levered flexures, providing an 80 μm displacement range measured with calibrated opto-interrupter-based knife-edge sensors. The operational end of each finger comprises a quartz tuning fork with a 7 μm diameter, 3.2 mm long carbon fiber extending from the end of one tuning fork tine. Finger-tip force sensing is achieved by monitoring the individual finger resonances, typically at around 32 kHz. The experimental results presented focus on probe performance analysis. Pick-and-place operation using the three fingers is demonstrated with all fingers being continuously oscillated, a capability not possible with previous single- or two-finger tweezer-type designs. By monitoring electrical feedback during pick-and-place operations, changes in the response of the three probes demonstrate the ability to identify both grab and release operations. Component metrology has been assessed by contacting different micro-spheres of diameters 50(±7.5) μm, 135(±20) μm, and 140(±20) μm. These were measured by the micro-robot to have diameters of 67, 133, and 126 μm respectively, with corresponding deviations of 4.2, 4.9, and 4.3 μm. This deviation in the measured results was primarily due to the manual, joystick-based contacting of the fingers, difficulties associated with centering the components on the axis of the hand, and lower contact sensitivity for the smallest sphere
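
    The grab/release detection idea, a driven tuning fork whose demodulated amplitude drops when a finger contacts a component, can be sketched with a threshold detector on a synthetic amplitude trace. The signal, threshold and timing below are invented assumptions, not the paper's measured feedback.

```python
import numpy as np

# Detect grab/release events from a (synthetic) demodulated tuning-fork
# amplitude: contact damps the ~32 kHz oscillation, so the amplitude drops.

rng = np.random.default_rng(6)
t = np.linspace(0.0, 2.0, 2000)                   # s
amplitude = 1.0 + rng.normal(0, 0.01, t.size)     # normalized, free oscillation
amplitude[(t > 0.6) & (t < 1.4)] *= 0.7           # simulated contact event

threshold = 0.85
in_contact = amplitude < threshold
edges = np.flatnonzero(np.diff(in_contact.astype(int)))
for e in edges:
    state = "grab" if in_contact[e + 1] else "release"
    print(f"{state} detected at t = {t[e]:.2f} s")
```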

  19. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    (OCR-damaged record; only fragments of the scanned report cover and tool list are legible.) Software Tools for Software Maintenance (ASQBG-1-89-001), AIRMICS, October 1988. The partially legible tool list includes a program analyzer, C-Tractor for C, a Cobol Structuring Facility for VS Cobol II, F-Scan, and a static code analyzer for Fortran.

  20. Enhanced resolution and accuracy of freeform metrology through Subaperture Stitching Interferometry

    Science.gov (United States)

    Supranowitz, Chris; Maloney, Chris; Murphy, Paul; Dumas, Paul

    2017-10-01

    Recent advances in polishing and metrology have addressed many of the challenges in the fabrication and metrology of freeform surfaces, and the manufacture of these surfaces is possible today. However, achieving the form and mid-spatial frequency (MSF) specifications that are typical of visible imaging systems remains a challenge. Interferometric metrology for freeform surfaces is thus highly desirable for such applications, but the capability is currently quite limited for freeforms. In this paper, we provide preliminary results that demonstrate accurate, high-resolution measurements of freeform surfaces using prototype software on QED's ASI™ (Aspheric Stitching Interferometer).
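
    The stitching principle behind such measurements can be sketched in 1D: two overlapping subaperture profiles of the same surface differ by unknown piston and tilt, and solving for the piston/tilt that minimizes disagreement in the overlap merges them. Production stitching (e.g. on QED's ASI) works in 2D with many subapertures and a global simultaneous solve; the data below are synthetic.

```python
import numpy as np

# Toy 1D subaperture stitching: fit the relative piston/tilt of scan 2
# against scan 1 from their overlap region.

rng = np.random.default_rng(4)
x = np.linspace(0, 40, 401)                       # mm, full part
surface = 0.05 * np.sin(2 * np.pi * x / 13)       # um, "true" freeform slice

idx1, idx2 = np.where(x <= 25)[0], np.where(x >= 15)[0]
p1 = surface[idx1] + 0.02 + 0.001 * x[idx1] + rng.normal(0, 1e-3, idx1.size)
p2 = surface[idx2] - 0.05 + 0.004 * x[idx2] + rng.normal(0, 1e-3, idx2.size)

ov1, ov2 = np.isin(idx1, idx2), np.isin(idx2, idx1)
d = p2[ov2] - p1[ov1]                             # disagreement in overlap
A = np.column_stack([np.ones(d.size), x[idx1][ov1]])
(a, b), *_ = np.linalg.lstsq(A, d, rcond=None)    # piston a, tilt b

p2_aligned = p2 - (a + b * x[idx2])
resid = p2_aligned[ov2] - p1[ov1]
print(f"relative piston {a:.4f} um, tilt {b:.5f} um/mm, "
      f"overlap RMS {resid.std() * 1e3:.2f} nm")
```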

  1. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  2. Techniques and tools for software qualification in KNICS

    International Nuclear Information System (INIS)

    Cha, Kyung H.; Lee, Yeong J.; Cheon, Se W.; Kim, Jang Y.; Lee, Jang S.; Kwon, Kee C.

    2004-01-01

    This paper describes techniques and tools for qualifying safety software in the Korea Nuclear Instrumentation and Control System (KNICS) project. Safety software is developed and applied for a Reactor Protection System (RPS), an Engineered Safety Features and Component Control System (ESF-CCS), and a safety Programmable Logic Controller (PLC) in KNICS. Requirements and design specifications of the safety software are written in both natural language and formal specification languages. Statecharts are used for formal specification of the ESF-CCS and safety PLC software, while NuSCR is used for formal specification of the RPS software. pSET (POSCON Software Engineering Tool) has been developed and utilized as a software development tool for IEC 61131-3 based PLC programming. The qualification of the safety software consists of software verification and validation (V and V) through the software life cycle, software safety analysis, software configuration management, software quality assurance, and COTS (commercial off-the-shelf) dedication. The criteria and requirements for qualifying the safety software have been established in accordance with Software Review Plan (SRP)/Branch Technical Position (BTP)-14, IEEE Std 7-4.3.2-1998, NUREG/CR-6463, IEEE Std 1012-1998, and so on. Figure 1 summarizes the qualification techniques and tools for the safety software.

  3. Advanced metrology by offline SEM data processing

    Science.gov (United States)

    Lakcher, Amine; Schneider, Loïc.; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime

    2017-06-01

    Today's technology nodes contain more and more complex designs, bringing increasing challenges to chip manufacturing process steps. Efficient metrology is necessary to assess the process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and improve OPC models. Today, process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip-to-tip, tip-to-line, areas or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and convert them into a GDS. Contours can be seen as the signature of a shape, as they contain all the dimensional data. The methodology is thus to use the CD-SEM to take high-quality images, then generate SEM contours and create a database out of them. The contours are used to feed an offline metrology tool that processes them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously is expanded to improve its robustness and is combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.
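
    A minimal offline contour metric of the kind described, a tip-to-tip gap between two SEM-extracted line-end contours, can be sketched as the minimum point-to-point distance between two polygons. Real tools measure along defined cutlines with sub-pixel contours; the polygons below are invented.

```python
import numpy as np

# Tip-to-tip gap between two line-end contours (polygon vertices in nm,
# as they might be exported to GDS after SEM contour extraction).

tip_a = np.array([[0, 0], [50, 8], [95, 10], [140, 8], [190, 0],
                  [190, -40], [0, -40]], float)          # lower line end
tip_b = np.array([[0, 60], [50, 52], [95, 50], [140, 52], [190, 60],
                  [190, 100], [0, 100]], float)          # upper line end

diff = tip_a[:, None, :] - tip_b[None, :, :]             # all vertex pairs
dist = np.linalg.norm(diff, axis=-1)
i, j = np.unravel_index(dist.argmin(), dist.shape)
print(f"tip-to-tip gap: {dist[i, j]:.1f} nm between contour points "
      f"{tip_a[i]} and {tip_b[j]}")
```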

  4. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that use model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
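
    The core idea, encoding norms as temporal-logic constraints and searching for a trace that satisfies all of them, can be illustrated with a toy bounded check (this brute-force Python sketch only stands in for a real LTL model checker such as the one behind FormaLex; the norms are invented):

        # If no trace of bounded length satisfies every norm, the set of
        # norms is (boundedly) incoherent.
        from itertools import combinations, product

        ATOMS = ("submit", "approve")

        def powerset(items):
            return [frozenset(c) for r in range(len(items) + 1)
                    for c in combinations(items, r)]

        def G(pred):  # "always": pred must hold in every state of a trace
            return lambda trace: all(pred(s) for s in trace)

        norms = [
            G(lambda s: "approve" in s or "submit" not in s),  # submitting obliges approval
            G(lambda s: "approve" not in s),                   # approval is forbidden
            G(lambda s: "submit" in s),                        # submission is obligatory
        ]

        def coherent(norms, bound=3):
            return any(all(norm(t) for norm in norms)
                       for t in product(powerset(ATOMS), repeat=bound))

        print(coherent(norms))  # -> False: the norms cannot be jointly obeyed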

  5. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  6. Software tool for physics chart checks.

    Science.gov (United States)

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed in C# under Microsoft .NET to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During more than a year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.
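
    The "forced function" idea, blocking sign-off until every required item is completed, can be sketched in a few lines (a hypothetical illustration in Python rather than the authors' C# code; the item names are invented):

        # A chart check that cannot be signed off while items are missing.
        from dataclasses import dataclass, field

        @dataclass
        class ChartCheck:
            required: tuple = ("prescription", "plan parameters", "image guidance")
            done: set = field(default_factory=set)

            def complete(self, item: str) -> None:
                if item in self.required:
                    self.done.add(item)

            def sign_off(self) -> None:
                missing = set(self.required) - self.done
                if missing:
                    raise RuntimeError(f"cannot sign off, missing: {sorted(missing)}")
                print("chart check signed off")

        check = ChartCheck()
        for item in ("prescription", "plan parameters", "image guidance"):
            check.complete(item)
        check.sign_off()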

  7. 7th International Workshop on Advanced Optical Imaging and Metrology

    CERN Document Server

    2014-01-01

    In continuation of the FRINGE Workshop Series, these proceedings contain all contributions presented at the 7th International Workshop on Advanced Optical Imaging and Metrology. The FRINGE Workshop Series is dedicated to the presentation, discussion and dissemination of recent results in Optical Imaging and Metrology. Topics of particular interest for the 7th Workshop are: - New methods and tools for the generation, acquisition, processing, and evaluation of data in Optical Imaging and Metrology (digital wavefront engineering, computational imaging, model-based reconstruction, compressed sensing, inverse problems solution) - Application-driven technologies in Optical Imaging and Metrology (high-resolution, adaptive, active, robust, reliable, flexible, in-line, real-time) - High-dynamic-range solutions in Optical Imaging and Metrology (from macro to nano) - Hybrid technologies in Optical Imaging and Metrology (hybrid optics, sensor and data fusion, model-based solutions, multimodality) - New optical sensors, imagi...

  8. Tools & training for more secure software

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Just by fate of nature, software today is shipped out as “beta”, coming with vulnerabilities and weaknesses which should already have been fixed at the programming stage. This presentation will show the consequences of suboptimal software, why good programming, thorough software design, and a proper software development process are imperative for the overall security of the Organization, and how a few simple tools and training are supposed to make CERN software more secure.

  9. Software Development Methods and Tools: a New Zealand study

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2005-05-01

    This study is a more detailed follow-up to a preliminary investigation of the practices of software engineers in New Zealand. The focus of this study is on the methods and tools used by software developers in their current organisation. The project involved detailed questionnaires being piloted and sent out to several hundred software developers. A central part of the research involved the identification of factors affecting the use and take-up of existing software development tools in the workplace. The full spectrum of tools, from fully integrated I-CASE tools to individual software applications such as drawing tools, was investigated. This paper describes the project and presents the findings.

  10. Criteria and tools for scientific software quality measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tseng, M Y [Previse Inc., Willowdale ON (Canada)

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety-critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools be used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of the software under evaluation. To effectively assess the quality of software, metrics defining a quantitative scale and method appropriate for determining the value of each attribute need to be applied. To perform the evaluation cost-effectively, the use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs.

  12. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects, software development faces great pressure with respect to quality, cost and schedule. One way to cope with these challenges is the application of CASE tools for the automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix), featuring UML, and ISG (BSSE), which provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.
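
    The essence of the state-machine-driven code such tools generate can be suggested with a small sketch (a hypothetical Python illustration, not the actual output of Rhapsody or ISG; the states and events are invented):

        # A transition table plus a generic stepper: code generated from an
        # FSM model essentially amounts to this structure.
        TRANSITIONS = {
            ("SAFE", "arm"): "ARMED",
            ("ARMED", "trigger"): "ACTIVE",
            ("ARMED", "disarm"): "SAFE",
            ("ACTIVE", "reset"): "SAFE",
        }

        def step(state: str, event: str) -> str:
            # Unknown events leave the state unchanged (a common convention).
            return TRANSITIONS.get((state, event), state)

        state = "SAFE"
        for event in ("arm", "trigger", "reset"):
            state = step(state, event)
        print(state)  # -> SAFE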

  13. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  14. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  15. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Software tools are available to help users develop and manage software at the source code level on the Peregrine system. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  16. Reference metrology in a research fab: the NIST clean calibrations thrust

    Science.gov (United States)

    Dixson, Ronald; Fu, Joe; Orji, Ndubuisi; Renegar, Thomas; Zheng, Alan; Vorburger, Theodore; Hilton, Al; Cangemi, Marc; Chen, Lei; Hernandez, Mike; Hajdaj, Russell; Bishop, Michael; Cordes, Aaron

    2009-03-01

    In 2004, the National Institute of Standards and Technology (NIST) commissioned the Advanced Measurement Laboratory (AML) - a state-of-the-art, five-wing laboratory complex for leading edge NIST research. The NIST NanoFab - a 1765 m2 (19,000 ft2) clean room with 743 m2 (8000 ft2) of class 100 space - is the anchor of this facility and an integral component of the new Center for Nanoscale Science and Technology (CNST) at NIST. Although the CNST/NanoFab is a nanotechnology research facility with a different strategic focus than a current high volume semiconductor fab, metrology tools still play an important role in the nanofabrication research conducted here. Some of the metrology tools available to users of the NanoFab include stylus profiling, scanning electron microscopy (SEM), and atomic force microscopy (AFM). Since 2001, NIST has collaborated with SEMATECH to implement a reference measurement system (RMS) using critical dimension atomic force microscopy (CD-AFM). NIST brought metrology expertise to the table and SEMATECH provided access to leading edge metrology tools in their clean room facility in Austin. Now, in the newly launched "clean calibrations" thrust at NIST, we are implementing the reference metrology paradigm on several tools in the CNST/NanoFab. Initially, we have focused on calibration, monitoring, and uncertainty analysis for a three-tool set consisting of a stylus profiler, an SEM, and an AFM. Our larger goal is the development of new and supplemental calibrations and standards that will benefit from the Class 100 environment available in the NanoFab and offering our customers calibration options that do not require exposing their samples to less clean environments. Toward this end, we have completed a preliminary evaluation of the performance of these instruments. The results of these evaluations suggest that the achievable uncertainties are generally consistent with our measurement goals.

  17. PISCES: A Tool for Predicting Software Testability

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1991-01-01

    Before a program can fail, a software fault must be executed, that execution must alter the data state, and the incorrect data state must propagate to a state that results directly in an incorrect output. This paper describes a tool called PISCES (developed by Reliable Software Technologies Corporation) for predicting the probability that faults in a particular program location will accomplish all three of these steps, causing program failure. PISCES is a tool that is used during software verification and validation to predict a program's testability.
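
    The propagation step of this execute-infect-propagate chain can be estimated numerically, which is the spirit of testability prediction (a toy Python sketch under invented assumptions, not the PISCES implementation):

        # Perturb the data state at one program location many times and
        # count how often the perturbation survives to change the output;
        # a low estimate flags locations where faults could hide.
        import random

        def program(x, perturbation=0.0):
            y = x * x + perturbation      # location under analysis
            return 1 if y > 100 else 0    # later computation masks small changes

        def propagation_estimate(inputs, trials=10_000):
            changed = sum(
                program(x) != program(x, random.uniform(-5, 5))
                for x in random.choices(inputs, k=trials)
            )
            return changed / trials

        random.seed(0)
        print(propagation_estimate(list(range(20))))  # small -> hard to test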

  18. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  19. An OCD perspective of line edge and line width roughness metrology

    Science.gov (United States)

    Bonam, Ravi; Muthinti, Raja; Breton, Mary; Liu, Chi-Chun; Sieg, Stuart; Seshadri, Indira; Saulnier, Nicole; Shearer, Jeffrey; Patlolla, Raghuveer; Huang, Huai

    2017-03-01

    Metrology of nanoscale patterns poses multiple challenges that range from measurement noise and metrology errors to probe size. Optical metrology has gained a lot of significance in the semiconductor industry due to its fast turnaround and reliable accuracy, particularly for monitoring in-line process variations. Apart from monitoring critical dimension and film thickness, there are multiple parameters that can be extracted from optical metrology models [3]. Sidewall angles, material compositions, etc., can also be modeled to acceptable accuracy. Line edge and line width roughness are among the most sought-after metrology quantities after critical dimension and its uniformity, although there has not been much development in measuring them with optical metrology. Scanning electron microscopy is still used as the standard metrology technique for assessment of line edge and line width roughness. In this work we present an assessment of optical metrology and its ability to model roughness from a set of structures with intentional jogs that simulate both line edge and line width roughness at multiple amplitudes and frequencies. We also present multiple models to represent roughness and extract relevant parameters from optical metrology. Another critical aspect of an optical metrology setup is the correlation of measurements to a complementary technique to calibrate the models. In this work, we also present a comparison of roughness parameters extracted and measured under varying image processing conditions on a commercially available CD-SEM tool.
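
    For reference, the quantities in question are conventionally defined as 3-sigma statistics of extracted edge positions, which takes only a few lines to compute once the edges are available (a minimal sketch, assuming edge positions have already been extracted from SEM or contour data; the simulated values are arbitrary):

        # LER: 3*std of one edge's deviation from its mean line.
        # LWR: 3*std of the local line width (right edge minus left edge).
        import numpy as np

        def ler(edge: np.ndarray) -> float:
            return 3.0 * np.std(edge, ddof=1)

        def lwr(left_edge: np.ndarray, right_edge: np.ndarray) -> float:
            return 3.0 * np.std(right_edge - left_edge, ddof=1)

        rng = np.random.default_rng(1)
        left = rng.normal(0.0, 1.2, 512)    # edge positions along the line (nm)
        right = rng.normal(20.0, 1.2, 512)  # nominal 20 nm wide line
        print(ler(left), lwr(left, right))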

  20. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    Science.gov (United States)

    Dervillé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm, with a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow roadmap speed and requirements: for example, managing defects and CD in a one-step algorithm, customizing algorithms and output features for each R&D team's environment, and providing software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data, and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronic materials like Directed Self-Assembly features. We will show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classification, CD and roughness measurements, with high-throughput performance in order to be compatible with HVM. In a second part, we will assess the reliability, the customization of algorithms, and the capability of the software platform to follow new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of the data analysis cycle time.

  1. Optical metrology for advanced process control: full module metrology solutions

    Science.gov (United States)

    Bozdog, Cornel; Turovets, Igor

    2016-03-01

    Optical metrology is the workhorse metrology in manufacturing and a key enabler of patterning process control. Recent advances in device architecture are gradually shifting the need for process control from the lithography module to other patterning processes (etch, trim, clean, LER/LWR treatments, etc.). Complex multi-patterning integration solutions, where the final pattern is the result of multiple process steps, require step-by-step holistic process control and a uniformly accurate holistic metrology solution for pattern transfer across the entire module. For effective process control, more process "knobs" are needed, and a tighter integration of metrology with process architecture.

  2. In-situ virtual metrology for the silicon-dioxide etch rate by using optical emission spectroscopy data

    International Nuclear Information System (INIS)

    Kim, Boomsoo; Hong, Sangjeen

    2014-01-01

    As a useful tool for process control in a high volume semiconductor manufacturing environment, virtual metrology for the etch rate in a plasma etch process is investigated using optical emission spectroscopy (OES) data. Virtual metrology is a surrogate measurement taken from the process instead of from direct measurement, and it can provide in-situ metrology of a wafer's geometry from a predictive model. A statistical regression model that correlates the selected wavelengths of the optical emission spectra to the etch rate is established using the OES data collected over 20 experimental runs. In addition, an argon actinometry study is employed to quantify the OES data, and it provides valuable insight into the analysis of the OES data. The established virtual metrology model is further verified with an additional 20 runs of data. As a result, the virtual metrology model with both process recipe tool data and in-situ data shows higher prediction accuracy by as much as 56% compared with either the process recipe tool data or the in-situ data alone.
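
    The underlying modeling step is a straightforward regression of the measured etch rate on selected OES line intensities, as the following minimal sketch suggests (synthetic data and an arbitrary choice of three wavelengths; this is not the authors' model):

        # Least-squares virtual-metrology model:
        #   etch_rate ~ b0 + b1*I1 + b2*I2 + b3*I3
        import numpy as np

        rng = np.random.default_rng(7)
        n_runs = 20
        oes = rng.uniform(0.5, 1.5, size=(n_runs, 3))       # 3 line intensities
        etch_rate = 120 * oes[:, 0] - 40 * oes[:, 1] + rng.normal(0, 2, n_runs)

        X = np.column_stack([np.ones(n_runs), oes])
        coef, *_ = np.linalg.lstsq(X, etch_rate, rcond=None)

        def predict(intensities):
            """In-situ etch-rate estimate from one run's OES intensities."""
            return coef[0] + coef[1:] @ intensities

        print(predict(np.array([1.0, 1.0, 1.0])))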

  3. Metrology in CNEN NN 3.05/13 standard

    International Nuclear Information System (INIS)

    Mello, Marina Santiago de

    2014-01-01

    Nuclear medicine exams are widely used tools in health services for a reliable clinical and functional diagnosis of disease. In Brazil, the National Nuclear Energy Commission, through the norm CNEN-NN 3.05/13, provides the requirements for safety and radiological protection in nuclear medicine services. The objective of this review article is to emphasize the importance of metrology in compliance with this norm. We observed that metrology plays a vital role, as it ensures the quality, accuracy, reproducibility and consistency of the measurements in the field of nuclear medicine. (author)

  4. Testing tool for software concerning nuclear power plant safety

    International Nuclear Information System (INIS)

    Boulc'h, J.; Le Meur, M.; Collart, J.M.; Segalard, J.; Uberschlag, J.

    1984-11-01

    In the present case, the software to be analyzed is written entirely in assembler language. This paper presents the study and realization of a tool for analyzing software that plays an important role in nuclear reactor protection and safeguard: the principles of the tool design, its working principle, and the realization and evolution of the dynamic analysis tool [fr]

  5. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as its host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language gives the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal-independently and is even to a great extent multilingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  6. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation, as they currently stand, is presented, along with a description of work in progress and areas of future work.

  8. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  9. Software development tools using GPGPU potentialities

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Sereda, T.M.; Us, S.A.; Shestakov, M.V.

    2011-01-01

    The paper deals with the potentialities of various up-to-date software development tools for making use of graphic processor (GPU) parallel computing resources. Examples are given to illustrate the use of present-day software tools for the development of applications and the realization of algorithms for scientific-technical calculations performed by GPGPU. The paper presents some classes of hard mathematical problems of scientific-technical calculations for which the GPGPU can be efficiently used. To reduce the development time of calculation programs that use GPGPU capabilities, various dedicated programming systems and problem-oriented subroutine libraries are recommended. Performance parameters when solving problems with and without the use of GPGPU potentialities are compared.
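
    A present-day example of the kind of dedicated library the paper surveys is a NumPy-compatible GPU array package (a minimal sketch using CuPy, which assumes a CUDA-capable GPU and an installed cupy package):

        # Element-wise work executed on the GPU via a drop-in array library.
        import cupy as cp

        x = cp.arange(1_000_000, dtype=cp.float32)
        y = cp.sqrt(x) * 2.0          # computed on the GPU
        head = cp.asnumpy(y[:5])      # copy a slice back to host memory
        print(head)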

  10. Runtime Performance Monitoring Tool for RTEMS System Software

    Science.gov (United States)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase and during in-orbit operation as well. Our implemented target agent is lightweight and has small overhead using the SpaceWire interface. Efforts to reduce the overhead and to add other monitoring parameters are currently under way.

  11. FFI: A software tool for ecological monitoring

    Science.gov (United States)

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  12. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines), and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, hyper envelop, support vector machines, and visualization. The principles and knowledge representation of some function models of data mining are described. The data mining tool is realized in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The tool has been satisfactorily applied to the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.
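
    Two of the function models named above, dimensionality reduction followed by clustering, chain together in a few lines with a modern library (an illustrative scikit-learn sketch on synthetic data, not the authors' Visual C++ implementation):

        # PCA to reduce a 5-dimensional feature space, then k-means clustering.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        # Two synthetic groups, e.g. two families of alloy systems.
        data = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(4, 1, (30, 5))])

        reduced = PCA(n_components=2).fit_transform(data)
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(reduced)
        print(labels)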

  13. Claire, a simulation and testing tool for critical softwares

    International Nuclear Information System (INIS)

    Gassino, J.; Henry, J.Y.

    1996-01-01

    The needs of the CEA and IPSN (Institute of Nuclear Protection and Safety) concerning the testing of critical software have led to the development of the CLAIRE tool, which is able to test software without modification. This tool allows the system and its environment to be modeled graphically and components to be included in the model which observe, but do not modify, the behaviour of the system under test. The executable codes are integrated in the model. The tool uses target machine simulators (microprocessors). The technique used (event simulation) allows actions to be associated with events such as the execution of an instruction, access to a variable, etc. The simulation results are exploited using graphic, state-search and test coverage measurement tools. In particular, this tool can help in the evaluation of critical software with pre-existing components. (J.S.)

  14. Metrology for fire experiments in outdoor conditions

    CERN Document Server

    Silvani, Xavier

    2013-01-01

    Natural fires can be considered as scale-dependent, non-linear processes of mass, momentum and heat transport, resulting from a turbulent reactive and radiative fluid medium flowing over a complex medium, the vegetal fuel. In natural outdoor conditions, the experimental study of natural fires at real scale needs the development of an original metrology, one able to capture the large range of time and length scales involved in their dynamic nature and also able to resist the thermal, mechanical and chemical aggression of flames on devices. Robust, accurate and minimally intrusive tools must be carefully set up and used for gaining highly fluctuating data over long periods. These signals also need the development of original post-processing tools that take into account the non-steady nature of their stochastic components. Metrology for Fire Experiments in Outdoor Conditions closely analyzes these features, and also describes measurement techniques, the thermal insulation of fragile electronic systems, data acquisitio...

  15. Nanoelectronics: Metrology and Computation

    International Nuclear Information System (INIS)

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example

  16. Metrology Measurement Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Glen E. Gronniger

    2007-10-02

    This document contains descriptions of Federal Manufacturing & Technologies (FM&T) Metrology capabilities, traceability flow charts, and the measurement uncertainty of each measurement capability. Metrology provides NIST-traceable precision measurements or equipment calibration for a wide variety of parameters, ranges, and state-of-the-art uncertainties. Metrology laboratories conform to the requirements of the Department of Energy Development and Production Manual Chapter 13.2, ANSI/ISO/IEC 17025:2005, and ANSI/NCSL Z540-1. FM&T Metrology laboratories are accredited by NVLAP for the parameters, ranges, and uncertainties listed in the specific scope of accreditation under NVLAP Lab Code 200108-0. See the Internet at http://ts.nist.gov/Standards/scopes/2001080.pdf. These parameters are summarized. The Honeywell Federal Manufacturing & Technologies (FM&T) Metrology Department has developed measurement technology and calibration capability in four major fields of measurement: (1) Mechanical; (2) Environmental, Gas, Liquid; (3) Electrical (DC, AC, RF/Microwave); and (4) Optical and Radiation. Metrology Engineering provides the expertise to develop measurement capabilities for virtually any type of measurement in the fields listed above. A strong audit function has been developed to provide a means to evaluate the calibration programs of our suppliers and internal calibration organizations. Evaluation includes measurement audits and technical surveys.

  17. Installing and Setting Up Git Software Tool on Windows | High-Performance Computing | NREL

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7, and how to get things set up on NREL's

  18. Software tools for microprocessor based systems

    International Nuclear Information System (INIS)

    Halatsis, C.

    1981-01-01

    After a short review of the hardware and/or software tools for the development of single-chip, fixed-instruction-set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer level and the architecture level. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also consider extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  19. EQ-10 electrodeless Z-pinch EUV source for metrology applications

    Science.gov (United States)

    Gustafson, Deborah; Horne, Stephen F.; Partlow, Matthew J.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.

    2011-11-01

    With EUV Lithography systems shipping, the requirements for highly reliable EUV sources for mask inspection and resist outgassing are becoming better defined, and more urgent. The sources needed for metrology applications are very different from those needed for lithography; brightness (not power) is the key requirement. Suppliers of HVM EUV sources have all their resources working on high power and have not entered the smaller market for metrology. Energetiq Technology has been shipping the EQ-10 Electrodeless Z-pinch™ light source since 1995 [1]. The source is currently being used for metrology, mask inspection, and resist development [2-4]. These applications require especially stable performance in both output power and plasma size and position. Over the last 6 years Energetiq has made many source modifications, including better thermal management, to increase the brightness and power of the source. We have now introduced a new source that will meet the requirements of some of the first-generation mask metrology tools; this source will be reviewed.

  20. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  1. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and have had a profound impact on the software engineering process itse...

  2. A coherent environment of software improvement tools for CMS

    International Nuclear Information System (INIS)

    Eulisse, G.; Muzaffar, S.; Osborne, I.; Taylor, L.; Tuura, L.A.

    2004-01-01

    CMS has developed approximately one million lines of C++ code and uses many more from HEP, Grid and public domain projects. We describe a suite of tools which help to manage this complexity by measuring software dependencies, quality metrics, and CPU and memory performance. This coherent environment integrates and extends existing open-source tools where possible and provides new in-house components where a suitable solution does not already exist. This is a freely available environment with a graphical user interface which can be run on any software without the need to recompile or instrument it. We have developed ignominy, which performs software dependency analysis of source code, binary products and external software. CPU profiling is provided based on oprofile, with added features such as profile snapshots, distributed profiling and aggregate profiles for farm systems, including server-side tools for collecting profile data. Finally, we have developed a low-overhead performance and memory profiling tool, MemProf, which can perform (gprof-style) hierarchical performance profiling in a way that works with multiple threads and dynamically loaded libraries (unlike gprof). It also gathers exact memory allocation profiles, including which code allocates most, in what sizes of chunks, for how long, where the memory is getting freed and where it is getting leaked. We describe this tool suite and how it has been used to enhance the quality of CMS software.
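
    The question such a memory profiler answers, which code allocates most, can be posed with Python's standard library as a small analogue (illustration only; MemProf itself instruments C++ programs):

        # Take an allocation snapshot and list the top allocating source lines.
        import tracemalloc

        tracemalloc.start()
        data = [bytes(1000) for _ in range(10_000)]   # deliberate hot spot
        snapshot = tracemalloc.take_snapshot()
        for stat in snapshot.statistics("lineno")[:3]:
            print(stat)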

  3. Enabling CD SEM metrology for 5nm technology node and beyond

    Science.gov (United States)

    Lorusso, Gian Francesco; Ohashi, Takeyoshi; Yamaguchi, Atsuko; Inoue, Osamu; Sutani, Takumichi; Horiguchi, Naoto; Bömmels, Jürgen; Wilson, Christopher J.; Briggs, Basoene; Tan, Chi Lim; Raymaekers, Tom; Delhougne, Romain; Van den Bosch, Geert; Di Piazza, Luca; Kar, Gouri Sankar; Furnémont, Arnaud; Fantini, Andrea; Donadio, Gabriele Luca; Souriau, Laurent; Crotti, Davide; Yasin, Farrukh; Appeltans, Raf; Rao, Siddharth; De Simone, Danilo; Rincon Delgadillo, Paulina; Leray, Philippe; Charley, Anne-Laure; Zhou, Daisy; Veloso, Anabela; Collaert, Nadine; Hasumi, Kazuhisa; Koshihara, Shunsuke; Ikota, Masami; Okagawa, Yutaka; Ishimoto, Toru

    2017-03-01

    The CD SEM (Critical Dimension Scanning Electron Microscope) is one of the main tools used to estimate critical dimension (CD) in semiconductor manufacturing nowadays but, like all metrology tools, it will face considerable challenges to keep up with the requirements of the future technology nodes. The root causes of these challenges are not uniquely related to the shrinking CD values, as one might expect, but to the increase in complexity of the devices in terms of morphology and chemical composition as well. In fact, complicated three-dimensional device architectures, high-aspect-ratio features, and a wide variety of materials are some of the unavoidable characteristics of the future technology nodes. This means that, besides an improvement in resolution, it is critical to develop a CD SEM metrology capable of satisfying the specific needs of the devices of the nodes to come, needs that sometimes will have to be addressed through dramatic changes in approach with respect to traditional CD SEM metrology. In this paper, we report on the development of advanced CD SEM metrology at imec on a variety of device platforms and processes, for both logic and memories. We discuss newly developed approaches for standard, III-V, and germanium FinFETs (Fin Field-Effect Transistors), for lateral and vertical nanowires (NW), 3D NAND (three-dimensional NAND), STT-MRAM (Spin-Transfer Torque Magnetic Random-Access Memory), and ReRAM (Resistive Random-Access Memory). Applications for both front-end of line (FEOL) and back-end of line (BEOL) are developed. In terms of process, S/D epi (Source/Drain Epitaxy), SAQP (Self-Aligned Quadruple Patterning), DSA (Directed Self-Assembly), and EUVL (Extreme Ultraviolet Lithography) have been used. The work reported here has been performed on Hitachi CG5000, CG6300, and CV5000. In terms of logic, we discuss here the S/D epi defect classification, the metrology optimization for STI (Shallow Trench Isolation) Ge FinFETs, the defectivity of III-V STI Fin

  4. Overlay metrology for double patterning processes

    Science.gov (United States)

    Leray, Philippe; Cheng, Shaunee; Laidler, David; Kandel, Daniel; Adel, Mike; Dinu, Berta; Polli, Marco; Vasconi, Mauro; Salski, Bartlomiej

    2009-03-01

    The double patterning (DPT) process is foreseen by the industry to be the main solution for the 32 nm technology node and even beyond. Meanwhile, process compatibility has to be maintained and the performance of overlay metrology has to improve. To achieve this for Image Based Overlay (IBO), usually the optics of overlay tools are improved. It was also demonstrated that these requirements are achievable with a Diffraction Based Overlay (DBO) technique named SCOL™ [1]. In addition, we believe that overlay measurements with respect to a reference grid are required to achieve the required overlay control [2]. This induces at least a three-fold increase in the number of measurements (2 from the double-patterned layers to the reference grid and 1 between the double-patterned layers). The requirements of process compatibility, enhanced performance and a large number of measurements make the choice of overlay metrology for DPT very challenging. In this work we use different flavors of the standard overlay metrology technique (IBO) as well as the new technique (SCOL) to address these three requirements. The compatibility of the corresponding overlay targets with double patterning processes (Litho-Etch-Litho-Etch (LELE), Litho-Freeze-Litho-Etch (LFLE), spacer defined) is tested. The process impact on different target types is discussed (CD bias for LELE, contrast for LFLE). We compare the standard imaging overlay metrology with non-standard imaging techniques dedicated to double patterning processes (multilayer imaging targets allowing one overlay target instead of three, very small imaging targets). In addition to standard designs already discussed [1], we investigate SCOL target designs specific to double patterning processes. The feedback to the scanner is determined using the different techniques. The final overlay results obtained are compared accordingly. We conclude with the pros and cons of each technique and suggest the optimal metrology strategy for overlay control in double patterning processes.
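
    After measurement, overlay data are typically reduced to a small set of scanner correctables before being fed back; the sketch below fits the simplest linear model (translation, magnification and rotation terms) to simulated overlay-x data. This is a generic illustration, not the specific correction flow of the paper:

        # Least-squares fit of overlay-x (nm) vs. position (mm):
        #   ovl_x = Tx + Mx*x + Rx*y
        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.uniform(-100, 100, 50)    # measurement sites on the wafer (mm)
        y = rng.uniform(-100, 100, 50)
        ovl_x = 2.0 + 0.05 * x - 0.03 * y + rng.normal(0, 0.5, x.size)

        A = np.column_stack([np.ones_like(x), x, y])
        coef, *_ = np.linalg.lstsq(A, ovl_x, rcond=None)
        tx, mx, rx = coef
        residual = ovl_x - A @ coef
        print(f"Tx={tx:.2f} nm, Mx={mx:.4f} nm/mm, Rx={rx:.4f} nm/mm")
        print("residual 3-sigma:", 3 * residual.std(ddof=3))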

  5. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  6. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through the complex problems, presenting them on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on the graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  8. User Studies: Developing Learning Strategy Tool Software for Children.

    Science.gov (United States)

    Fitzgerald, Gail E.; Koury, Kevin A.; Peng, Hsinyi

    This paper is a report of user studies for developing learning strategy tool software for children. The prototype software demonstrated is designed for children with learning and behavioral disabilities. The tools consist of easy-to-use templates for creating organizational, memory, and learning approach guides for use in classrooms and at home.…

  9. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  10. Capability Handbook- offline metrology

    DEFF Research Database (Denmark)

    Islam, Aminul; Marhöfer, David Maximilian; Tosello, Guido

    This offline metrological capability handbook has been made in relation to HiMicro Task 3.3. The purpose of this document is to assess the metrological capability of the HiMicro partners and to gather the information on all available metrological instruments in one single document. It provides...

  11. Breakthrough In Current In Plane Metrology For Monitoring Large Scale MRAM Production

    DEFF Research Database (Denmark)

    Cagliani, Alberto; Østerberg, Frederik Westergaard; Hansen, Ole

    2017-01-01

    The current-in-plane tunneling technique (CIPT) has been a crucial tool in the development of magnetic tunnel junction stacks suitable for Magnetic Random Access Memories (MRAM) for more than a decade. MRAM development has now reached the maturity to make the transition from R&D to large... of the Resistance-Area product (RA) and the Tunnel Magnetoresistance (TMR) measurements, compared to state-of-the-art CIPT metrology tools dedicated to R&D. On two test wafers, the repeatability of RA and TMR was improved by up to 350% and the measurement reproducibility by up to 1700%. We believe that CIPT metrology now...

  12. Reducing measurement uncertainty drives the use of multiple technologies for supporting metrology

    Science.gov (United States)

    Banke, Bill, Jr.; Archie, Charles N.; Sendelbach, Matthew; Robert, Jim; Slinkman, James A.; Kaszuba, Phil; Kontra, Rick; DeVries, Mick; Solecky, Eric P.

    2004-05-01

    Perhaps never before in semiconductor microlithography has there been such an interest in the accuracy of measurement. This interest places new demands on our in-line metrology systems as well as on the supporting metrology for verification. This also puts a burden on the users and suppliers of new measurement tools, which both challenge and complement existing manufacturing metrology. The metrology community needs to respond to these challenges by using new methods to assess fab metrologies. An important part of this assessment process is the ability to obtain accepted reference measurements as a way of determining the accuracy and Total Measurement Uncertainty (TMU) of an in-line critical dimension (CD). In this paper, CD can mean any critical dimension, including, for example, such measures as feature height or sidewall angle. This paper describes the trade-offs of in-line metrology systems as well as the limitations of Reference Measurement Systems (RMS). Many factors influence each application, such as feature shape, material properties, proximity, sampling, and critical dimension. These factors, along with the metrology probe size, interaction volume, and probe type, such as e-beam, optical beam, and mechanical probe, are considered. As the size of features shrinks below 100 nm, some of the stalwarts of reference metrology come into question, such as the electrically determined transistor gate length. The concept of the RMS is expanded to show how multiple metrologies are needed to achieve the right balance of accuracy and sampling. This is also demonstrated for manufacturing metrology. Various comparisons of CD-SEM, scatterometry, AFM, cross-section SEM, electrically determined CDs, and TEM are shown. An example is given which demonstrates the importance of obtaining TMU by balancing accuracy and precision when selecting a manufacturing measurement strategy and optimizing manufacturing metrology. It is also demonstrated how the necessary supporting metrology will
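
    Numerically, TMU is commonly estimated by regressing the tool under test against the reference measurement system and removing the reference system's own uncertainty from the residual scatter. The sketch below uses ordinary least squares and invented numbers purely for illustration; the published TMU methodology is based on a Mandel regression, which additionally accounts for uncertainty in the reference values:

        # Simplified TMU estimate: 3*sqrt(residual_var - reference_var)
        import numpy as np

        rng = np.random.default_rng(11)
        true_cd = rng.uniform(40, 80, 40)                      # nm
        rms = true_cd + rng.normal(0, 0.3, 40)                 # reference data
        tool = 1.02 * true_cd + 1.5 + rng.normal(0, 0.8, 40)   # tool under test

        slope, offset = np.polyfit(rms, tool, 1)
        resid = tool - (slope * rms + offset)
        sigma_resid = resid.std(ddof=2)

        sigma_rms = 0.3   # assumed 1-sigma uncertainty of the reference system
        tmu = 3 * np.sqrt(max(sigma_resid**2 - sigma_rms**2, 0.0))
        print(f"slope={slope:.3f}, offset={offset:.2f} nm, TMU={tmu:.2f} nm")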

  13. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open...... workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  14. Vendor-based laser damage metrology equipment supporting the National Ignition Facility

    International Nuclear Information System (INIS)

    Campbell, J. H; Jennings, R. T.; Kimmons, J. F.; Kozlowski, M. R.; Mouser, R. P.; Schwatz, S.; Stolz, C. J.; Weinzapfel, C. L.

    1998-01-01

    A sizable laser damage metrology effort is required as part of optics production and installation for the 192 beam National Ignition Facility (NIF) laser. The large quantities, high damage thresholds, and large apertures of polished and coated optics necessitate vendor-based metrology equipment to assure component quality during production. This equipment must be optimized to provide the required information as rapidly as possible with limited operator experience. The damage metrology tools include: (1) platinum inclusion damage test systems for laser amplifier slabs, (2) laser conditioning stations for mirrors and polarizers, and (3) mapping and damage testing stations for UV transmissive optics. Each system includes a commercial Nd:YAG laser, a translation stage for the optics, and diagnostics to evaluate damage. The scanning parameters, optical layout, and diagnostics vary with the test fluences required and the damage morphologies expected. This paper describes the technical objectives and milestones involved in fulfilling these metrology requirements

  15. Metrological challenges introduced by new tolerancing standards

    International Nuclear Information System (INIS)

    Morse, Edward; Peng, Yue; Srinivasan, Vijay; Shakarji, Craig

    2014-01-01

    The recent release of ISO 14405-1 has provided designers with a richer set of specification tools for the size of part features, so that various functional requirements can be captured with greater fidelity. However, these tools also bring new challenges and pitfalls to an inspector using a coordinate metrology system. A sampling strategy that might have worked well in the past could lead to erroneous results that go undetected when used to evaluate these new specifications. In this paper we investigate how measurement strategies for sampled coordinate metrology systems influence different algorithms for the evaluation of these new specifications. Of particular interest are those specifications where the order statistics of feature cross-sections are required. Here the inspector must decide not only how many points are required for an individual cross-section, but also the number and spacing of cross-sections measured on the feature. The results of these decisions are compared with an analytic estimate of the ‘true value’ of the measurand specified using this new standard. (paper)
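
    The sampling question can be played out numerically. The sketch below simulates a hypothetical cylindrical feature with a small two-lobe form error, fits a least-squares circle to each sampled cross-section, and reports the minimum of the per-section diameters (one possible order statistic) for several sampling strategies. All names and values are illustrative, not from the paper.

```python
import numpy as np

def ls_circle_diameter(x, y):
    """Algebraic (Kasa) least-squares circle fit; returns the diameter."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    (cx, cy, c), *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    return 2.0 * np.sqrt(c + cx**2 + cy**2)

def min_section_diameter(d_true, n_sections, n_points, form_err, noise, rng):
    """Minimum of per-section LS-circle diameters (one order statistic)."""
    diams = []
    for k in range(n_sections):
        t = rng.uniform(0.0, 2 * np.pi, n_points)
        # hypothetical part: a 2-lobe form error whose phase varies axially
        r = d_true / 2 + form_err * np.cos(2 * t + k) + rng.normal(0, noise, n_points)
        diams.append(ls_circle_diameter(r * np.cos(t), r * np.sin(t)))
    return min(diams)

rng = np.random.default_rng(1)
for n_sec, n_pts in [(3, 8), (10, 30), (30, 100)]:
    d = min_section_diameter(10.0, n_sec, n_pts, 0.005, 0.001, rng)
    print(f"{n_sec:2d} sections x {n_pts:3d} points -> min diameter {d:.4f} mm")
```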

  16. A software communication tool for the tele-ICU.

    Science.gov (United States)

    Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high volume, high acuity population of patients. There is a high volume of incoming and outgoing calls, especially during the evening and night hours, through the tele-ICU hubs. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve time to intervention compared with the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.

  17. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Standard Organization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which can be used to present an actual picture of Software Process Improvement benefits to software development companies. The few tools available to assist in making predictions are too expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. Extending our previously reported research on Pakistani software development organizations, which quantified the benefits of SDPI (Software Development Process Improvement), this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to provide prediction capability for SDPI benefit-measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
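
    For reference, the core of the COCOMO-II Post-Architecture model that such a tool tunes is compact enough to sketch. The constants below are the published COCOMO II.2000 calibration values; the scale-factor and effort-multiplier ratings are made-up placeholders, and the paper's ISF tuning itself is not reproduced here.

```python
# COCOMO II.2000 Post-Architecture effort model (published calibration):
#   PM = A * Size^E * prod(EM),  with exponent E = B + 0.01 * sum(SF)
A, B = 2.94, 0.91

def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
    """Effort in person-months for a project of `ksloc` thousand lines."""
    E = B + 0.01 * sum(scale_factors)
    pm = A * ksloc**E
    for em in effort_multipliers:
        pm *= em
    return pm, E

# Hypothetical project: 50 KSLOC with made-up (roughly nominal) ratings
sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # the five scale factors
em = [1.00, 1.10, 0.90, 1.00]         # a subset of the 17 effort multipliers
pm, E = cocomo2_effort(50, sf, em)
print(f"exponent E = {E:.3f}, effort = {pm:.1f} person-months")
```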

  18. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through these complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and the design of power system equipment. Several design example calculations are carried out using software tools like MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in the project management in power systems ...

  19. Effect of measurement error budgets and hybrid metrology on qualification metrology sampling

    Science.gov (United States)

    Sendelbach, Matthew; Sarig, Niv; Wakamoto, Koichi; Kim, Hyang Kyun (Helen); Isbester, Paul; Asano, Masafumi; Matsuki, Kazuto; Osorio, Carmen; Archie, Chas

    2014-10-01

    Until now, metrologists had no statistics-based method to determine, before the start of an accuracy experiment, the sampling it needs. We show a solution to this problem called inverse total measurement uncertainty (TMU) analysis, by presenting statistically based equations that allow the user to estimate the needed sampling after providing appropriate inputs, allowing him to make important "risk versus reward" sampling, cost, and equipment decisions. Application examples using experimental data from scatterometry and critical dimension scanning electron microscope tools are used first to demonstrate how the inverse TMU analysis methodology can be used to make intelligent sampling decisions, and then to reveal why low sampling can lead to unstable and misleading results. One model is developed that can help experimenters minimize sampling costs. A second cost model reveals the inadequacy of some current sampling practices, and the enormous costs associated with sampling that provides reasonable levels of certainty in the result. We introduce strategies on how to manage and mitigate these costs and begin the discussion of how fabs are able to manufacture devices using minimal reference sampling when qualifying metrology steps. Finally, the relationship between inverse TMU analysis and hybrid metrology is explored.
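
    Although the paper's inverse-TMU equations are not reproduced here, the underlying intuition can be illustrated with a standard statistical result: the sampling needed to pin down a precision (standard deviation) estimate grows rapidly with the desired certainty. The sketch below uses the large-sample approximation SE(s)/σ ≈ 1/√(2(n−1)); it is a generic statistics illustration, not the published method.

```python
import math

Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}   # two-sided normal quantiles

def n_for_precision_estimate(rel_halfwidth, conf=0.95):
    """Approximate n so that an estimated standard deviation lies within
    +/- rel_halfwidth of the truth at confidence level `conf`.

    Based on the large-sample result SE(s)/sigma ~ 1/sqrt(2(n-1)); a
    generic statistics illustration, not the paper's inverse-TMU equations.
    """
    z = Z[conf]
    return math.ceil(1 + (z / rel_halfwidth) ** 2 / 2)

for p in (0.5, 0.2, 0.1, 0.05):
    print(f"+/-{p:4.0%} on the precision estimate -> n ~ {n_for_precision_estimate(p)}")
```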

  20. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  1. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems in nuclear installation control demands a high degree of operational safety for the installation and for protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) needs tools that permit checks throughout the entire life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and can easily be ported to other computers [fr]

  2. Metrology of electrical quantum

    International Nuclear Information System (INIS)

    Camon, A.

    1996-01-01

    Since 1989 the electrical metrology laboratory of TPYCEA and the low temperature physics department of ICMA have been collaborating in the development of electrical quantum metrology. ICMA has been mainly dedicated to implementing state-of-the-art quantum standards, for which its experience in cryogenics, superconductivity and low-noise instrumentation was essential. On the other hand, TPYCEA concentrated its efforts on the metrological aspects, in which it has great experience. The complementary knowledge of both laboratories, as well as the advice obtained from several prestigious metrology institutes, was the key to the successful completion of the two projects developed so far: i) the Josephson voltage standard (1989-1991); ii) the quantum Hall resistance standard (1991-1996). This report contains a description of both projects. Even though the two projects can be considered finished from the instrumental and metrological point of view, there is still strong cooperation between ICMA and TPYCEA on the improvement of these standards, as well as on their international validation

  3. Economic benefits of metrology in manufacturing

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo; Carmignato, S.

    2016-01-01

    In streamlined manufacturing systems, the added value of inspection activities is often questioned, and metrology in particular is sometimes considered only as an avoidable expense. Documented quantification of economic benefits of metrology is generally not available. This work presents concrete examples from industrial production, in which the added value of metrology in manufacturing is discussed and quantified. Case studies include: general manufacturing, forging, machining, and related metrology. The focus of the paper is on the improved effectiveness of metrology when used at product and process design stages, as well as on the improved accuracy and efficiency of manufacturing through better measuring equipment and process chains with integrated metrology for process control.

  4. A software tool for ecosystem services assessments

    Science.gov (United States)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

    The EU FP7 DESSIN project is developing methods and tools for assessment of ecosystem services (ESS) and associated economic values, with a focus on freshwater ESS in urban settings. Although the ESS approach has gained considerable visibility over the past ten years, operationalizing the approach remains a challenge. Therefore, DESSIN is also supporting development of a free software tool to support users implementing the DESSIN ESS evaluation framework. The DESSIN ESS evaluation framework is a structured approach to measuring changes in ecosystem services. The main purpose of the framework is to facilitate the application of the ESS approach in the appraisal of projects that have impacts on freshwater ecosystems and their services. The DESSIN framework helps users evaluate changes in ESS by linking biophysical, economic, and sustainability assessments sequentially. It was developed using the Common International Classification of Ecosystem Services (CICES) and the DPSIR (Drivers, Pressures, States, Impacts, Responses) adaptive management cycle. The former is a standardized system for the classification of ESS developed by the European Union to enhance the consistency and comparability of ESS assessments. The latter is a well-known concept to disentangle the biophysical and social aspects of a system under study. As part of its analytical component, the DESSIN framework also integrates elements of the Final Ecosystem Goods and Services-Classification System (FEGS-CS) of the US Environmental Protection Agency (USEPA). As implemented in the software tool, the DESSIN framework consists of five parts: • In part I of the evaluation, the ecosystem is defined and described and the local stakeholders are identified. In addition, administrative details and objectives of the assessment are defined. • In part II, drivers and pressures are identified. Once these first two elements of the DPSIR scheme have been characterized, the claimed/expected capabilities of a

  5. [The EFS metrology: From the production to the reason].

    Science.gov (United States)

    Reifenberg, J-M; Riout, E; Leroy, A; Begue, S

    2014-06-01

    In order to meet statutory requirements and to anticipate future needs and standards, the EFS has for several years been committed to a process of harmonizing its metrology function. In particular, the institution has opted to develop in-house competence by internalizing the metrological traceability of the measurements of its main critical quantities (temperature, volumetry). The development of metrology has thus resulted in a significant increase in calibration and testing activities. Methods are homogenized and improved through accreditations. Investment strategies are based on more and more demanding specifications. The performance of the equipment is better known and mastered. The technical expertise and maturity of the national metrology function are now assets for reviewing, in a more informed way, the appropriateness of the applied calibration intervals. The analysis of the numerous information and data in the calibration and testing reports could be pooled and exploited on behalf of the single establishment. The objective of this article is to illustrate these reflections with a few examples from the feedback of the EFS Pyrénées Méditerranée. The analysis of some qualification methods, the exploitation of the metrology history to quantify the risk of non-compliance and to adapt the control strategy, the analysis of the criticality of an instrument within a measurement process, and risk analyses are all tools that deserve to be more widely exploited so that this discipline gains in efficiency at the national level. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  6. The Remarkable Metrological History of Radiocarbon Dating [II].

    Science.gov (United States)

    Currie, Lloyd A

    2004-01-01

    This article traces the metrological history of radiocarbon, from the initial breakthrough devised by Libby, to minor (evolutionary) and major (revolutionary) advances that have brought (14)C measurement from a crude, bulk [8 g carbon] dating tool, to a refined probe for dating tiny amounts of precious artifacts, and for "molecular dating" at the 10 µg to 100 µg level. The metrological advances led to opportunities and surprises, such as the non-monotonic dendrochronological calibration curve and the "bomb effect," that gave rise to new multidisciplinary areas of application, ranging from archaeology and anthropology to cosmic ray physics to oceanography to apportionment of anthropogenic pollutants to the reconstruction of environmental history. Beyond the specific topic of natural (14)C, it is hoped that this account may serve as a metaphor for young scientists, illustrating that just when a scientific discipline may appear to be approaching maturity, unanticipated metrological advances in their own chosen fields, and unanticipated anthropogenic or natural chemical events in the environment, can spawn new areas of research having exciting theoretical and practical implications.

  7. Quantum metrology

    International Nuclear Information System (INIS)

    Xiang Guo-Yong; Guo Guang-Can

    2013-01-01

    The statistical error is ineluctable in any measurement. Quantum techniques, especially with the development of quantum information, can help us squeeze the statistical error and enhance the precision of measurement. In a quantum system, there are some quantum parameters, such as the quantum state, quantum operator, and quantum dimension, which have no classical counterparts. So quantum metrology deals with not only the traditional parameters, but also the quantum parameters. Quantum metrology includes two important parts: measuring the physical parameters with a precision beating the classical physics limit and measuring the quantum parameters precisely. In this review, we introduce how quantum resources (e.g., squeezed states and quantum entanglement) yield a higher precision, which research areas scientists are most interested in, and what the development status and perspectives of quantum metrology are. (topical review - quantum information)
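
    The precision gains referred to can be summarized by the textbook scaling relations for phase estimation with N probes (a general summary, not a result specific to this review):

```latex
% Phase-estimation uncertainty with N probes:
\Delta\phi_{\mathrm{SQL}} \propto \frac{1}{\sqrt{N}}
  \quad \text{(standard quantum limit: $N$ independent, classical probes)}
\Delta\phi_{\mathrm{HL}} \propto \frac{1}{N}
  \quad \text{(Heisenberg limit: $N$ entangled probes, e.g.\ NOON states)}
```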

  8. Improving design processes through structured reflection : a prototype software tool

    OpenAIRE

    Reymen, I.M.M.J.; Melby, E.

    2001-01-01

    A prototype software tool facilitating the use of a design method supporting structured reflection on design processes is presented. The prototype, called Echo, has been developed to explore the benefits of using a software system to facilitate the use of the design method. Both the prototype software tool and the design method are developed as part of the Ph.D. project of Isabelle Reymen. The goal of the design method is supporting designers with reflection on design processes in a systemati...

  9. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints, and identifies one or more sensor suites that maximize diagnostic performance while meeting the other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  10. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

    Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile...... techniques. However, the establishment of CT systems traceability when measuring 3D complex geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty...

  11. Metrological AFMs and its application for versatile nano-dimensional metrology tasks

    Science.gov (United States)

    Dai, Gaoliang; Dziomba, T.; Pohlenz, F.; Danzebrink, H.-U.; Koenders, L.

    2010-08-01

    Traceable calibrations of various micro and nano measurement devices are crucial tasks for ensuring reliable measurements for micro and nanotechnology. Today metrological AFMs are widely used for traceable calibrations of nano-dimensional standards. In this paper, we introduce the development of metrological atomic force microscopes at PTB. Of the three metrological AFMs described here, one is capable of measuring in a volume of 25 mm x 25 mm x 5 mm. All instruments feature interferometers, and the three-dimensional position measurements are thus directly traceable to the metre definition. Calibration examples on, for instance, flatness standards, step-height standards, and one- and two-dimensional gratings are demonstrated.
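
    The traceability claim rests on displacement interferometry: stage motion is counted in fringes of a laser whose wavelength is calibrated against the metre. As a reminder of the generic single-pass Michelson relation (a textbook expression, not a PTB-specific formula):

```latex
% Displacement d recovered from a fringe count N, with vacuum wavelength
% \lambda and air refractive index n (single-pass Michelson geometry):
d = \frac{N \, \lambda}{2 n}
% Each fringe corresponds to \lambda/2 of mirror travel, so d is traceable
% to the metre through the traceable calibration of \lambda (and n).
```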

  12. Automation of metrological operations on measuring apparatuses of radiation monitoring system

    International Nuclear Information System (INIS)

    Kulich, V.; Studeny, J.

    1995-01-01

    (J.K.) This paper describes the measuring apparatuses for ionizing radiation used in the radiation monitoring of NPP Dukovany operation. The increase in the number of metrological operations has been made possible only by a timely reconstruction of the laboratory and by computerization of the measuring procedure and of the administrative work, which consists mainly of recording a great deal of information about the monitored measuring apparatuses. There are three working places in the laboratory: 1) an irradiation gamma stand with cesium-137 sources; 2) an irradiation stand with plutonium-beryllium neutron sources; 3) a spectrometric working place. With regard to the uniqueness of the laboratory operation, all the work on hardware as well as software has been implemented in-house. The equipment of the laboratory makes it possible to metrologically test all the radiation monitoring apparatuses used in NPP Dukovany. The quality of operation of the ionizing radiation metrology laboratory determines the proper functioning of the radiation monitoring system, which directly influences the assurance of nuclear safety of NPP Dukovany

  13. Automation of metrological operations on measuring apparatuses of radiation monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    Kulich, V; Studeny, J [NPP Dukovany (Czech Republic)

    1996-12-31

    (J.K.) This paper describes the measuring apparatuses for ionizing radiation used in the radiation monitoring of NPP Dukovany operation. The increase in the number of metrological operations has been made possible only by a timely reconstruction of the laboratory and by computerization of the measuring procedure and of the administrative work, which consists mainly of recording a great deal of information about the monitored measuring apparatuses. There are three working places in the laboratory: 1) an irradiation gamma stand with cesium-137 sources; 2) an irradiation stand with plutonium-beryllium neutron sources; 3) a spectrometric working place. With regard to the uniqueness of the laboratory operation, all the work on hardware as well as software has been implemented in-house. The equipment of the laboratory makes it possible to metrologically test all the radiation monitoring apparatuses used in NPP Dukovany. The quality of operation of the ionizing radiation metrology laboratory determines the proper functioning of the radiation monitoring system, which directly influences the assurance of nuclear safety of NPP Dukovany.

  14. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety-critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time- and effort-consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many "general-purpose" software analysis tools, both static and dynamic, which help in analyzing the source code. However, they are not designed to assess adherence to the specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field, which are based on digital techniques and implemented in high-level languages, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects of the high-level-language software as possible. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  15. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  16. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    National Research Council Canada - National Science Library

    Puett, Joseph

    2003-01-01

    This dissertation presents a Holistic Framework for Software Engineering (HFSE) that establishes collaborative mechanisms by which existing heterogeneous software development tools and models will interoperate...

  17. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  18. Metrology and testing

    International Nuclear Information System (INIS)

    2010-01-01

    The chapter presents the Metrology Service of Ionizing Radiation (SEMRI), the Metrology Service of Radioisotopes (SEMRA), the External Individual Monitoring Service (SEMEX), the Internal Individual Monitoring Service (SEMIN) and the associated laboratories, the analysis of environmental samples, the quality management system of IRD, and the National Intercomparison Program for the results of radioisotope determination in environmental sample analysis

  19. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve those goals, the safety critical software should be verified and tested according to related codes and standards through verification and validation (V and V) activities. The safety critical software testing is performed at various stages during the development of the software, and is generally classified into three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module-level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance cost, it is important that software testing is carried out at module level. Module testing for nuclear safety critical software has rarely been performed with formal and proven testing tools because of its various constraints. The LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency

  20. A new approach to stitching optical metrology data

    Science.gov (United States)

    King, Christopher W.

    The next generation of optical instruments, including telescopes and imaging apparatus, will generate an increased requirement for larger and more complex optical forms. A major limiting factor for the production of such optical components is the metrology: how do we measure such parts, and with respect to what reference datum? This metrology can be thought of as part of a complete cycle in the production of optical components, and it is currently the most challenging aspect of production. This thesis investigates a new and complete approach to stitching optical metrology data to extend the effective aperture or, in future, the dynamic range of optical metrology instruments. A practical approach is used to build up a complete process for stitching on plano and spherical parts. The work forms a basis upon which a stitching system for aspheres, which is inherently more complicated, might be developed in the future. Beginning with a historical perspective and a review of optical polishing and metrology, the work presented relates the commercially available metrology instruments to the stitching process developed. The stitching is then performed by a numerical optimization routine that seeks to join together overlapping sub-aperture measurements by consideration of the aberrations introduced by the measurement scenario, and of the overlap areas between measurements. The stitching is part of a larger project, the PPARC Optical Manipulation and Metrology project, and was to benefit from new wavefront sensing technology developed by a project partner, to be used for the sub-aperture measurement. Difficult mathematical problems meant that such a wavefront sensor was not available for this work, and a work-around was therefore developed using commercial instruments. The techniques developed can be adapted to work on commercial machine platforms, in particular the OMAM NPL/UCL swing-arm profilometer described in chapter 5, or the computer controlled polishing machines
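
    The core stitching step described here, solving for the relative piston and tilt of overlapping sub-apertures by numerical optimization, can be sketched in one dimension. Everything below (surface, windows, noise levels) is synthetic; a real stitcher also handles 2-D maps, multiple sub-apertures, and instrument aberration terms.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic full-aperture surface on a common lateral grid (x in mm, z in nm)
x = np.linspace(-1.0, 1.0, 400)
truth = 120.0 * x**2 + 15.0 * np.sin(3.0 * x)

# Two sub-aperture measurements, each corrupted by its own piston + tilt
w1, w2 = slice(0, 260), slice(140, 400)            # overlap: indices 140..259
m1 = truth[w1] + (3.0 + 40.0 * x[w1]) + rng.normal(0, 0.2, 260)
m2 = truth[w2] + (-7.0 - 25.0 * x[w2]) + rng.normal(0, 0.2, 260)

# Fix sub-aperture 1 as the datum and solve, by least squares over the
# overlap, for the piston a and tilt b that map m2 into m1's frame:
#   m1 - m2 ~ a + b*x   (here d ~ 10 + 65*x by construction)
xo = x[140:260]
d = m1[140:260] - m2[0:120]
A = np.column_stack([np.ones_like(xo), xo])
(a, b), *_ = np.linalg.lstsq(A, d, rcond=None)
m2_aligned = m2 + (a + b * x[w2])                  # stitched second map

print(f"recovered relative piston {a:+.2f} nm, tilt {b:+.2f} nm/mm")
```

    Note that the absolute piston and tilt of the stitched map remain undetermined without an external datum, which is precisely the reference-datum question raised above.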

  1. Characteristics and possibilities of software tool for metal-oxide surge arresters selection

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2012-01-01

    Full Text Available This paper presents a procedure for the selection of metal-oxide surge arresters based on the instructions given in the Siemens and ABB catalogues, respecting their differences and the characteristics and possibilities of the software tool. The software tool was developed during the preparation of a Master's thesis titled, 'Automation of Metal-Oxide Surge Arresters Selection'. An example is presented of the selection of metal-oxide surge arresters using the developed software tool.

  2. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  3. Experience with case tools in the design of process-oriented software

    Science.gov (United States)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime that may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  4. Radiation protection - quality and metrology

    International Nuclear Information System (INIS)

    Broutin, J.P.

    2002-01-01

    Radiation protection brings together three occupations: radiation protection agents; environment agents (control and monitoring); and metrology agents (activity measurement and calibration). Quality and metrology contribute to technical competence and guarantee the quality of the service. This article, after a historical review of quality and metrology in France, explains the advantages of such a policy. (N.C.)

  5. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

    Full Text Available The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then focuses on their implementation in the Wondershare QuizCreator software.

  6. Optical vortex metrology: Are phase singularities foes or friends in optical metrology?

    DEFF Research Database (Denmark)

    Takeda, M.; Wang, W.; Hanson, Steen Grüner

    2008-01-01

    We raise an issue whether phase singularities are foes or friends in optical metrology, and give an answer by introducing the principle and applications of a new technique which we recently proposed for displacement and flow measurements. The technique is called optical vortex metrology because i...

  7. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
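
    The precision and accuracy metrics such a benchmark reports are conceptually simple. LFQbench itself is an R package; the snippet below is a hypothetical Python re-implementation of the two core quantities, replicate CV (precision) and deviation of the measured log-ratio from the known mixing ratio (accuracy), on simulated data.

```python
import numpy as np

def lfq_metrics(int_A, int_B, expected_ratio):
    """Replicate CV (precision) and log-ratio error (accuracy) per protein.

    int_A, int_B  : (n_proteins, n_replicates) label-free quantities
    expected_ratio: known A:B spike-in ratio for this species
    """
    A, B = np.asarray(int_A, float), np.asarray(int_B, float)
    cv = A.std(axis=1, ddof=1) / A.mean(axis=1)
    log_ratio = np.log2(A.mean(axis=1) / B.mean(axis=1))
    return cv, log_ratio - np.log2(expected_ratio)

# Simulated hybrid-proteome species spiked at a 2:1 ratio, 3 replicates each
rng = np.random.default_rng(3)
base = rng.lognormal(10.0, 1.0, size=(500, 1))
A = base * rng.lognormal(0.0, 0.1, size=(500, 3))
B = (base / 2.0) * rng.lognormal(0.0, 0.1, size=(500, 3))
cv, err = lfq_metrics(A, B, expected_ratio=2.0)
print(f"median CV {np.median(cv):.3f}, median log2-ratio error {np.median(err):+.3f}")
```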

  8. A Roadmap for Thermal Metrology

    Science.gov (United States)

    Bojkovski, J.; Fischer, J.; Machin, G.; Pavese, F.; Peruzzi, A.; Renaot, E.; Tegeler, E.

    2009-02-01

    A provisional roadmap for thermal metrology was developed in Spring 2006 as part of the EUROMET iMERA activity toward increasing impact from national investment in European metrology R&D. This consisted of two parts: one addressing the influence of thermal metrology on society, industry, and science, and the other specifying the requirements of enabling thermal metrology to serve future needs. The roadmap represents the shared vision of the EUROMET TC Therm committee as to how thermal metrology should develop to meet future requirements over the next 15 years. It is important to stress that these documents are a first attempt to roadmap the whole of thermal metrology and will certainly need regular review and revision to remain relevant and useful to the community they seek to serve. The first part of the roadmap, “Thermal metrology for society, industry, and science,” identifies the main social and economic triggers driving developments in thermal metrology—notably citizen safety and security, new production technologies, environment and global climate change, energy, and health. Stemming from these triggers, key targets are identified that require improved thermal measurements. The second part of the roadmap, “Enabling thermal metrology to serve future needs” identifies another set of triggers, like global trade and interoperability, future needs in transport, and the earth radiation budget. Stemming from these triggers, key targets are identified, such as improved realizations and dissemination of the SI unit the kelvin, anchoring the kelvin to the Boltzmann constant, kB, and calculating thermal properties from first principles. To facilitate these outcomes, the roadmap identifies the technical advances required in thermal measurement standards.

  9. A multi-professional software tool for radiation therapy treatment verification

    International Nuclear Information System (INIS)

    Fox, Tim; Brooks, Ken; Davis, Larry

    1996-01-01

    Purpose: Verification of patient setup is important in conformal therapy because it provides a means of quality assurance for treatment delivery. Electronic portal imaging systems have led to software tools for performing digital comparison and verification of patient setup. However, these software tools are typically designed from a radiation oncologist's perspective, even though treatment verification is a team effort involving oncologists, physicists, and therapists. A new software tool, the Treatment Verification Tool (TVT), has been developed as an interactive, multi-professional application for reviewing and verifying treatment plan setup using conventional personal computers. This study describes our approach to electronic treatment verification and demonstrates the features of TVT. Methods and Materials: TVT is an object-oriented software tool written in C++ for the PC-based Windows NT environment. The software allows selection of a patient's images from a database. It is also designed as a single-window interface to reduce the number of windows presented to the user. However, the user can select from four different views of the patient data. One of the views is a side-by-side comparison of portal images (on-line portal images or digitized port film) with a prescription image (digitized simulator film or digitally reconstructed radiograph), and another view is a textual summary of the grades of each portal image. The grades of a portal image are assigned by a radiation oncologist using an evaluation method, and the physicists and therapists may only review these results. All users of TVT can perform image enhancement processes, measure distances, and perform semi-automated registration methods. An electronic dialogue can be established through a set of annotations and notes among the radiation oncologists and the technical staff. Results: Features of TVT include: 1) side-by-side comparison of portal images and a prescription image; 2
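
    As an illustration of what a semi-automated registration step might do (the abstract does not specify TVT's algorithm), the sketch below estimates the translation between a prescription image and a portal image by exhaustive normalized cross-correlation; all data are synthetic.

```python
import numpy as np

def ncc_shift(reference, portal, max_shift=10):
    """Estimate the (dy, dx) translation that best aligns a portal image
    to a prescription image by exhaustive normalized cross-correlation."""
    ref = (reference - reference.mean()) / reference.std()
    best = (0, 0, -np.inf)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(portal, dy, axis=0), dx, axis=1)
            s = (shifted - shifted.mean()) / shifted.std()
            score = float((ref * s).mean())
            if score > best[2]:
                best = (dy, dx, score)
    return best

# Synthetic check: portal = reference shifted by (5, -3) plus noise,
# so the re-aligning shift should come out as (dy, dx) = (-5, +3).
rng = np.random.default_rng(5)
ref = rng.random((64, 64))
portal = np.roll(np.roll(ref, 5, axis=0), -3, axis=1) + rng.normal(0, 0.01, (64, 64))
print(ncc_shift(ref, portal))
```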

  10. Possibilities for using software tools in the process of security design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    Full Text Available The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for the implementation of software tools in design activities, based on selected design standards for the application design of electrical safety systems, especially in drawing documentation. The article should serve the needs of project team members in using selected software tools and subsequently increasing the degree of automation of design activities.

  11. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  12. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    ...operations as well as reported hybrid/intensified unit operations is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze and determine, within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... constraints while also matching the design targets; they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  13. Westinghouse waste simulation and optimization software tool

    International Nuclear Information System (INIS)

    Mennicken, Kim; Aign, Jorg

    2013-01-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence based decisions and enables users to try out lots of 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment tasks different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool will enable the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that this tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis of automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to an effective planning and minimizes the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner

  14. Westinghouse waste simulation and optimization software tool

    Energy Technology Data Exchange (ETDEWEB)

    Mennicken, Kim; Aign, Jorg [Westinghouse Electric Germany GmbH, Hamburg (Germany)

    2013-07-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence based decisions and enables users to try out lots of 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment tasks different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool will enable the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that this tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis of automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to an effective planning and minimizes the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner

  15. Three novel software tools for ASDEX Upgrade

    International Nuclear Information System (INIS)

    Martinov, S.; Löbhard, T.; Lunt, T.; Behler, K.; Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A.; Lüddecke, K.; Merkel, R.; Neu, G.; ASDEX Upgrade Team; MPCDF Garching

    2016-01-01

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third little tool allows the web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, they have, with time and growing completeness, shaped up as valuable helpers to visualize acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  16. Three novel software tools for ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Martinov, S. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Löbhard, T. [Conovum GmbH & Co. KG, Nymphenburger Straße 13, D-80335 München (Germany); Lunt, T. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Behler, K., E-mail: karl.behler@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Lüddecke, K. [Unlimited Computer Systems GmbH, Seeshaupterstr. 15, D-82393 Iffeldorf (Germany); Merkel, R.; Neu, G.; ASDEX Upgrade Team [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); MPCDF Garching [Max Planck Computing and Data Facility, Boltzmannstr. 2, D-85748 Garching (Germany)

    2016-11-15

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third little tool allows the web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, they have, with time and growing completeness, shaped up as valuable helpers to visualize acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  17. Use of Software Tools in Teaching Relational Database Design.

    Science.gov (United States)

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  18. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Science.gov (United States)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the operation of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a software fragment that calculates the values of measurands. Due to the number and nature of the variables affecting the coordinate measurement results and the complex character and multi-dimensionality of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
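
    A minimal sketch of the idea behind this comparison, contrasting simple Monte Carlo sampling with Latin Hypercube Sampling for propagating uncertainty through a hypothetical two-input measurand (the measurement model and the nominal values are invented for illustration, not taken from the paper):

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)

        def latin_hypercube(n, d, rng):
            # one point per equal-probability stratum in every dimension,
            # strata decoupled across dimensions by independent permutations
            u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
            for j in range(d):
                u[:, j] = rng.permutation(u[:, j])
            return u

        def measurand(x, y):
            # hypothetical measurement model combining two coordinates
            return np.hypot(x, y)

        n = 10_000
        # classic simple Monte Carlo sampling of two Gaussian inputs (mm)
        mc = measurand(rng.normal(10.0, 0.02, n), rng.normal(5.0, 0.02, n))
        # LHS: map stratified uniforms through the Gaussian inverse CDF
        u = latin_hypercube(n, 2, rng)
        lhs = measurand(norm.ppf(u[:, 0], 10.0, 0.02), norm.ppf(u[:, 1], 5.0, 0.02))
        print(f"simple MC: mean={mc.mean():.5f}  std={mc.std(ddof=1):.5f}")
        print(f"LHS      : mean={lhs.mean():.5f}  std={lhs.std(ddof=1):.5f}")

    Stratification typically reduces the sampling variance of the estimated statistics for a given sample size, which is the kind of improvement over simple sampling the paper investigates.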

  19. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    technological support for it that is not limited to one specific tool or a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software...... with process. Information gained from the review of the literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning software development applications and tools as services. Findings from the review of the literature on architecture solutions for cloud......Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet the challenges of rapidly growing demand for computing and storage resources. There has been significant success in building cloud-enabled applications for many disciplines, ranging from...

  20. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), a region of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
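
    The Hack (1973) method the tool builds on is the stream-gradient (SL) index; below is a simplified sketch of that index on a synthetic longitudinal profile. This illustrates the underlying idea only and is not the Knickpoint Finder code; the profile shape and anomaly threshold are invented:

        import numpy as np

        def sl_index(distance, elevation):
            # Hack (1973) stream-gradient index SL = (dH/dL) * L per segment,
            # with L the distance from the source to the segment midpoint;
            # anomalously high SL flags candidate knickpoints
            dH = -np.diff(elevation)                 # drop per segment (m)
            dL = np.diff(distance)
            L = (distance[:-1] + distance[1:]) / 2.0
            return (dH / dL) * L

        # hypothetical river profile as it might be sampled from a DEM
        dist = np.linspace(100.0, 10_000.0, 200)     # m downstream of source
        elev = 1_000.0 * np.exp(-dist / 4_000.0)     # idealized graded profile
        elev[120:] -= 15.0                           # imposed 15 m step (a knick)
        sl = sl_index(dist, elev)
        threshold = sl.mean() + 2.0 * sl.std()       # simple anomaly rule
        print("candidate knickpoint segments:", np.where(sl > threshold)[0])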

  1. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  2. Improving Metrological Reliability of Information-Measuring Systems Using Mathematical Modeling of Their Metrological Characteristics

    Science.gov (United States)

    Kurnosov, R. Yu; Chernyshova, T. I.; Chernyshov, V. N.

    2018-05-01

    The algorithms for improving the metrological reliability of analogue blocks of measuring channels and information-measuring systems are developed. The proposed algorithms ensure the optimum values of their metrological reliability indices for a given analogue circuit block solution.

  3. Software simulation: a tool for enhancing control system design

    International Nuclear Information System (INIS)

    Sze, B.; Ridgway, G.H.

    2008-01-01

    The creation, implementation and management of engineering design tools are important to the quality and efficiency of any large engineering project. Some of the most complicated tools to develop are system simulators. The development and implementation of system simulators to support replacement fuel handling control systems is of particular interest to the Canadian nuclear industry given the current age of installations and the risk of obsolescence to many utilities. The use of such simulator tools has been known to significantly improve successful deployment of new software packages and maintenance-related software changes while reducing the time required for their overall development. Moreover, these simulation systems can also serve as operator training stations and provide a virtual environment for site engineers to test operational changes before they are uploaded to the actual system. (author)

  4. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    Science.gov (United States)

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Abstract Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of the large, complex datasets that are characteristic of all domains of today’s life sciences, ranging from cancer biology to microbial ecology. This situation makes algorithms and software tools developed by our community more important than ever before. This means that we, developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  5. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Full Text Available Abstract Background Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and report of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates

  6. Software engineering techniques and CASE tools in RD13

    Science.gov (United States)

    Buono, S.; Gaponenko, I.; Jones, R.; Khodabandeh, A.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Polesello, G.; Aguer, M.; Huet, M.

    1994-12-01

    The RD13 project was approved in April 1991 for the development of a scalable data-taking system suitable for hosting various LHC studies. One of its goals is the exploitation of software engineering techniques, in order to indicate their overall suitability for data acquisition (DAQ), software design and implementation. This paper describes how such techniques have been applied to the development of components of the RD13 DAQ used in test-beam runs at CERN. We describe our experience with the Artifex CASE tool and its associated methodology. The issues raised when code generated by a CASE tool has to be integrated into an existing environment are also discussed.

  7. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended...... rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  8. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

    The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance the integration of evidence-based design knowledge in architectural design phases, with a focus on lighting and interior design, and 2) assess...... fulfilment of an evidence-based design criterion regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes that exemplify...... the operations and functions of the design method. To evaluate the prototypes' potential, surveys with architectural and healthcare design companies are conducted. Evaluation is done by administering questionnaires as part of the development of the tools. The results show that architects, designers...

  9. Software for evaluation of EPR-dosimetry performance

    International Nuclear Information System (INIS)

    Shishkina, E.A.; Timofeev, Yu.S.; Ivanov, D.V.

    2014-01-01

    Electron paramagnetic resonance (EPR) with tooth enamel is a method extensively used for retrospective external dosimetry. Different research groups apply different equipment, sample preparation procedures and spectrum processing algorithms for EPR dosimetry. A uniform algorithm for description and comparison of performances was designed and implemented in a new computer code. The aim of the paper is to introduce the new software 'EPR-dosimetry performance'. The computer code is a user-friendly tool for providing a full description of method-specific capabilities of EPR tooth dosimetry, from metrological characteristics to practical limitations in applications. The software designed for scientists and engineers has several applications, including support of method calibration by evaluation of calibration parameters, evaluation of critical value and detection limit for registration of radiation-induced signal amplitude, estimation of critical value and detection limit for dose evaluation, estimation of minimal detectable value for anthropogenic dose assessment and description of method uncertainty. (authors)
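
    The critical value and detection limit referred to above can be illustrated with Currie's classic formulation for the Gaussian, constant-variance case. This is a sketch only; the actual software may implement a more elaborate scheme, and the blank amplitudes below are invented:

        import numpy as np

        def currie_limits(sigma0, k_alpha=1.645, k_beta=1.645):
            # L_C: critical value above which a radiation-induced signal
            #      is decided to be present (false-positive rate alpha)
            # L_D: smallest true signal detected with power 1 - beta
            L_C = k_alpha * sigma0
            L_D = L_C + k_beta * sigma0   # = 3.29 * sigma0 for alpha = beta = 0.05
            return L_C, L_D

        # hypothetical spread of the EPR signal amplitude measured on blanks
        blank_amplitudes = np.array([0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.2, 0.9])
        L_C, L_D = currie_limits(blank_amplitudes.std(ddof=1))
        print(f"critical value: {L_C:.3f}, detection limit: {L_D:.3f} (signal units)")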

  10. Innovative Software Algorithms and Tools parallel sessions summary

    International Nuclear Information System (INIS)

    Gaines, Irwin

    2001-01-01

    A variety of results were presented in the poster session and the 5 parallel sessions of Innovative Software, Algorithms and Tools (ISAT). I will briefly summarize these presentations and attempt to identify some unifying trends

  11. A Century of Acoustic Metrology

    DEFF Research Database (Denmark)

    Rasmussen, Knud

    1998-01-01

    The development in acoustic measurement technique over the last century is reviewed with special emphasis on the metrological aspect.

  12. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  13. Improving design processes through structured reflection : a prototype software tool

    NARCIS (Netherlands)

    Reymen, I.M.M.J.; Melby, E.

    2001-01-01

    A prototype software tool facilitating the use of a design method supporting structured reflection on design processes is presented. The prototype, called Echo, has been developed to explore the benefits of using a software system to facilitate the use of the design method. Both the prototype

  14. Manufacturing and metrology for IR conformal windows and domes

    Science.gov (United States)

    Ferralli, Ian; Blalock, Todd; Brunelle, Matt; Lynch, Timothy; Myer, Brian; Medicus, Kate

    2017-05-01

    Freeform and conformal optics have the potential to dramatically improve optical systems by enabling systems with fewer optical components, reduced aberrations, and improved aerodynamic performance. These optical components differ from standard components in their surface shape (typically a non-symmetric, equation-based definition) and in their material properties. Traditional grinding and polishing tools are unable to handle these freeform shapes. Additionally, standard metrology tools cannot measure these surfaces. Desired substrates are typically hard ceramics, including poly-crystalline alumina or aluminum oxynitride. Beyond the challenges that this hardness poses for manufacturing, these crystalline materials can be highly susceptible to grain decoration, creating unacceptable scatter in optical systems. In this presentation, we will show progress towards addressing the unique challenges of manufacturing conformal windows and domes. Particular attention is given to our robotic polishing platform. This platform is based on an industrial robot adapted to accept a wide range of tooling and parts. The robot's flexibility has given us an opportunity to address the unique challenges of conformal windows. Slurries and polishing active layers can easily be changed to adapt to varying materials and address grain decoration. We have the flexibility to change tool size and shape to address the varying sizes and shapes of conformal optics. In addition, the robotic platform can serve as a base for a deflectometry-based metrology tool to measure surface form error. This system, whose precision is independent of the robot's positioning accuracy, will allow us to measure optics in situ, saving time and reducing part risk. In conclusion, we will show examples of conformal windows manufactured using our developed processes.

  15. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Directory of Open Access Journals (Sweden)

    Nieciąg Halina

    2015-10-01

    Full Text Available Software is used to accomplish various tasks at each stage of the operation of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a software fragment that calculates the values of measurands. Due to the number and nature of the variables affecting the coordinate measurement results and the complex character and multi-dimensionality of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.

  16. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    Science.gov (United States)

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    Quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that conducts AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built with the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as quantitative measurement of motion by visual tracking.

  17. A Web-based modeling tool for the SEMAT Essence theory of software engineering

    Directory of Open Access Journals (Sweden)

    Daniel Graziotin

    2013-09-01

    Full Text Available As opposed to more mature subjects, software engineering lacks general theories that establish its foundations as a discipline. The Essence Theory of software engineering (Essence) has been proposed by the Software Engineering Methods and Theory (SEMAT) initiative. The goal of Essence is to develop a theoretically sound basis for software engineering practice and its wide adoption. However, Essence is far from reaching academic- and industry-wide adoption. The reasons for this include a struggle to foresee its utilization potential and a lack of tools for implementation. SEMAT Accelerator (SematAcc) is a Web-positioning tool for a software engineering endeavor, which implements SEMAT's Essence kernel. SematAcc permits the use of Essence, thus helping to understand it. The tool enables the teaching, adoption, and research of Essence in controlled experiments and case studies.

  18. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    Science.gov (United States)

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.

  19. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia, and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  20. SAQP pitch walk metrology using single target metrology

    Science.gov (United States)

    Fang, Fang; Herrera, Pedro; Kagalwala, Taher; Camp, Janay; Vaid, Alok; Pandev, Stilian; Zach, Franz

    2017-03-01

    Self-aligned quadruple patterning (SAQP) processes have found widespread acceptance in advanced technology nodes to drive device scaling beyond the resolution limitations of immersion scanners. Of the four spaces generated in this process from one lithography pattern, two tend to be equivalent, as they are derived from the first spacer deposition. The three independent spaces are commonly labelled α, β and γ; they are controlled by multiple process steps, including the initial lithographic patterning process, the two mandrel and spacer etches, and the two spacer depositions. Scatterometry has been the preferred metrology approach; however, it is restricted to repetitive arrays. In these arrays, independent measurements, in particular of α and γ, are not possible due to the degeneracy of the standard array targets. In this work we present a single-target approach which lifts the degeneracies commonly encountered while using product-relevant layout geometries. We will first describe the metrology approach, which combines the previously described SRM (signal response metrology) with reference data derived from CD SEM data. The performance of the methodology is shown in figures 1-3, in which the optically determined values for α, β and γ are compared to the CD SEM reference data. The variations are achieved using controlled process experiments varying mandrel CD and spacer deposition thicknesses.

  1. Quantifying Human Response: Linking metrological and psychometric characterisations of Man as a Measurement Instrument

    International Nuclear Information System (INIS)

    Pendrill, L R; Fisher, William P Jr

    2013-01-01

    A better understanding of how to characterise human response is essential to improved person-centred care and other situations where human factors are crucial. Challenges to introducing classical metrological concepts such as measurement uncertainty and traceability when characterising Man as a Measurement Instrument include the failure of many statistical tools when applied to ordinal measurement scales and a lack of metrological references in, for instance, healthcare. The present work attempts to link the metrological and psychometric (Rasch) characterisation of Man as a Measurement Instrument in a study of elementary tasks, such as counting dots, where the expected value is known independently because the measurement object (a collection of dots) is prepared in advance. The analysis is compared and contrasted with recent approaches to this problem by others, for instance those using signal error fidelity.
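
    A minimal sketch of the dichotomous Rasch model used in such psychometric characterisation, estimating a person's ability in logits from right/wrong outcomes on items of known difficulty (the dot-counting difficulties and responses are invented for illustration):

        import numpy as np

        def rasch_p(theta, b):
            # P(success) under the dichotomous Rasch model,
            # theta = person ability, b = item difficulty (both in logits)
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        def estimate_ability(responses, difficulties, iters=20):
            # maximum-likelihood ability for one person via Newton-Raphson
            # (diverges for all-correct or all-wrong response patterns)
            theta = 0.0
            for _ in range(iters):
                p = rasch_p(theta, difficulties)
                grad = np.sum(responses - p)       # dlogL/dtheta
                hess = -np.sum(p * (1.0 - p))      # d2logL/dtheta2
                theta -= grad / hess
            return theta

        difficulties = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # dot-counting items
        responses = np.array([1, 1, 1, 0, 0])                 # observed outcomes
        print(f"estimated ability: {estimate_ability(responses, difficulties):+.2f} logits")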

  2. A multi-center study benchmarks software tools for label-free proteome quantification

    Science.gov (United States)

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
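
    In the spirit of the metrics the paper reports (precision as the coefficient of variation across technical replicates, accuracy as the deviation of the measured ratio from the known spike-in ratio), here is a toy sketch on synthetic intensities; this is not the LFQbench R code, and all numbers are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        expected_log2_ratio = 1.0   # species spiked at 2:1 between samples A and B

        # hypothetical protein intensities: 500 proteins x 3 technical replicates
        A = rng.lognormal(10.0, 0.5, (500, 3))
        B = A / 2.0 ** expected_log2_ratio * rng.lognormal(0.0, 0.1, (500, 3))

        # precision: mean coefficient of variation across replicates
        cv = (A.std(axis=1, ddof=1) / A.mean(axis=1)).mean()
        # accuracy: bias of the measured log2 ratio against the expected value
        log2_ratio = np.log2(A.mean(axis=1) / B.mean(axis=1))
        bias = log2_ratio.mean() - expected_log2_ratio
        print(f"mean CV: {cv:.3f}, median log2 ratio: {np.median(log2_ratio):.3f}, "
              f"bias: {bias:+.3f}")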

  3. QInfer: Statistical inference software for quantum applications

    Directory of Open Access Journals (Sweden)

    Christopher Granade

    2017-04-01

    Full Text Available Characterizing quantum systems through experimental data is critical to applications as diverse as metrology and quantum computing. Analyzing this experimental data in a robust and reproducible manner is made challenging, however, by the lack of readily-available software for performing principled statistical analysis. We improve the robustness and reproducibility of characterization by introducing an open-source library, QInfer, to address this need. Our library makes it easy to analyze data from tomography, randomized benchmarking, and Hamiltonian learning experiments either in post-processing, or online as data is acquired. QInfer also provides functionality for predicting the performance of proposed experimental protocols from simulated runs. By delivering easy-to-use characterization tools based on principled statistical analysis, QInfer helps address many outstanding challenges facing quantum technology.
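
    QInfer's estimation routines are built around sequential Monte Carlo (particle filter) Bayesian updating; the following generic numpy sketch of that idea, estimating a precession frequency from simulated Ramsey-type outcomes, deliberately avoids the QInfer API and uses an invented true frequency:

        import numpy as np

        rng = np.random.default_rng(1)

        # particle approximation of a uniform prior over the unknown frequency
        particles = rng.uniform(0.0, 1.0, 5_000)
        weights = np.full(particles.size, 1.0 / particles.size)

        def likelihood(outcome, omega, t):
            # Born-rule model: P(1 | omega, t) = sin^2(omega * t / 2)
            p1 = np.sin(omega * t / 2.0) ** 2
            return p1 if outcome == 1 else 1.0 - p1

        true_omega = 0.7
        for t in np.linspace(0.5, 20.0, 40):               # evolution times
            outcome = int(rng.random() < np.sin(true_omega * t / 2.0) ** 2)
            weights *= likelihood(outcome, particles, t)   # Bayes update
            weights /= weights.sum()

        est = np.sum(weights * particles)
        err = np.sqrt(np.sum(weights * (particles - est) ** 2))
        print(f"posterior: omega = {est:.3f} +/- {err:.3f} (true value {true_omega})")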

  4. Software Tools for Measuring and Calculating Electromagnetic Shielding Effectiveness

    National Research Council Canada - National Science Library

    Tesny, Neal

    2005-01-01

    The evaluation and the analysis of high-altitude electromagnetic pulse response of shielded enclosures require the availability of software tools able to acquire data and calculate shielding effectiveness...

  5. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  6. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  7. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
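
    Of the estimation algorithms listed, the Basic COCOMO equations are compact enough to sketch with their standard published coefficients (the 32 KLOC project size is hypothetical; COSTMODL itself layers user customization on top of such models):

        # Basic COCOMO (Boehm, 1981): effort and schedule from size in KLOC.
        # Per-mode coefficients: effort = a * KLOC**b (person-months),
        # schedule = c * effort**d (calendar months).
        MODES = {
            "organic":       (2.4, 1.05, 2.5, 0.38),
            "semi-detached": (3.0, 1.12, 2.5, 0.35),
            "embedded":      (3.6, 1.20, 2.5, 0.32),
        }

        def basic_cocomo(kloc, mode="organic"):
            a, b, c, d = MODES[mode]
            effort = a * kloc ** b
            schedule = c * effort ** d
            return effort, schedule

        for mode in MODES:
            effort, months = basic_cocomo(32.0, mode)
            print(f"{mode:13s}: {effort:6.1f} person-months over {months:4.1f} months")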

  8. Software tools for the particle accelerator designs

    International Nuclear Information System (INIS)

    Sugimoto, Masayoshi

    1988-01-01

    The software tools used for the design of particle accelerators are being implemented on small computer systems, such as personal computers and workstations. They are called from an interactive environment, like a windowed application program. The environment contains a small expert system that makes it easy to select the design parameters. (author)

  9. Computed tomography for dimensional metrology

    DEFF Research Database (Denmark)

    Kruth, J.P.; Bartscher, M.; Carmignato, S.

    2011-01-01

    metrology, putting emphasis on issues such as accuracy, traceability to the unit of length (the meter) and measurement uncertainty. It provides a state of the art (anno 2011) and application examples, showing the aptitude of CT metrology to: (i) check internal dimensions that cannot be measured using traditional...

  10. Benchmarking State-of-the-Art Deep Learning Software Tools

    OpenAIRE

    Shi, Shaohuai; Wang, Qiang; Xu, Pengfei; Chu, Xiaowen

    2016-01-01

    Deep learning has been shown as a successful machine learning method for a variety of tasks, and its popularity results in numerous open-source deep learning software tools. Training a deep network is usually a very time-consuming process. To address the computational challenge in deep learning, many tools exploit hardware features such as multi-core CPUs and many-core GPUs to shorten the training time. However, different tools exhibit different features and running performance when training ...

  11. Exoskeletons, Robots and System Software: Tools for the Warfighter

    Science.gov (United States)

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am–12:00 pm. “The views...Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. Warfighters will...global hub for educating, informing, and connecting Information Age leaders.” What is an exoskeleton? An exoskeleton is a wearable robot suit that

  12. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields

    Science.gov (United States)

    Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne

    2015-01-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789

  13. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields.

    Science.gov (United States)

    Sapozhnikov, Oleg A; Tsysar, Sergey A; Khokhlova, Vera A; Kreider, Wayne

    2015-09-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors.
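
    The Rayleigh-integral formulation at the heart of this holography method lends itself to a compact numerical sketch. The forward projection below (normal velocity on a baffled planar source to field pressure) uses one common sign convention, exp(+ikR), and invented 1 MHz piston parameters; the holographic reconstruction described in the paper is the corresponding inverse step:

        import numpy as np

        def rayleigh_pressure(field_pts, src_pts, v_n, dS, k, rho=1000.0, c=1500.0):
            # discretized Rayleigh integral for a baffled planar source:
            # p(r) = (i*omega*rho / 2*pi) * sum_j v_n[j] * exp(i*k*R)/R * dS
            omega = k * c
            R = np.linalg.norm(field_pts[:, None, :] - src_pts[None, :, :], axis=2)
            return (1j * omega * rho / (2.0 * np.pi)) * \
                   (v_n * np.exp(1j * k * R) / R * dS).sum(axis=1)

        f, c = 1.0e6, 1500.0                      # 1 MHz source in water
        k = 2.0 * np.pi * f / c
        xs = np.linspace(-5e-3, 5e-3, 41)         # source-plane grid (m)
        X, Y = np.meshgrid(xs, xs)
        src = np.column_stack([X.ravel(), Y.ravel(), np.zeros(X.size)])
        v_n = np.where(X.ravel()**2 + Y.ravel()**2 <= (4e-3)**2, 1.0, 0.0)  # piston
        dS = (xs[1] - xs[0]) ** 2
        field = np.array([[0.0, 0.0, z] for z in (10e-3, 30e-3, 60e-3)])
        print(np.abs(rayleigh_pressure(field, src, v_n, dS, k)))  # |p| on axis (Pa)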

  14. A software tool for analyzing multichannel cochlear implant signals.

    Science.gov (United States)

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  15. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
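
    GBIF exposes the occurrence records SDMdata harvests through a public REST API; below is a minimal sketch of fetching records and range-checking their coordinates. The crude filter here only hints at the kind of cleaning SDMdata performs:

        import requests

        def fetch_occurrences(name, n=100):
            # GBIF occurrence search, georeferenced records only
            url = "https://api.gbif.org/v1/occurrence/search"
            params = {"scientificName": name, "hasCoordinate": "true", "limit": n}
            results = requests.get(url, params=params, timeout=30).json()["results"]
            # keep only records whose coordinates fall in the valid ranges
            return [
                (r["decimalLatitude"], r["decimalLongitude"])
                for r in results
                if -90.0 <= r.get("decimalLatitude", 999) <= 90.0
                and -180.0 <= r.get("decimalLongitude", 999) <= 180.0
            ]

        print(fetch_occurrences("Panthera onca", n=20))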

  16. On the evaluation of photogrammetric methods for dense 3D surface reconstruction in a metrological context

    Science.gov (United States)

    Toschi, I.; Capra, A.; De Luca, L.; Beraldin, J.-A.; Cournoyer, L.

    2014-05-01

    This paper discusses a methodology to evaluate the accuracy of recently developed image-based 3D modelling techniques. So far, the emergence of these novel methods has not been supported by the definition of an internationally recognized standard, which is fundamental for user confidence and market growth. In order to provide an element of reflection and solution to the different communities involved in 3D imaging, a promising approach is presented in this paper for the assessment of both the metric quality and the limitations of an open-source suite of tools (Apero/MicMac) developed for the extraction of dense 3D point clouds from a set of unordered 2D images. The proposed procedural workflow is performed within a metrological context, through inter-comparisons with "reference" data acquired with two hemispherical laser scanners, one total station, and one laser tracker. The methodology is applied to two case studies designed to analyse the software's performance under both outdoor and environmentally controlled conditions, i.e. the main entrance of Cathédrale de la Major (Marseille, France) and a custom-made scene located at the National Research Council of Canada 3D Imaging Metrology Laboratory (Ottawa). Comparative data and accuracy evidence produced for both tests allow the study of some key factors affecting 3D model accuracy.

  17. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

    We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptance of current tools, we aim to help developers to select what to

  18. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is...

  19. Applications of surface metrology in firearm identification

    International Nuclear Information System (INIS)

    Zheng, X; Soons, J; Vorburger, T V; Song, J; Renegar, T; Thompson, R

    2014-01-01

    Surface metrology is commonly used to characterize functional engineering surfaces. The technologies developed offer opportunities to improve forensic toolmark identification. Toolmarks are created when a hard surface, the tool, comes into contact with a softer surface and causes plastic deformation. Toolmarks are commonly found on fired bullets and cartridge cases. Trained firearms examiners use these toolmarks to link an evidence bullet or cartridge case to a specific firearm, which can lead to a criminal conviction. Currently, identification is typically based on qualitative visual comparison by a trained examiner using a comparison microscope. In 2009, a report by the National Academies called this method into question. Amongst other issues, they questioned the objectivity of visual toolmark identification by firearms examiners. The National Academies recommended the development of objective toolmark identification criteria and confidence limits. The National Institute of Standards and Technology (NIST) has applied its experience in surface metrology to develop objective identification criteria, measurement methods, and reference artefacts for toolmark identification. NIST developed the Standard Reference Material SRM 2460 standard bullet and SRM 2461 standard cartridge case to facilitate quality control and traceability of identifications performed in crime laboratories. Objectivity is improved through measurement of surface topography and application of unambiguous surface similarity metrics, such as the maximum value (ACCFmax) of the areal cross-correlation function. Case studies were performed on consecutively manufactured tools, such as gun barrels and breech faces, to demonstrate that, even in this worst-case scenario, all the tested tools imparted unique surface topographies that were identifiable. These studies provide scientific support for toolmark evidence admissibility in criminal court cases. (paper)
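
    The ACCFmax metric can be sketched as the maximum of a zero-mean, normalized cross-correlation over relative shifts between two topography maps. The toy version below assumes full (circular) overlap and synthetic data; production toolmark comparisons also handle partial overlap, form removal, and rotation search:

        import numpy as np

        def accf_max(topo_a, topo_b):
            # circular, zero-mean, normalized areal cross-correlation via FFT;
            # the maximum over all shifts approximates ACCFmax
            a = topo_a - topo_a.mean()
            b = topo_b - topo_b.mean()
            cc = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
            return cc.max() / np.sqrt((a ** 2).sum() * (b ** 2).sum())

        rng = np.random.default_rng(7)
        mark = rng.normal(size=(256, 256))                      # reference topography
        same_tool = np.roll(mark, (5, -9), axis=(0, 1)) \
                    + 0.2 * rng.normal(size=mark.shape)         # shifted + noise
        other_tool = rng.normal(size=mark.shape)                # unrelated surface
        print(f"same tool : ACCFmax = {accf_max(mark, same_tool):.2f}")
        print(f"other tool: ACCFmax = {accf_max(mark, other_tool):.2f}")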

  20. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation reports the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: ingredients of a Guidance, Navigation, and Control (GN&C) system; a selected GN&C testbed example; health management of major ingredients; ISWHM testbed architecture; and conclusions and next steps.

  1. Accuracy Test of Software Architecture Compliance Checking Tools – Test Instruction

    NARCIS (Netherlands)

    Pruijt, Leo; van der Werf, J.M.E.M.; Brinkkemper, Sjaak

    2015-01-01

    Software Architecture Compliance Checking (SACC) is an approach to verify conformance of implemented program code to high-level models of architectural design. Static SACC focuses on the modular software architecture and on the existence of rule violating dependencies between modules. Accurate tool

  2. Choosing your weapons : on sentiment analysis tools for software engineering research

    NARCIS (Netherlands)

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen an increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by the software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these

  3. Metrological large range scanning probe microscope

    International Nuclear Information System (INIS)

    Dai Gaoliang; Pohlenz, Frank; Danzebrink, Hans-Ulrich; Xu Min; Hasche, Klaus; Wilkening, Guenter

    2004-01-01

    We describe a metrological large range scanning probe microscope (LR-SPM) with an Abbe error free design and direct interferometric position measurement capability, aimed at versatile traceable topographic measurements that require nanometer accuracy. A dual-stage positioning system was designed to achieve both a large measurement range and a high measurement speed. This dual-stage system consists of a commercially available stage, referred to as nanomeasuring machine (NMM), with a motion range of 25 mmx25 mmx5 mm along x, y, and z axes, and a compact z-axis piezoelectric positioning stage (compact z stage) with an extension range of 2 μm. The metrological LR-SPM described here senses the surface using a stationary fixed scanning force microscope (SFM) head working in contact mode. During operation, lateral scanning of the sample is performed solely by the NMM. Whereas the z motion, controlled by the SFM signal, is carried out by a combination of the NMM and the compact z stage. In this case the compact z stage, with its high mechanical resonance frequency (greater than 20 kHz), is responsible for the rapid motion while the NMM simultaneously makes slower movements over a larger motion range. To reduce the Abbe offset to a minimum the SFM tip is located at the intersection of three interferometer measurement beams orientated in x, y, and z directions. To improve real time performance two high-end digital signal processing (DSP) systems are used for NMM positioning and SFM servocontrol. Comprehensive DSP firmware and Windows XP-based software are implemented, providing a flexible and user-friendly interface. The instrument is able to perform large area imaging or profile scanning directly without stitching small scanned images. Several measurements on different samples such as flatness standards, nanostep height standards, roughness standards as well as sharp nanoedge samples and 1D gratings demonstrate the outstanding metrological capabilities of the instrument
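
    The dual-stage z positioning described here rests on a frequency-split idea: the large-range stage follows the slow part of the servo command while the fast, short-range piezo stage tracks the residual. A generic first-order sketch of that split follows (illustrative only, not the instrument's actual controller; the signal and filter constant are invented):

        import numpy as np

        def split_command(z_cmd, alpha=0.02, fast_range=2.0e-6):
            # low-pass (exponential moving average) goes to the slow stage;
            # the residual, clipped to the 2 um piezo range, goes to the fast stage
            slow = np.zeros_like(z_cmd)
            for i in range(1, z_cmd.size):
                slow[i] = slow[i - 1] + alpha * (z_cmd[i] - slow[i - 1])
            fast = np.clip(z_cmd - slow, -fast_range / 2.0, fast_range / 2.0)
            return slow, fast

        t = np.linspace(0.0, 1.0, 5_000)                      # 5 kHz servo ticks
        z = 5e-6 * t + 0.3e-6 * np.sin(2 * np.pi * 400 * t)   # drift + fast texture
        slow, fast = split_command(z)
        print(f"peak residual tracked by the piezo stage: {np.abs(fast).max()*1e6:.2f} um")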

  4. Virtual overlay metrology for fault detection supported with integrated metrology and machine learning

    Science.gov (United States)

    Lee, Hong-Goo; Schmitt-Weaver, Emil; Kim, Min-Suk; Han, Sang-Jun; Kim, Myoung-Soo; Kwon, Won-Taik; Park, Sung-Ki; Ryan, Kevin; Theeuwes, Thomas; Sun, Kyu-Tae; Lim, Young-Wan; Slotboom, Daan; Kubis, Michael; Staecker, Jens

    2015-03-01

    While semiconductor manufacturing moves toward the 7nm node for logic and the 15nm node for memory, an increased emphasis has been placed on reducing the influence known contributors have on the on-product overlay budget. With a machine learning technique known as function approximation, we use a neural network to gain insight into how known contributors, such as those collected with scanner metrology, influence the on-product overlay budget. The result is a sufficiently trained function that can approximate overlay for all wafers exposed with the lithography system. As a real-world application, inline metrology can be used to measure overlay for a few wafers while using the trained function to approximate overlay vector maps for the entire lot of wafers. With the approximated overlay vector maps for all wafers coming off the track, a process engineer can redirect wafers or lots with overlay signatures outside the standard population to offline metrology for excursion validation. With this added flexibility, engineers will be given more opportunities to catch wafers that need to be reworked, resulting in improved yield. The quality of the corrections derived from measured overlay metrology feedback can be improved by using the approximated overlay to trigger which wafers should or shouldn't be measured inline. For a development or integration engineer, the approximated overlay can be used to gain insight into lots and wafers used for design of experiments (DOE) troubleshooting. In this paper we present the results of a case study that follows the machine learning function approximation approach to data analysis, with production overlay measured on an inline metrology system at SK hynix.
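
    Function approximation in this sense is supervised regression from scanner metrology channels to measured overlay; a generic scikit-learn sketch on synthetic data follows (the features, network size, and data generation are invented, not the production setup described above):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        # hypothetical known contributors per exposure (stage residuals,
        # leveling, alignment quality, ...) and the overlay they induce
        X = rng.normal(size=(2_000, 12))
        w = rng.normal(size=12)
        y = X @ w + 0.3 * np.tanh(X[:, 0] * X[:, 1]) + 0.1 * rng.normal(size=2_000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2_000, random_state=0)
        net.fit(X_tr, y_tr)   # train on wafers measured with inline metrology
        print(f"R^2 on held-out 'unmeasured' wafers: {net.score(X_te, y_te):.3f}")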

  5. Development to requirements for a procedures software tool

    International Nuclear Information System (INIS)

    Yasutake, J.Y.; Hachiro Isoda

    1993-01-01

    In 1989, the Electric Power Research Institute (EPRI) and the Central Research Institute of the Electric Power Industry (CRIEPI) in Japan initiated a joint research program to investigate various interventions to reduce personnel errors and inefficiencies in the maintenance of nuclear power plants. This program, consisting of several interrelated projects, was initiated because of the mutual recognition of the importance of the human element in the efficient and safe operation of utilities and the continuing need to enhance personnel performance to sustain plant safety and availability. This paper summarizes one of the projects, jointly funded by EPRI and CRIEPI, to analyze the requirements for, and prepare a functional description of, a procedures software tool (PST). The primary objective of this project was to develop a description of the features and functions of a software tool that would help procedure writers to improve the quality of maintenance and testing procedures, thereby enhancing the performance of both procedure writers and maintenance personnel

  6. An evaluation of software tools for the design and development of cockpit displays

    Science.gov (United States)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  7. NIF Target Assembly Metrology Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Alger, E. T. [General Atomics, San Diego, CA (United States); Kroll, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dzenitis, E. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Montesanti, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hughes, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Swisher, M. [IAP, Livermore, CA (United States); Taylor, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Segraves, K. [IAP, Livermore, CA (United States); Lord, D. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Castro, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Edwards, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-01-01

    During our inertial confinement fusion (ICF) experiments at the National Ignition Facility (NIF) we require cryogenic targets at the 1-cm scale to be fabricated, assembled, and metrologized to micron-level tolerances. During assembly of these ICF targets, physical dimensions must be verified; metrology is completed using optical coordinate measurement machines that provide repeatable measurements with micron precision, while also allowing in-process data collection for absolute accuracy in assembly. To date, 51 targets have been assembled and metrologized, and 34 targets have been successfully fielded on NIF relying on these metrology data. In the near future, ignition experiments on NIF will require tighter tolerances and more demanding target assembly and metrology capability. Metrology methods, calculations, and uncertainty estimates will be discussed. Target diagnostic port alignment, target position, and capsule location results will be reviewed for the 2009 Energetics Campaign. The information is presented via control charts showing the effect of process improvements that were made during target production. Certain parameters, including capsule position, met the 2009 campaign specifications but will have much tighter requirements in the future. Finally, in order to meet these new requirements assembly process changes and metrology capability upgrades will be necessary.

  8. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  9. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  10. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  11. Metrology's role in quality assurance

    International Nuclear Information System (INIS)

    Zeederberg, L.B.

    1982-01-01

    Metrology, the science of measurement, is playing an increasing role in modern industry as part of an on-going quality assurance programme. At Escom, quality assurance was critical during the construction of the Koeberg nuclear facility, and also a function in controlling services provided by Escom. This article deals with the role metrology plays in quality assurance

  12. Preface: The 5th International Workshop on X-ray Mirror Design, Fabrication, and Metrology

    Energy Technology Data Exchange (ETDEWEB)

    Assoufid, Lahsen [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, Illinois 60439 (United States); Goldberg, Kenneth; Yashchuk, Valeriy V. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, California 94720 (United States)

    2016-05-15

    Recent developments in synchrotron storage rings and free-electron laser-based x-ray sources with ever-increasing brightness and coherent flux have pushed x-ray optics requirements to new frontiers. This Special Topic gathers a set of articles derived from a subset of the key presentations of the 5th International Workshop on X-ray Mirror Design, Fabrication, and Metrology (IWXM-2015), held at Lawrence Berkeley National Laboratory, Berkeley, California, USA, July 14–16, 2015. The workshop objective was to report on recent progress in the fabrication of x-ray synchrotron radiation mirrors as well as on new developments in related metrology tools and methods.

  13. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enabling experimental measurements after compiling to configurable systems, all within the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the user's high-level block description to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog–digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers access to the basic analog operations/computations that are possible.

  14. Future metrology needs for FEL reflective optics

    International Nuclear Information System (INIS)

    Assoufid, L.

    2000-01-01

    An International Workshop on Metrology for X-ray and Neutron Optics was held March 16-17, 2000, at the Advanced Photon Source, Argonne National Laboratory, near Chicago, Illinois (USA). The workshop gathered engineers and scientists from both the U.S. and around the world to evaluate metrology instrumentation and methods used to characterize surface figure and finish for long grazing-incidence optics used in beamlines at synchrotron radiation sources. This two-day workshop was motivated by the rapid evolution in the performance of x-ray and neutron sources along with requirements in optics figure and finish. More specifically, the performance of future light sources, such as free-electron laser (FEL)-based x-ray sources, is being pushed to new limits in terms of both brilliance and coherence. As a consequence, tolerances on surface figure and finish of the next generation of optics are expected to become tighter. The timing of the workshop provided an excellent opportunity to study the problem, evaluate the state of the art in metrology instrumentation, and stimulate innovation on future metrology instruments and techniques to be used to characterize these optics. This paper focuses on FEL optics and metrology needs. (A more comprehensive summary of the workshop can be found elsewhere.) The performance and limitations of current metrology instrumentation will be discussed and recommendations from the workshop on future metrology development to meet the FEL challenges will be detailed

  15. Future metrology needs for FEL reflective optics.

    Energy Technology Data Exchange (ETDEWEB)

    Assoufid, L.

    2000-09-21

    An International Workshop on Metrology for X-ray and Neutron Optics was held March 16-17, 2000, at the Advanced Photon Source, Argonne National Laboratory, near Chicago, Illinois (USA). The workshop gathered engineers and scientists from both the U.S. and around the world to evaluate metrology instrumentation and methods used to characterize surface figure and finish for long grazing-incidence optics used in beamlines at synchrotron radiation sources. This two-day workshop was motivated by the rapid evolution in the performance of x-ray and neutron sources along with requirements in optics figure and finish. More specifically, the performance of future light sources, such as free-electron laser (FEL)-based x-ray sources, is being pushed to new limits in terms of both brilliance and coherence. As a consequence, tolerances on surface figure and finish of the next generation of optics are expected to become tighter. The timing of the workshop provided an excellent opportunity to study the problem, evaluate the state of the art in metrology instrumentation, and stimulate innovation on future metrology instruments and techniques to be used to characterize these optics. This paper focuses on FEL optics and metrology needs. (A more comprehensive summary of the workshop can be found elsewhere.) The performance and limitations of current metrology instrumentation will be discussed and recommendations from the workshop on future metrology development to meet the FEL challenges will be detailed.

  16. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with the distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support … -based solutions. The restricted ability of the organizations to have the desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have … a cloud-computing paradigm for addressing the above-mentioned issues by providing a framework to select appropriate tools as well as associated services, and a reference architecture of the cloud-enabled middleware platform that allows on-demand provisioning of software engineering Tools as a Service (TaaS), with focus …

  17. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  18. Laboratorio de Metrología - LABM

    OpenAIRE

    Jaramillo Ch., Zaira J.

    2011-01-01

    …esses and transactions in a transparent and fair manner for all parties involved. A necessary tool for this purpose is metrology, a science that is used in the Laboratorio de Metrología (LABM) of the Centro Experimenta…

  19. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for the runtime model generation; (2) support for the runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with the third party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as a software as a service is demonstrated.

  20. Automated hotspot analysis with aerial image CD metrology for advanced logic devices

    Science.gov (United States)

    Buttgereit, Ute; Trautzsch, Thomas; Kim, Min-ho; Seo, Jung-Uk; Yoon, Young-Keun; Han, Hak-Seung; Chung, Dong Hoon; Jeon, Chan-Uk; Meyers, Gary

    2014-09-01

    Continuously shrinking designs driven by further extension of 193nm technology lead to a much higher probability of hotspots, especially in the manufacturing of advanced logic devices. The CD of these potential hotspots needs to be precisely controlled and measured on the mask. On top of that, the feature complexity increases due to the high OPC load in the logic mask design, which is an additional challenge for CD metrology. Therefore the hotspot measurements have been performed on WLCD from ZEISS, which provides the benefit of reduced complexity by measuring the CD in the aerial image and qualifying the printing-relevant CD. This is especially advantageous for complex 2D feature measurements. Additionally, the data preparation for CD measurement becomes more critical due to the larger number of CD measurements and the increasing feature diversity. For the data preparation this means identifying these hotspots and marking them automatically with the correct marker required to make the feature-specific CD measurement successful. Currently available methods can address generic patterns but cannot deal with the pattern diversity of the hotspots. The paper explores a method to overcome those limitations and to enhance the time-to-result of the marking process dramatically. For the marking process the Synopsys WLCD Output Module was utilized, which is an interface between the CATS mask data prep software and the WLCD metrology tool. It translates the CATS marking directly into an executable WLCD measurement job including CD analysis. The paper describes the utilized method and flow for the hotspot measurement. Additionally, the achieved results on hotspot measurements utilizing this method are presented.

  1. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan Amrit; van Hillegersberg, Jos; Sikkel, Nicolaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools and task boards in particular. We investigate the deployment of these tools through a field research in 4 different companies that feature agile and

  2. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system's … The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool makes analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.
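
    Structural analysis of the kind SaTool automates can be sketched with a bipartite graph of constraints and unknown variables, where a maximum matching separates the just-determined part from the redundant constraints that enable fault diagnosis. The following is a minimal illustration using networkx, not SaTool code; the toy system is invented.

```python
import networkx as nx

# Toy structure graph: constraints c1..c4 relate unknown variables x1..x3.
# Constraints left unmatched by a maximum matching form the redundant part
# that provides fault detectability.
constraints = {
    "c1": ["x1", "x2"],
    "c2": ["x2", "x3"],
    "c3": ["x1", "x3"],
    "c4": ["x3"],          # e.g. a sensor equation measuring x3
}

G = nx.Graph()
for c, xs in constraints.items():
    for x in xs:
        G.add_edge(c, x)

matching = nx.bipartite.hopcroft_karp_matching(G, top_nodes=set(constraints))
matched = {c for c in constraints if c in matching}
redundant = set(constraints) - matched
print("redundant (fault-detecting) constraints:", redundant)
```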

  3. CLAIRE, an event-driven simulation tool for testing software

    International Nuclear Information System (INIS)

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc'h, J.

    1994-06-01

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.)

  4. Problems of metrological supply of carbon materials production

    International Nuclear Information System (INIS)

    Belov, G.V.; Bazilevskij, L.P.; Cherkashina, N.V.

    1989-01-01

    Carbon materials and products contain internal residual stresses and have an anisotropy of properties; therefore, special test methods are required to control their quality. The main metrological problems during the development, production and application of carbon products are: metrological supply of production forms and records during the development of production conditions; metrological supply of quality control of the product; metrological supply of methods for the tests of products and of methods to forecast the characteristics of product quality over the guaranteed service life

  5. Validation of virtual instrument for data analysis in metrology of time and frequency

    International Nuclear Information System (INIS)

    Jordao, Bruno; Quaresma, Daniel; Rocha, Pedro; Carvalho, Ricardo; Peixoto, Jose Guilherme

    2016-01-01

    Commercial software (CS) for the collection, analysis and plotting of time and frequency data is increasingly used in reference laboratories worldwide, and has greatly improved the uncertainty calculations for these values. We propose a set of software for data collection and analysis using Virtual Instruments (VI), developed at the Primary Time and Frequency Laboratory of the National Observatory (ON), and the validation of this instrument. To validate the developed instrument, a comparative analysis was made between the results obtained with the VI and those obtained with CS widely used in many metrology laboratories. From these results we can conclude that the analyzed data were equivalent. (author)
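
    The record does not specify which statistics the VI computes; in time-and-frequency metrology the overlapping Allan deviation is the standard stability measure, so a sketch of it illustrates the kind of computation such a VI/CS comparison would exercise. All data below are synthetic.

```python
import numpy as np

def overlapping_adev(phase, tau0, m):
    """Overlapping Allan deviation from phase data.

    phase : time-error samples x_i [s], equally spaced
    tau0  : sampling interval [s]
    m     : averaging factor; the result corresponds to tau = m * tau0
    """
    x = np.asarray(phase, dtype=float)
    n = x.size
    if n < 2 * m + 1:
        raise ValueError("not enough samples for this averaging factor")
    d2 = x[2 * m:] - 2.0 * x[m:n - m] + x[:n - 2 * m]   # second differences
    avar = np.sum(d2 ** 2) / (2.0 * (m * tau0) ** 2 * (n - 2 * m))
    return np.sqrt(avar)

# Example: random-walk phase = white frequency noise, adev ~ tau^(-1/2)
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(scale=1e-11, size=100_000))    # synthetic phase data
for m in (1, 10, 100):
    print(m, overlapping_adev(x, tau0=1.0, m=m))
```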

  6. Metrology in the Bolivia-Brazil Pipeline; Medição no gasoduto Bolivia-Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Palhares, Julio C.C.M.; Nunes, Ildemar Pinto [TBG - Transportadora Brasileira Gasoduto Bolivia Brasil S.A., Rio de Janeiro, RJ (Brazil)

    2003-07-01

    The measurement guideline of TBG seeks always to meet the customer's needs and to stay aligned with the changes in the natural gas market. In five years of existence, TBG has followed the forming regulatory legislation and the establishment of the fiscal contract, important marks of the evolution of the market. This work presents the definitions that guided the metrological issues of TBG, making use of efficient tools in the answers to each demand and seeking to satisfy its own needs, its customers' needs and all the new regulatory demands. This paper addresses the calibration procedures, the qualification of suppliers, the maintenance of metrological reliability, the daily confirmation of the delivered volumes, the treatment of failures, and the monitoring of unaccounted-for gas within the rigorous limits practiced by world-class companies in foreign countries. (author)

  7. Metrology for WEST components design and integration optimization

    International Nuclear Information System (INIS)

    Brun, C.; Archambeau, G.; Blanc, L.; Bucalossi, J.; Chantant, M.; Gargiulo, L.; Hermenier, A.; Le, R.; Pilia, A.

    2015-01-01

    Highlights: • Metrology methods. • Interest of metrology campaigns to optimize margins by reducing uncertainties. • Assembly problems are solved and validated on a numerical mock-up. • Post-treatment of a full 3D scan of the vacuum vessel. - Abstract: On WEST, new components will be implemented in an existing environment, so emphasis has to be put on metrology to optimize the design and the assembly. At a particular stage of the project, several components have to coexist in the limited vessel. The difficulty therefore consists in validating the mechanical interfaces between existing and new components, minimizing the risk of the assembly and maximizing the plasma volume. The CEA/IRFM took the opportunity of this ambitious project to sign a partnership with an industrial partner specialized in multipurpose metrology domains. To optimize the assembly procedure, the IRFM assembly group works in strong collaboration with this partner to define and plan the metrology campaigns. The paper illustrates the organization, methods and results of the dedicated metrology campaigns that have been defined and carried out in the WEST dis/assembly phase. To conclude, the future metrology needs at CEA/IRFM are outlined to define the next steps.

  8. Software for imaging phase-shift interference microscope

    Science.gov (United States)

    Malinovski, I.; França, R. S.; Couceiro, I. B.

    2018-03-01

    In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). By its principle of operation, the instrument is an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colour as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different methodology of phase evaluation is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
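
    The abstract notes that the instrument departs from the standard PSI phase-evaluation method. For reference, the standard four-step algorithm it departs from can be sketched as follows; this is a generic textbook illustration, not INMETRO's software, and the wavelength is an assumed He-Ne value.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Standard four-step PSI phase evaluation.

    i1..i4 : intensity frames at phase shifts 0, 90, 180, 270 degrees
    returns the wrapped phase map in radians, in (-pi, pi]
    """
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringes for a tilted flat surface
x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
phi = 12.0 * x + 4.0 * y                       # "true" phase [rad]
frames = [1.0 + 0.8 * np.cos(phi + k * np.pi / 2) for k in range(4)]
wrapped = four_step_phase(*frames)

# Surface height follows from the phase and the laser wavelength:
lam = 632.8e-9                                 # assumed He-Ne wavelength [m]
height = wrapped * lam / (4.0 * np.pi)         # reflection doubles the path
print(height.min(), height.max())
```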

  9. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
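
    Architecture-based reliability prediction with a discrete-time Markov chain can be illustrated with a Cheung-style model: components are states, transitions follow the usage profile, and each visit succeeds with the component's reliability. This is a generic sketch of that classic technique, not the paper's COSMIC-FFP extension, and all numbers are invented.

```python
import numpy as np

P = np.array([            # control transfer probabilities between components
    [0.0, 0.7, 0.3],      # from component 0
    [0.0, 0.0, 1.0],      # from component 1
    [0.0, 0.0, 0.0],      # component 2 terminates the run
])
R = np.array([0.999, 0.995, 0.990])   # per-visit component reliabilities

# Scale each row by the source component's reliability; the chain either
# moves on (success so far) or is absorbed in an implicit failure state.
Q = (P.T * R).T

# S = (I - Q)^-1 accumulates the success probability over all paths; the
# run succeeds if it reaches component 2 and leaves it successfully.
n = len(R)
S = np.linalg.inv(np.eye(n) - Q)
system_reliability = S[0, 2] * R[2]
print(f"estimated system reliability: {system_reliability:.4f}")
```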

  10. Temperature metrology

    Science.gov (United States)

    Fischer, J.; Fellmuth, B.

    2005-05-01

    The majority of the processes used by the manufacturing industry depend upon the accurate measurement and control of temperature. Thermal metrology is also a key factor affecting the efficiency and environmental impact of many high-energy industrial processes, the development of innovative products and the health and safety of the general population. Applications range from the processing, storage and shipment of perishable foodstuffs and biological materials to the development of more efficient and less environmentally polluting combustion processes for steel-making. Accurate measurement and control of temperature is, for instance, also important in areas such as the characterization of new materials used in the automotive, aerospace and semiconductor industries. This paper reviews the current status of temperature metrology. It starts with the determination of thermodynamic temperatures required on principle because temperature is an intensive quantity. Methods to determine thermodynamic temperatures are reviewed in detail to introduce the underlying physical basis. As these methods cannot usually be applied for practical measurements the need for a practical temperature scale for day-to-day work is motivated. The International Temperature Scale of 1990 and the Provisional Low Temperature Scale PLTS-2000 are described as important parts of the International System of Units to support science and technology. Its main importance becomes obvious in connection with industrial development and international markets. Every country is strongly interested in unique measures, in order to guarantee quality, reproducibility and functionability of products. The eventual realization of an international system, however, is only possible within the well-functioning organization of metrological laboratories. In developed countries the government established scientific institutes have certain metrological duties, as, for instance, the maintenance and dissemination of national

  11. Temperature metrology

    International Nuclear Information System (INIS)

    Fischer, J; Fellmuth, B

    2005-01-01

    The majority of the processes used by the manufacturing industry depend upon the accurate measurement and control of temperature. Thermal metrology is also a key factor affecting the efficiency and environmental impact of many high-energy industrial processes, the development of innovative products and the health and safety of the general population. Applications range from the processing, storage and shipment of perishable foodstuffs and biological materials to the development of more efficient and less environmentally polluting combustion processes for steel-making. Accurate measurement and control of temperature is, for instance, also important in areas such as the characterization of new materials used in the automotive, aerospace and semiconductor industries. This paper reviews the current status of temperature metrology. It starts with the determination of thermodynamic temperatures required on principle because temperature is an intensive quantity. Methods to determine thermodynamic temperatures are reviewed in detail to introduce the underlying physical basis. As these methods cannot usually be applied for practical measurements the need for a practical temperature scale for day-to-day work is motivated. The International Temperature Scale of 1990 and the Provisional Low Temperature Scale PLTS-2000 are described as important parts of the International System of Units to support science and technology. Its main importance becomes obvious in connection with industrial development and international markets. Every country is strongly interested in unique measures, in order to guarantee quality, reproducibility and functionability of products. The eventual realization of an international system, however, is only possible within the well-functioning organization of metrological laboratories. In developed countries the government established scientific institutes have certain metrological duties, as, for instance, the maintenance and dissemination of national

  12. An alternative method to achieve metrological confirmation in measurement process

    Science.gov (United States)

    Villeta, M.; Rubio, E. M.; Sanz, A.; Sevilla, L.

    2012-04-01

    The metrological confirmation process must be designed and implemented to ensure that the metrological characteristics of the measurement system meet the metrological requirements of the measurement process. The aim of this paper is to present an alternative to the traditional metrological requirement on the relationship between tolerance and measurement uncertainty for developing such confirmation processes. The proposed approach to metrological confirmation considers a given inspection task of the measurement process within the manufacturing system, and is based on the Index of Contamination of the Capability (ICC). The metrological confirmation process is then developed taking into account producer risks and economic considerations on this index. As a consequence, depending on the capability of the manufacturing process, the measurement system will or will not be in an adequate state of metrological confirmation for the measurement process.
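
    The traditional requirement this paper sets out to replace is usually expressed as a minimum tolerance-to-uncertainty ratio, commonly 4:1. A minimal sketch of that conventional check, with the threshold treated as an application-dependent assumption:

```python
def traditional_confirmation(tolerance: float, expanded_uncertainty: float,
                             min_ratio: float = 4.0) -> bool:
    """Classic tolerance-to-uncertainty check used in metrological
    confirmation: the measurement system is accepted when the tolerance
    is at least `min_ratio` times the expanded uncertainty (4:1 is a
    widely used default; the exact threshold is application dependent).
    """
    return tolerance / expanded_uncertainty >= min_ratio

# Example: a 0.05 mm tolerance measured with U = 0.01 mm gives a 5:1 ratio.
print(traditional_confirmation(tolerance=0.05, expanded_uncertainty=0.01))
```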

  13. Dimensional micro and nano metrology

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; da Costa Carneiro, Kim; Haitjema, Han

    2006-01-01

    The need for dimensional micro and nano metrology is evident, and as critical dimensions are scaled down and geometrical complexity of objects is increased, the available technologies appear not sufficient. Major research and development efforts have to be undertaken in order to answer these challenges. The developments have to include new measuring principles and instrumentation, tolerancing rules and procedures as well as traceability and calibration. The current paper describes issues and challenges in dimensional micro and nano metrology by reviewing typical measurement tasks and available…

  14. Critical issues in overlay metrology

    International Nuclear Information System (INIS)

    Sullivan, Neal T.

    2001-01-01

    In this paper, following an overview of overlay metrology, the difficult relationship of overlay with device performance and yield is discussed and supported with several examples. This is followed by a discussion of the impending collision of metrology equipment performance and 'real' process tolerances for sub-0.18 μm technologies. This convergence of tolerance and performance is demonstrated to lead to the current emergence of real-time overlay modeling in a feed-forward/feedback process environment and the associated metrology/sampling implications. This modeling takes advantage of the wealth of understanding concerning the systematic behavior of overlay registration errors. Finally, the impact of new process technologies (RET, OAI, CPSM, CMP, etc.) on the measurement target is discussed and shown to de-stabilize overlay performance on standard overlay measurement target designs

  15. Radioactivity metrology

    International Nuclear Information System (INIS)

    Legrand, J.

    1979-01-01

    Some aspects of the radioactivity metrology are reviewed. Radioactivity primary references; absolute methods of radioactivity measurements used in the Laboratoire de Metrologie des Rayonnements Ionisants; relative measurement methods; traceability through international comparisons and interlaboratory tests; production and distribution of secondary standards [fr

  16. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges that are not restricted to inherent properties such as the quality and resolution of open data sets. Open data is often insufficiently catalogued or fragmented. Software tools that support effective discovery, including the assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end users' requirements in the development process of the aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  17. FOREWORD: Materials metrology Materials metrology

    Science.gov (United States)

    Bennett, Seton; Valdés, Joaquin

    2010-04-01

    It seems that so much of modern life is defined by the materials we use. From aircraft to architecture, from cars to communications, from microelectronics to medicine, the development of new materials and the innovative application of existing ones have underpinned the technological advances that have transformed the way we live, work and play. Recognizing the need for a sound technical basis for drafting codes of practice and specifications for advanced materials, the governments of countries of the Economic Summit (G7) and the European Commission signed a Memorandum of Understanding in 1982 to establish the Versailles Project on Advanced Materials and Standards (VAMAS). This project supports international trade by enabling scientific collaboration as a precursor to the drafting of standards. The VAMAS participants recognized the importance of agreeing a reliable, universally accepted basis for the traceability of the measurements on which standards depend for their preparation and implementation. Seeing the need to involve the wider metrology community, VAMAS approached the Comité International des Poids et Mesures (CIPM). Following discussions with NMI Directors and a workshop at the BIPM in February 2005, the CIPM decided to establish an ad hoc Working Group on the metrology applicable to the measurement of material properties. The Working Group presented its conclusions to the CIPM in October 2007 and published its final report in 2008, leading to the signature of a Memorandum of Understanding between VAMAS and the BIPM. This MoU recognizes the work that is already going on in VAMAS as well as in the Consultative Committees of the CIPM and establishes a framework for an ongoing dialogue on issues of materials metrology. The question of what is meant by traceability in the metrology of the properties of materials is particularly vexed when the measurement results depend on a specified procedure. In these cases, confidence in results requires not only traceable

  18. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  19. Profile variation impact on FIB cross-section metrology

    Science.gov (United States)

    Cordes, Aaron; Bunday, Benjamin; Nadeau, Jim

    2012-03-01

    The focused ion beam (FIB) milling tool is an important component of reference metrology and process characterization, both as a supporting instrument for bulk sample preparation before forwarding to the transmission electron microscope (TEM) and other instruments, and as an in situ measurement instrument using angled scanning electron microscopy. As features grow denser, deeper and more demanding, full-profile reference metrology is needed, and this methodology will only grow in importance. Thus, the ability to extract accurate dimensional and profile information out of the cross-sectional faces produced by FIB milling is critical. For features that demonstrate perfect symmetry in the plane of the cross section, analyzing images and extracting metrology data are straightforward. However, for industrial materials, symmetry is not a safe assumption: as features shrink, the line edge and sidewall roughness increases as a percentage of the overall feature dimension. Furthermore, with the introduction of more complex architectures such as 3D memory and FinFETs, the areas of greatest interest, such as the intersections of wrap-around gates, cannot be assumed to be symmetrical in any given plane if cut placement is not precisely controlled. Therefore it is important to establish the exact location and repeatability of the cross-section plane, both in terms of coordinate placement and effective angle of the milled surface. To this end, we prepared designed-in line edge roughness samples in the Albany Nanotech facility using SEMATECH's AMAG6 metrology reticle. The samples were thoroughly characterized before being milled by a non-destructive, sidewall-scanning atomic force microscope (AFM). These samples are then milled and measured under varying process and setup parameters using a single-beam FIB with angled SEM. We established methodologies that allow precise alignment of the cut planes of slice-and-view FIB milling to 3D-AFM scan lines to compare repeated sections.

  20. Metrology for Fuel Cell Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Stocker, Michael [National Inst. of Standards and Technology, Gaithersburg, MD (United States); Stanfield, Eric [National Inst. of Standards and Technology, Gaithersburg, MD (United States)

    2015-02-04

    The project was divided into three subprojects. The first subproject is Fuel Cell Manufacturing Variability and Its Impact on Performance. The objective was to determine if flow field channel dimensional variability has an impact on fuel cell performance. The second subproject is Non-contact Sensor Evaluation for Bipolar Plate Manufacturing Process Control and Smart Assembly of Fuel Cell Stacks. The objective was to enable cost reduction in the manufacture of fuel cell plates by providing a rapid non-contact measurement system for in-line process control. The third subproject is Optical Scatterfield Metrology for Online Catalyst Coating Inspection of PEM Soft Goods. The objective was to evaluate the suitability of Optical Scatterfield Microscopy as a viable measurement tool for in situ process control of catalyst coatings.

  1. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  2. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
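
    Bayesian dosage individualization, named in the abstract as the gold-standard TDM approach, can be illustrated with a maximum a posteriori (MAP) fit of a one-compartment model to a single measured concentration. This is only a generic sketch of the technique, not any of the benchmarked programs; the model, priors, and patient data below are all invented.

```python
import numpy as np
from scipy.optimize import minimize

# Toy MAP-Bayesian individualization with a one-compartment IV bolus
# model, C(t) = dose/V * exp(-CL/V * t). Every number below is invented.
dose, t_obs, c_obs = 500.0, 8.0, 6.1          # mg, h, mg/L (one observation)
cl_pop, v_pop = 4.0, 35.0                     # population means (assumed)
w_cl, w_v, sigma = 0.30, 0.25, 0.15           # prior / residual SDs, log scale

def neg_log_posterior(theta):
    log_cl, log_v = theta
    cl, v = np.exp(log_cl), np.exp(log_v)
    c_pred = dose / v * np.exp(-cl / v * t_obs)
    llik = ((np.log(c_obs) - np.log(c_pred)) / sigma) ** 2     # residual term
    lpri = ((log_cl - np.log(cl_pop)) / w_cl) ** 2 \
         + ((log_v - np.log(v_pop)) / w_v) ** 2                # prior terms
    return 0.5 * (llik + lpri)

res = minimize(neg_log_posterior, x0=[np.log(cl_pop), np.log(v_pop)])
cl_map, v_map = np.exp(res.x)

# A posteriori dose proposal to reach a target concentration at t_obs
target = 10.0                                  # mg/L (assumed target)
new_dose = target * v_map * np.exp(cl_map / v_map * t_obs)
print(f"MAP CL={cl_map:.2f} L/h, V={v_map:.1f} L, dose={new_dose:.0f} mg")
```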

  3. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  4. Metrology and ionospheric observation standards

    Science.gov (United States)

    Panshin, Evgeniy; Minligareev, Vladimir; Pronin, Anton

    Accuracy and validity of ionospheric observations are urgent topics nowadays. The WMO, URSI and national metrological and standardisation services put forward requirements for, and descriptions of, the means of ionospheric observation. Research on the metrology and standardisation of observations has moved to the next level in the Russian Federation. The Fedorov Institute of Applied Geophysics (IAG) is in charge of ionospheric observation in the Russian Federation, and the National Technical Committee TC-101, responsible for standardisation in this sphere, was set up on the basis of IAG. TC-101 could serve as the platform for initiating a core international committee within the ISO network. A new type of ionosonde, “Parus-A”, has been engineered to meet the national requirements. “Parus-A” calibration and tests were conducted by a National Metrology Institute (NMI), the D.I. Mendeleyev Institute for Metrology (VNIIM), a signatory of the CIPM MRA since 1991. VNIIM is the basic NMI in the sphere of space weather (including ionospheric observations); its founder was the celebrated chemist and metrologist Dmitriy I. Mendeleyev. Tests and calibration were carried out for the first time in the 50-year history of ionosonde operation in Russia. The following metrological characteristics were tested: measurement range of radio-frequency time delay, 0.5-10 ms; time measurement inaccuracy of the radio-frequency pulse, ±12 μs; frequency range of the radio impulse, 1-20 MHz; measurement inaccuracy of the radio impulse carrier frequency, ±5 kHz. For example, the sounding impulse simulator built into the ionosonde was used for testing the measurement range of the radio-frequency time delay. A number of standards at different levels have been developed: “Ionospheric observation guidance”; “The Earth ionosphere. Terms and definitions”.

  5. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed thermal-hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte Carlo sampling. A demo of a station blackout PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte Carlo and clustering capabilities.
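
    The Monte Carlo sampling that drives such a dynamic PRA can be illustrated with a toy station-blackout sequence: sample stochastic recovery and battery-depletion times, and count the fraction of runs that end in core damage. This is an independent illustration of the sampling idea, not RAVEN code, and all distributions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                            # Monte Carlo samples

# Made-up stochastic timings for a station-blackout-like sequence
power_recovery = rng.exponential(scale=6.0, size=n)    # AC recovery time [h]
battery_life = rng.normal(loc=4.0, scale=0.5, size=n)  # DC depletion time [h]

# In this toy model, core damage occurs if power is not recovered before
# the batteries deplete and heat removal is lost.
core_damage = power_recovery > battery_life
print("estimated core damage probability:", core_damage.mean())
```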

  6. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. This code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface state spectrum, which is detected by angle-resolved photoemission spectroscopy (ARPES) and in scanning tunneling microscopy (STM) experiments. It also identifies positions of Weyl/Dirac points and nodal line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
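
    The Berry phase around a closed momentum loop, one of the quantities WannierTools computes, can be illustrated for a two-band tight-binding model with the standard discrete (Wilson-loop) formula. This sketch is an independent textbook illustration, not WannierTools code; the Qi-Wu-Zhang toy model and loop are arbitrary choices.

```python
import numpy as np

def h_qwz(kx, ky, m=1.0):
    """Two-band Qi-Wu-Zhang model, a standard Chern-insulator toy model."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    return (np.sin(kx) * sx + np.sin(ky) * sy
            + (m + np.cos(kx) + np.cos(ky)) * sz)

def berry_phase(loop, band=0):
    """Discrete Berry phase of one band around a closed k-loop:
    phi = -Im log prod_i <u(k_i)|u(k_i+1)>."""
    states = []
    for kx, ky in loop:
        _, vecs = np.linalg.eigh(h_qwz(kx, ky))
        states.append(vecs[:, band])          # eigh sorts bands by energy
    prod = 1.0 + 0.0j
    for i in range(len(states)):
        prod *= np.vdot(states[i], states[(i + 1) % len(states)])
    return -np.angle(prod)

# Small circular loop around the Gamma point
ts = np.linspace(0, 2 * np.pi, 101)[:-1]
loop = [(0.3 * np.cos(t), 0.3 * np.sin(t)) for t in ts]
print("Berry phase of the lower band:", berry_phase(loop))
```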

  7. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.

  8. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the work load. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request / project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  9. Improving OCD time to solution using Signal Response Metrology

    Science.gov (United States)

    Fang, Fang; Zhang, Xiaoxiao; Vaid, Alok; Pandev, Stilian; Sanko, Dimitry; Ramanathan, Vidya; Venkataraman, Kartik; Haupt, Ronny

    2016-03-01

    In recent technology nodes, advanced processes and novel integration schemes have challenged the precision limits of conventional metrology, with critical dimensions (CD) of devices reduced to the sub-nanometer region. Optical metrology has proved its capability to precisely detect intricate details of complex structures; however, conventional RCWA-based (rigorous coupled wave analysis) scatterometry has the limitations of long time-to-results and a lack of flexibility to adapt to wide process variations. Signal Response Metrology (SRM) is a new metrology technique targeted at alleviating the consumption of engineering and computation resources by eliminating geometric/dispersion modeling and spectral simulation from the workflow. This is achieved by directly correlating the spectra acquired from a set of wafers with known process variations encoded. At SPIE 2015, we presented the results of an SRM application in lithography metrology and control [1], accomplishing the setup of a new focus/dose monitoring measurement recipe within hours. This work will demonstrate our recent field exploration of SRM implementation in 20nm technology and beyond, including focus metrology for scanner control, post-etch geometric profile measurement, and actual device profile metrology.
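
    SRM's central move is to regress measured spectra directly onto known process variations instead of building a geometric model. A minimal sketch of that idea using partial least squares, a common choice for spectral data (the actual SRM algorithm is not disclosed in the abstract, and the DOE encoding and spectra below are synthetic):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_train, n_wl = 120, 200                 # DOE wafers, spectral channels

# Known process variations encoded on the training wafers (focus, dose)
fd = rng.uniform(-1, 1, size=(n_train, 2))

# Synthetic spectra: each parameter leaves a smooth signature, plus noise
wl = np.linspace(0, 1, n_wl)
sig_focus = np.sin(6 * np.pi * wl)
sig_dose = np.exp(-((wl - 0.5) ** 2) / 0.02)
spectra = fd[:, :1] * sig_focus + fd[:, 1:] * sig_dose \
          + 0.05 * rng.normal(size=(n_train, n_wl))

# Train the direct spectra -> (focus, dose) mapping; no RCWA model needed.
model = PLSRegression(n_components=5).fit(spectra, fd)

# Predict process parameters from a new production wafer's spectrum
new_spec = 0.3 * sig_focus - 0.6 * sig_dose + 0.05 * rng.normal(size=n_wl)
print("estimated (focus, dose):", model.predict(new_spec[None, :])[0])
```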

  10. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and provide versions of the software in Macintosh and Windows compatible formats. Appendix 1, the science requirements document (SRD) Users Manual, is attached.

  11. Registration performance on EUV masks using high-resolution registration metrology

    Science.gov (United States)

    Steinert, Steffen; Solowan, Hans-Michael; Park, Jinback; Han, Hakseung; Beyer, Dirk; Scherübl, Thomas

    2016-10-01

    Next-generation lithography based on EUV continues to move forward to high-volume manufacturing. Given the technical challenges and the throughput concerns, a hybrid approach with 193 nm immersion lithography is expected, at least in the initial phase. Due to the increasing complexity at smaller nodes, a multitude of different masks, both DUV (193 nm) and EUV (13.5 nm) reticles, will then be required in the lithography process flow. The individual registration of each mask and the resulting overlay error are of crucial importance in order to ensure proper functionality of the chips. While registration and overlay metrology on DUV masks has been the standard for decades, this has yet to be demonstrated on EUV masks. Past generations of mask registration tools were not necessarily limited in their tool stability, but in their resolution capabilities. The scope of this work is an image placement investigation of high-end EUV masks together with a registration and resolution performance qualification. For this we employ a new-generation registration metrology system embedded in a production environment for full-spec EUV masks. This paper presents excellent registration performance not only on standard overlay markers but also on more sophisticated e-beam calibration patterns.

  12. Opportunities and Risks in Semiconductor Metrology

    Science.gov (United States)

    Borden, Peter

    2005-09-01

    New metrology opportunities are constantly emerging as the semiconductor industry attempts to meet scaling requirements. The paper summarizes some of the key FEOL and BEOL needs. These must be weighed against a number of considerations to ensure that they are good opportunities for the metrology equipment supplier. The paper discusses some of these considerations.

  13. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or to improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program.

  14. Automated software development tools in the MIS (Management Information Systems) environment

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  15. An expert system based software sizing tool, phase 2

    Science.gov (United States)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  16. UPWIND Metrology, Deliverable D 1A2.1, List of measurement Parameters

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose

    … performance measurements - improvement of aerodynamic codes - assessment of wind resources. In general terms, the uncertainty of the testing techniques and methods is typically much higher than the need. Since this problem covers many areas of wind energy, the work package is defined as a crosscutting … activity. The problem is especially relevant for the following production-related areas: power performance testing, especially in wind farms; testing of turbine improvements of the order of several percent; testing of aerodynamic codes; testing of turbine response to effects such as turbulence …; profiles, turbulence, surface shear recovery distances, etc.; measurements of the interaction between wind farms and the microclimate. The objectives of the metrology work package are to develop metrology tools in wind energy that significantly enhance the quality of measurement and testing techniques. The development …

  17. Introduction to quantum metrology: quantum standards and instrumentation

    CERN Document Server

    Nawrocki, Waldemar

    2015-01-01

    This book presents the theory of quantum effects used in metrology and the results of the author's own research in the field of quantum electronics. The book also describes the quantum measurement standards used in many branches of metrology for electrical quantities, mass, length, time and frequency. It represents the first comprehensive survey of quantum metrology problems. As a scientific survey, it propagates a new approach to metrology with more emphasis on its connection with physics. This is of importance for constantly developing technologies, and nanotechnologies in particular. Providing a presentation of practical applications of the effects used in quantum metrology for the construction of quantum standards and sensitive electronic components, the book is useful for a wide audience of physicists and metrologists in the broad sense of both terms. In 2014 a new system of units, the so-called Quantum SI, is introduced. This book helps to understand and approve the new system to both technology a...

  18. Implementation of machine learning for high-volume manufacturing metrology challenges (Conference Presentation)

    Science.gov (United States)

    Timoney, Padraig; Kagalwala, Taher; Reis, Edward; Lazkani, Houssam; Hurley, Jonathan; Liu, Haibo; Kang, Charles; Isbester, Paul; Yellai, Naren; Shifrin, Michael; Etzioni, Yoav

    2018-03-01

    In recent years, the combination of device scaling, complex 3D device architectures and tightening process tolerances has strained the capabilities of optical metrology tools to meet process needs. Two main categories of approaches have been taken to address the evolving process needs. In the first category, new hardware configurations are developed to provide more spectral sensitivity. Most of this category of work will enable next-generation optical metrology tools to try to maintain pace with next-generation process needs. In the second category, new innovative algorithms have been pursued to increase the value of the existing measurement signal. These algorithms aim to boost sensitivity to the measurement parameter of interest, while reducing the impact of other factors that contribute to signal variability but are not influenced by the process of interest. This paper will evaluate the suitability of machine learning to address high-volume manufacturing metrology requirements in both front end of line (FEOL) and back end of line (BEOL) sectors of advanced technology nodes. In the FEOL sector, initial feasibility has been demonstrated for predicting fin CD values from an inline measurement using machine learning. In this study, OCD spectra were acquired after an etch process that occurs earlier in the process flow than where the inline CD is measured. The fin hard mask etch process is known to impact the downstream inline CD value. Figure 1 shows the correlation of predicted CD vs. downstream inline CD measurement obtained after training of the machine learning algorithm. For BEOL, machine learning is shown to provide an additional source of information for predicting electrical resistance from structures that are not compatible with direct copper height measurement. Figure 2 compares the trench height correlation to electrical resistance (Rs) and the correlation of predicted Rs to the e-test Rs value for a far back end of line (FBEOL) metallization level.
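
    As a hedged illustration of the workflow described above, the sketch below trains a regularized regression model to predict CD values from spectra. The arrays, channel counts and choice of ridge regression are placeholder assumptions for illustration, not the paper's actual implementation.

      # Minimal sketch: predict inline CD from OCD spectra (synthetic data;
      # ridge regression stands in for the paper's unspecified ML algorithm).
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 150))                        # 200 wafers x 150 spectral channels
      y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=200)  # synthetic CD response

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      model = Ridge(alpha=1.0).fit(X_train, y_train)
      print("R^2 on held-out wafers:", model.score(X_test, y_test))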

  19. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    Directory of Open Access Journals (Sweden)

    Lei eWang

    2013-11-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high-resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data, along with the associated meta-data and computational tools, publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as the lack of local organization and standard descriptions.

  20. Forensic surface metrology: tool mark evidence.

    Science.gov (United States)

    Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K

    2011-01-01

    Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary, bootstrap-based computations for estimated error rates were 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
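
    For readers unfamiliar with the PCA-SVM step described above, a minimal sketch of such a pipeline follows. The profile matrix, class sizes and kernel are synthetic placeholders, and the conformal prediction layer is omitted; this is not the authors' implementation.

      # Illustrative PCA + SVM classification of tool mark profiles (synthetic data).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      profiles = rng.normal(size=(58, 1000))              # 58 marks x 1000 surface samples
      labels = np.repeat(np.arange(4), [15, 15, 14, 14])  # four pistols

      clf = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
      scores = cross_val_score(clf, profiles, labels, cv=5)
      print("cross-validated accuracy:", scores.mean())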

  1. 7/5nm logic manufacturing capabilities and requirements of metrology

    Science.gov (United States)

    Bunday, Benjamin; Bello, A. F.; Solecky, Eric; Vaid, Alok

    2018-03-01

    This paper will provide an update to our previous works [2][4][9] on our view of the future of in-line high-volume manufacturing (HVM) metrology for the semiconductor industry, concentrating on logic technology for foundries. First, we will review the needs of patterned defect, critical dimension (CD/3D), overlay and films metrology, and present the extensive list of applications for which metrology solutions are needed. We will then update the industry's progress towards addressing the gating technical limits of the most important of these metrology solutions, highlighting key metrology technology gaps requiring industry attention and investment.

  2. Perspectives in absorbed dose metrology with regard to the technical evolutions of external beam radiotherapy

    International Nuclear Information System (INIS)

    Chauvenet, B.; Bordy, J.M.; Barthe, J.

    2009-01-01

    This paper presents several R and D axes in absorbed dose metrology to meet the needs resulting from the technical evolutions of external beam radiotherapy. The facilities in operation in France have evolved considerably under the impulse of the Plan Cancer launched in 2003: replacement and increase of the number of accelerators, substitution of accelerators for telecobalt units almost completed, and acquisition of innovative facilities for tomotherapy and stereotaxy. The increasing versatility of facilities makes possible the rapid evolution of treatment modalities, allowing irradiation to be better confined to tumoral tissues while sparing surrounding healthy tissues and organs at risk. This leads to better treatment efficacy through dose escalation. National metrology laboratories must offer responses adapted to the new needs, i.e. not restrict themselves to the establishment of references under conventional conditions defined at the international level, but contribute to the improvement of uncertainties at all levels of reference transfer to practitioners: primary measurements under conditions as close as possible to those of treatment, characterization of transfer and treatment control dosimeters, metrological validation of treatment planning tools, etc. Those axes have been identified as priorities for the next years in ionizing radiation metrology at the European level and included in the European Metrology Research Programme. A project dealing with some of those topics has been selected in the frame of the Eranet+ Call EMRP 2007 and is now starting. The LNE-LAM is strongly engaged in it. (authors)

  3. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
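
    The core calculation such a simulator performs can be sketched with the standard rotor power relation P = ½ρA·Cp(λ)·v³. The power coefficient curve below is a made-up placeholder, not the tool's aerodynamic model.

      # Rotor power over a grid of wind and rotor speeds (illustrative only).
      import numpy as np

      RHO = 1.225                 # air density, kg/m^3
      R = 2.0                     # rotor radius, m (low-power turbine)
      A = np.pi * R ** 2          # swept area, m^2

      def cp(tsr):
          """Placeholder power-coefficient curve peaking near tsr = 7."""
          return max(0.45 - 0.01 * (tsr - 7.0) ** 2, 0.0)

      for v in (4.0, 8.0, 12.0):              # wind speeds, m/s
          for omega in (20.0, 40.0):          # rotor speeds, rad/s
              tsr = omega * R / v             # tip-speed ratio
              p = 0.5 * RHO * A * cp(tsr) * v ** 3
              print(f"v={v:4.1f} m/s, omega={omega:4.1f} rad/s -> P={p:8.1f} W")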

  4. About new software and hardware tools in the education of 'Semiconductor Devices'

    International Nuclear Information System (INIS)

    Taneva, Ljudmila; Basheva, Bistra

    2009-01-01

    This paper describes the new tools used in the education of “Semiconductor Devices”, developed at the Technological School “Electronic Systems”, Department of the Technical University, Sofia. The software and hardware tools give the opportunity to achieve the right balance between theory and practice, and the students are given the chance to accumulate valuable “hands-on” skills. The main purpose of the developed lab exercises is to demonstrate the use of some electronic components and provide practice with them. Keywords: semiconductors, media software tool, hardware, education

  5. Slovak Office of Standards, Metrology and Testing. Annual Report 2001

    International Nuclear Information System (INIS)

    2002-01-01

    A brief account of activities carried out by the Slovak Office of Standards, Metrology and Testing of the Slovak Republic in 2001 is presented. These activities are reported under the headings: (1) Introduction by the President of the Slovak Office of Standards, Metrology and Testing; (2) The Vice-president's Unit Standardization and Quality; (3) The President's Office; (4) Chief Inspector Department; (5) Legislative-juridical Department; (6) Department of Economy; (7) Department of International Co-operation; (8) Department of European Integration; (9) Department of Metrology; (10) Department of Testing; (11) Department of the Cyclotron Centre SR; (12) Slovak Institute of Metrology; (13) Slovak Standards Institution; (14) Slovak Metrology Inspectorate; (15) Slovak Legal Metrology; (16) Measuring Techniques - Technocentre - MTT; Abbreviations; (17) Technical Testing Institute Piestany; (18) Testing Institute of Transport and Earthmoving Machinery - SUDST

  6. The quality of measurements: a metrological reference

    CERN Document Server

    Fridman, A E

    2012-01-01

    This book provides a detailed discussion and commentary on the fundamentals of metrology. The fundamentals of metrology, the principles underlying the design of the SI International System of units, the theory of measurement error, a new methodology for estimation of measurement accuracy based on uncertainty, and methods for reduction of measured results and estimation of measurement uncertainty are all discussed from a modern point of view. The concept of uncertainty is shown to be consistent with the classical theory of accuracy. The theory of random measurement errors is supplemented by a very general description based on the generalized normal distribution; systematic instrumental error is described in terms of a methodology for normalizing the metrological characteristics of measuring instruments. A new international system for assuring uniformity of measurements based on agreements between national metrological institutes is discussed, in addition to the role and procedure for performance of key compari...

  7. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools on which a teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, and of formulating new tasks on the basis of a limited number of tools with fast automated checking, is specified. Some methodological comments on the application of computer tools and ...

  8. Techniques and tools for measuring energy efficiency of scientific software applications

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Pestana, Gonçalo; Khan, Kashif; Nurminen, Jukka K; Nyback, Filip; Ou, Zhonghong

    2015-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compare their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads. (paper)
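
    One common software-based measurement of the kind surveyed above, on Linux with Intel CPUs, is sampling the RAPL energy counters exposed under /sys/class/powercap. The path and its availability are platform-dependent assumptions, and this is not the specific instrumentation used in the paper.

      # Average package power over a workload, via RAPL counters (Linux/Intel;
      # requires read permission; counter wraparound is ignored for brevity).
      import time
      from pathlib import Path

      RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

      def energy_uj() -> int:
          return int(RAPL.read_text())

      e0, t0 = energy_uj(), time.time()
      time.sleep(1.0)             # run the workload of interest here
      e1, t1 = energy_uj(), time.time()
      print(f"average package power: {(e1 - e0) / 1e6 / (t1 - t0):.2f} W")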

  9. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
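
    The combination of Monte Carlo sampling with n-factor combinatorial variation can be sketched as below for n = 2. The parameter names and ranges are invented for illustration and are unrelated to the Trick models themselves.

      # Monte Carlo cases plus all 2-factor extreme-value combinations.
      import itertools
      import random

      params = {"mass": (900.0, 1100.0), "thrust": (0.8, 1.2), "drag": (0.9, 1.1)}

      def monte_carlo(n):
          for _ in range(n):
              yield {k: random.uniform(lo, hi) for k, (lo, hi) in params.items()}

      def two_factor_extremes():
          nominal = {k: (lo + hi) / 2 for k, (lo, hi) in params.items()}
          for a, b in itertools.combinations(params, 2):
              for va, vb in itertools.product(params[a], params[b]):
                  case = dict(nominal)
                  case[a], case[b] = va, vb
                  yield case

      cases = list(monte_carlo(100)) + list(two_factor_extremes())
      print(len(cases), "test cases generated")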

  10. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  11. Software tools for electrical quality assurance in the LHC

    International Nuclear Information System (INIS)

    Bednarek, M.; Ludwin, J.

    2012-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27 km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC. (authors)

  12. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for the failures. The quality and cost of the software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate the fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques.

  13. Metrology in Pharmaceutical Industry - A Case Study

    International Nuclear Information System (INIS)

    Yuvamoto, Priscila D.; Fermam, Ricardo K. S.; Nascimento, Elizabeth S.

    2016-01-01

    Metrology is recognized for improving production processes, increasing productivity, giving more reliability to measurements and, consequently, impacting the economy of a country. The pharmaceutical area developed GMP (Good Manufacturing Practice) requirements with no introduction of metrological concepts. However, with the advent of nanomedicines, this approach and the consequent positive results are expected. The aim of this work is to verify the level of metrology implementation in a Brazilian pharmaceutical industry, using a case study. The purpose is a better mutual comprehension by both areas, acting together, and governmental support for the robustness of the Brazilian pharmaceutical area. (paper)

  14. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  15. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Background: This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and also generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results: The PyElph software tool is entirely implemented in Python, which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weight computation based on a molecular weight marker, band matching and, finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD ...
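
    The molecular weight computation mentioned in the pipeline is classically a log-linear calibration against the marker lane, which can be sketched as follows. The migration distances and fragment sizes are invented example values, not PyElph source code.

      # Estimate fragment size from migration distance via a log-linear fit.
      import numpy as np

      marker_dist = np.array([50, 80, 120, 170, 230, 300], dtype=float)       # px
      marker_size = np.array([3000, 2000, 1000, 500, 250, 100], dtype=float)  # bp

      # Fit log10(size) as a linear function of migration distance.
      slope, intercept = np.polyfit(marker_dist, np.log10(marker_size), 1)

      def estimate_size(distance_px: float) -> float:
          return 10 ** (slope * distance_px + intercept)

      print(f"band at 140 px ~ {estimate_size(140):.0f} bp")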

  16. Knowledge Architect: A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process within our industrial partner: Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  17. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    textabstractWe developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which

  18. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Directory of Open Access Journals (Sweden)

    Mark O Wielpütz

    Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully automatic densitometry software tools. MDCT and full-body plethysmography, including forced expiratory volume in 1 s and total lung capacity, were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49 datasets, the remaining commercial tool 30. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l] with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3-6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as ...

  19. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    Science.gov (United States)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology, and present how electron exposure dose impacts EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modification at high dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.

  20. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information on the lattice and on BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate the positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
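
    The 2D Gaussian fitting step can be illustrated with a minimal sketch on a synthetic image patch. The Gaussian model and starting values below are generic assumptions, not the tool's actual code.

      # Refine one atomic column position by fitting a 2D Gaussian.
      import numpy as np
      from scipy.optimize import curve_fit

      def gauss2d(coords, amp, x0, y0, sigma, offset):
          x, y = coords
          g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + offset
          return g.ravel()

      yy, xx = np.mgrid[0:15, 0:15].astype(float)      # synthetic 15x15 patch
      clean = gauss2d((xx, yy), 100.0, 7.3, 6.8, 2.0, 10.0).reshape(15, 15)
      patch = clean + np.random.default_rng(2).normal(0.0, 1.0, clean.shape)

      p0 = (patch.max() - patch.min(), 7.0, 7.0, 2.0, patch.min())
      popt, _ = curve_fit(gauss2d, (xx, yy), patch.ravel(), p0=p0)
      print(f"refined column position: x = {popt[1]:.3f}, y = {popt[2]:.3f}")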

  1. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Senthil Kumar Murugesan

    2015-01-01

    Software companies are now keen to provide secure software with respect to the accuracy and reliability of their products, especially in relation to software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper attempts to propose a hybrid estimator algorithm and model which incorporates quality metrics, a reliability factor, and a security factor into a fuzzy-based function point analysis. Initially, this method utilizes a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, the performance metrics are added to the effort estimation for accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with those of existing models. It shows that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.
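
    The triangular fuzzy size estimate at the heart of this approach can be sketched as below. The defuzzification rule, function point counts and adjustment factors are illustrative assumptions, not the FFPA-PSR implementation.

      # Centroid defuzzification of a triangular fuzzy size estimate (a, m, b),
      # scaled by assumed reliability and security multipliers.
      def fuzzy_size(optimistic: float, most_likely: float, pessimistic: float) -> float:
          return (optimistic + most_likely + pessimistic) / 3.0

      ufp = fuzzy_size(280, 320, 400)     # unadjusted function points (example values)
      reliability_factor = 1.10           # hypothetical multipliers
      security_factor = 1.05
      print(f"adjusted size estimate: {ufp * reliability_factor * security_factor:.0f} FP")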

  2. Consultative committee on ionizing radiation: Impact on radionuclide metrology

    International Nuclear Information System (INIS)

    Karam, L.R.; Ratel, G.

    2016-01-01

    In response to the CIPM MRA, and to improve radioactivity measurements in the face of advancing technologies, the CIPM's consultative committee on ionizing radiation developed a strategic approach to the realization and validation of measurement traceability for radionuclide metrology. As a consequence, measurement institutions throughout the world have devoted no small effort to establish radionuclide metrology capabilities, supported by active quality management systems and validated through prioritized participation in international comparisons, providing a varied stakeholder community with measurement confidence. - Highlights: • Influence of CIPM MRA on radionuclide metrology at laboratories around the world. • CCRI strategy: to be the “undisputed hub for ionizing radiation global metrology.” • CCRI Strategic Plan stresses importance of measurement confidence for stakeholder. • NMIs increasing role in radionuclide metrology by designating institutions (DIs). • NMIs and DIs establish quality systems; validate capabilities through comparisons.

  3. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  4. Impact of the ITRS Metrology Roadmap

    International Nuclear Information System (INIS)

    Diebold, Alain C.

    2001-01-01

    The International Technology Roadmap for Semiconductors (ITRS) provides the semiconductor industry with the timing of critical technology needs for future generations of integrated circuits. The Metrology roadmap in the ITRS describes the measurement needs based on the process requirements found in the Lithography, Front End Processes, Interconnect, and Packaging Roadmaps. This paper illustrates the impact of the Metrology Roadmap on the development of key measurement technology

  5. Celtiberian metrology and its romanization

    Directory of Open Access Journals (Sweden)

    Leonard A. CURCHIN

    2013-05-01

    Celtiberian metrology has scarcely been investigated until now, with the exception of coin weights. On the basis of measurements of pre-Roman mud bricks, a Celtiberian foot of 24 cm is proposed. With regard to weights, we can accept a module of 9 g for silver jewelry and some bronze coins; however, loom weights do not conform to any metrological system. Over time, Roman measures of length (as indicated by the dimensions of bricks, tiles and architectural monuments) and weight were adopted.

  6. Management of metrology in measuring of the displacement of building construction

    Directory of Open Access Journals (Sweden)

    Jiří Kratochvíl

    2007-06-01

    The metrology management of the measurement of the displacement of building constructions is not regulated in the standard ČSN ISO 73 0405 - Measurement of the displacement of building construction. However, metrology management has to be included in the project of displacement measurement (project stage). Attention must then be paid to metrology management during the measurement itself (realization stage) and during the evaluation of the measurement (evaluation stage). Subsequent improvement of the metrology management should be pursued within the frame of the next project (the so-called feedback). Metrology management in displacement measurement at all stages should be based on the application of statutory instruments and technical standards, especially the ISO 9000 system of standards for quality control. Considering the specifics of geodetic measurements, the metrology management must be adapted accordingly; this is why it differs from metrology management in other fields of knowledge. This paper includes some steps of metrological provision which must not be ignored.

  7. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    This article describes PostGeoOlap, a free, open source software tool for decision support that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  8. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although separate algorithms for nonlinear distortion analysis do exist, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
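
    One classical measurement such a tool automates is the total harmonic distortion (THD) of a device's response to a sine input, estimated from an FFT. The sketch below uses a tanh function as a stand-in for the device and is not code from the toolkit (which is written in Matlab).

      # THD of a memoryless nonlinearity driven by a 1 kHz sine.
      import numpy as np

      fs, f0, n = 48000, 1000, 48000        # sample rate, test tone, length (1 s)
      t = np.arange(n) / fs
      y = np.tanh(2.0 * np.sin(2 * np.pi * f0 * t))    # stand-in distorting device

      spec = np.abs(np.fft.rfft(y * np.hanning(n)))
      fund = spec[f0]                                  # 1 Hz bin spacing: bin index = Hz
      harmonics = spec[[k * f0 for k in range(2, 6)]]  # harmonics 2-5
      thd = np.sqrt((harmonics ** 2).sum()) / fund
      print(f"THD (harmonics 2-5): {100 * thd:.2f} %")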

  9. Improving automated 3D reconstruction methods via vision metrology

    Science.gov (United States)

    Toschi, Isabella; Nocerino, Erica; Hess, Mona; Menna, Fabio; Sargeant, Ben; MacDonald, Lindsay; Remondino, Fabio; Robson, Stuart

    2015-05-01

    This paper aims to provide a procedure for improving automated 3D reconstruction methods via vision metrology. The 3D reconstruction problem is generally addressed using two different approaches. On the one hand, vision metrology (VM) systems try to accurately derive 3D coordinates of few sparse object points for industrial measurement and inspection applications; on the other, recent dense image matching (DIM) algorithms are designed to produce dense point clouds for surface representations and analyses. This paper strives to demonstrate a step towards narrowing the gap between traditional VM and DIM approaches. Efforts are therefore intended to (i) test the metric performance of the automated photogrammetric 3D reconstruction procedure, (ii) enhance the accuracy of the final results and (iii) obtain statistical indicators of the quality achieved in the orientation step. VM tools are exploited to integrate their main functionalities (centroid measurement, photogrammetric network adjustment, precision assessment, etc.) into the pipeline of 3D dense reconstruction. Finally, geometric analyses and accuracy evaluations are performed on the raw output of the matching (i.e. the point clouds) by adopting a metrological approach. The latter is based on the use of known geometric shapes and quality parameters derived from VDI/VDE guidelines. Tests are carried out by imaging the calibrated Portable Metric Test Object, designed and built at University College London (UCL), UK. It allows assessment of the performance of the image orientation and matching procedures within a typical industrial scenario, characterised by poor texture and known 3D/2D shapes.

  10. Application of advanced diffraction based optical metrology overlay capabilities for high-volume manufacturing

    Science.gov (United States)

    Chen, Kai-Hsiung; Huang, Guo-Tsai; Hsieh, Hung-Chih; Ni, Wei-Feng; Chuang, S. M.; Chuang, T. K.; Ke, Chih-Ming; Huang, Jacky; Rao, Shiuan-An; Cumurcu Gysen, Aysegul; d'Alfonso, Maxime; Yueh, Jenny; Izikson, Pavel; Soco, Aileen; Wu, Jon; Nooitgedagt, Tjitte; Ottens, Jeroen; Kim, Yong Ho; Ebert, Martin

    2017-03-01

    On-product overlay requirements are becoming more challenging with every next technology node due to the continued decrease of device dimensions and process tolerances. Therefore, current and future technology nodes require demanding metrology capabilities such as target designs that are robust towards process variations and high overlay measurement density (e.g. for higher-order process corrections) to enable advanced process control solutions. The impact of advanced control solutions based on YieldStar overlay data is presented in this paper. Multi-patterning techniques are applied for critical layers, leading to additional overlay measurement demands. The use of 1D process steps results in the need for overlay measurements relative to more than one layer. Dealing with the increased number of overlay measurements while keeping the high measurement density and metrology accuracy at the same time presents a challenge for high-volume manufacturing (HVM). These challenges are addressed by the capability to measure multi-layer targets with the recently introduced YieldStar metrology tool, YS350. On-product overlay results of such multi-layer and standard targets are presented, including measurement stability performance.

  11. Surface slope metrology of highly curved x-ray optics with an interferometric microscope

    Science.gov (United States)

    Gevorkyan, Gevork S.; Centers, Gary; Polonska, Kateryna S.; Nikitin, Sergey M.; Lacey, Ian; Yashchuk, Valeriy V.

    2017-09-01

    The development of deterministic polishing techniques has given rise to vendors that manufacture high-quality three-dimensional x-ray optics. The surface metrology on these optics remains a difficult task. For the fabrication, vendors usually use unique surface metrology tools, generally developed on site, that are not available in the optical metrology labs at x-ray facilities. At the Advanced Light Source X-Ray Optics Laboratory, we have developed a rather straightforward interferometric-microscopy-based procedure capable of sub-microradian characterization of the sagittal slope variation of x-ray optics for two-dimensional focusing and collimation (such as ellipsoids, paraboloids, etc.). In the paper, we provide the mathematical foundation of the procedure and describe the related instrument calibration. We also present an analytical expression describing the ideal surface shape in the sagittal direction of a spheroid specified by the conjugate parameters of the optic's beamline application. The expression is useful when analyzing data obtained with such optics. The high efficiency of the developed measurement and data analysis procedures is demonstrated by results of measurements on a number of x-ray optics with sagittal radii of curvature between 56 mm and 480 mm. We also discuss potential areas of further improvement.

  12. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    Science.gov (United States)

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  13. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    …-stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  14. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    Science.gov (United States)

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Programming heterogeneous MPSoCs: tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today's programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR), and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  16. Metrology for radioactive waste management. (WP2, WP3)

    International Nuclear Information System (INIS)

    Suran, J.

    2014-01-01

    The three-year European research project 'Metrology for Radioactive Waste Management' was launched in October 2011 under the EMRP (European Metrology Research Programme). It involves 13 European national metrology institutes, and its total budget exceeds four million euros. The project is coordinated by the Czech Metrology Institute and is divided into five working groups. This presentation describes the project. (author)

  17. Metrological Array of Cyber-Physical Systems. Part 11. Remote Error Correction of Measuring Channel

    Directory of Open Access Journals (Sweden)

    Yuriy YATSUK

    2015-09-01

    Full Text Available The multi-channel measuring instruments with both the classical structure and the isolated one is identified their errors major factors basing on general it metrological properties analysis. Limiting possibilities of the remote automatic method for additive and multiplicative errors correction of measuring instruments with help of code-control measures are studied. For on-site calibration of multi- channel measuring instruments, the portable voltage calibrators structures are suggested and their metrological properties while automatic errors adjusting are analysed. It was experimentally envisaged that unadjusted error value does not exceed ± 1 mV that satisfies most industrial applications. This has confirmed the main approval concerning the possibilities of remote errors self-adjustment as well multi- channel measuring instruments as calibration tools for proper verification.

  18. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  19. 8th Brazilian Congress on Metrology (Metrologia 2015)

    International Nuclear Information System (INIS)

    2016-01-01

    THE EIGHTH BRAZILIAN CONGRESS ON METROLOGY (METROLOGIA 2015). The United Nations celebrated 2015 as the International Year of Light. By a curious coincidence, many notable events in science and technology completed a multiple of 50 or 100 years in 2015: from the pioneering work of the sage Ibn Al-Haytham in 1015, through Fresnel, Maxwell, Einstein and the discovery of the cosmic microwave background, to the use of optical fibres in communications in 1965. Electromagnetic radiation is present in our daily lives in countless applications. It is remarkable that there is no way to think about these applications without thinking of measurements. From entangled photons to the more prosaic public illumination of our daily life, we are intrinsically connected all the time with luminous phenomena. Among other things, light allows global communication on a large scale. It strengthens the internationalization of production processes, which brings considerable changes in relations, processes and economic structures, and orients the social, political and cultural behaviour of any country. The conditions of this internationalization require interchangeability of parts of complex systems, translated into strict adherence to standards and specifications that use increasingly accurate measurement techniques, as well as the growing demand from consumer markets for products and services of higher quality. They also require innovation and improvements in domestic production to boost the competitiveness of industries in domestic and foreign markets. Thus, if the Science of Measurements is taken seriously, countries are better prepared to evolve towards economic and social development. In this 8th edition of the Brazilian Congress on Metrology (METROLOGIA 2015), in addition to the thematic sessions in various areas of Metrology and Conformity Assessment, we hold several satellite events. They are already traditional events or highlight important current issues

  20. Deep machine learning based Image classification in hard disk drive manufacturing (Conference Presentation)

    Science.gov (United States)

    Rana, Narender; Chien, Chester

    2018-03-01

    A key sensor element in a Hard Disk Drive (HDD) is the read-write head device. The device has a complex 3D shape and its fabrication requires over a thousand process steps, many of them being various types of image inspection and critical dimension (CD) metrology steps. In order to have a high yield of devices across a wafer, very tight inspection and metrology specifications are implemented. Many images are collected on a wafer and inspected for various types of defects, and in CD metrology the quality of the image impacts the CD measurements. Metrology noise needs to be minimized in CD metrology to get a better estimate of the process-related variations for implementing robust process controls. Specialized tools are available for defect inspection and review, allowing classification and statistics; however, when such advanced tools are unavailable, or for other reasons, images often need to be inspected manually. SEM image inspection and CD-SEM metrology tools are separate tools differing in software and purpose. There have been cases where a significant number of CD-SEM images were blurred or had some artefact, creating a need for image inspection along with the CD measurement. The tool may not report a practical metric highlighting the quality of the image, and not filtering CDs obtained from such blurred images adds metrology noise to the CD measurement. An image classifier can be helpful here for filtering such data. This paper presents the use of artificial intelligence in classifying the SEM images. Deep machine learning is used to train a neural network which is then used to classify new images as blurred or not blurred. Figure 1 shows the image blur artefact and the contingency table of classification results from the trained deep neural network. A prediction accuracy of 94.9% was achieved in the first model. The paper covers other such applications of the deep neural
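    A minimal sketch of such a binary blur classifier in Keras; the architecture, input size and random stand-in arrays are hypothetical choices for illustration, not the network reported above.

```python
# Sketch: CNN classifying SEM images as "blurred" vs "not blurred" (Keras).
# Real training data would be labelled CD-SEM images; random arrays stand in.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(128, 128, 1)),       # grayscale SEM image
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),   # P(image is blurred)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

x = np.random.rand(32, 128, 128, 1).astype("float32")  # placeholder images
y = np.random.randint(0, 2, size=(32, 1))              # placeholder labels
model.fit(x, y, epochs=1, verbose=0)
scores = model.predict(x[:4], verbose=0)   # scores used to filter blurred CDs
print(scores.ravel())
```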

  1. Laser metrology for a next generation gravimetric mission

    Science.gov (United States)

    Mottini, Sergio; Biondetti, Giorgio; Cesare, Stefano; Castorina, Giuseppe; Musso, Fabio; Pisani, Marco; Leone, Bruno

    2017-11-01

    Within the ESA technology research project "Laser Interferometer High Precision tracking for LEO", Thales Alenia Space Italia is developing a laser metrology system for a Next Generation Gravimetric Mission (NGGM) based on satellite-to-satellite tracking. This technique is based on the precise measurement of the displacement between two satellites flying in formation at low altitude for monitoring the variations of Earth's gravity field at high resolution over a long time period. The laser metrology system that has been defined for this mission consists of the following elements: • a heterodyne Michelson interferometer for measuring the distance variation between retroreflectors positioned on the two satellites; • an angle metrology for measuring the orientation of the laser beam in the reference frames of the two satellites; • a lateral displacement metrology for measuring the deviations of the laser beam axis from the target retro-reflector. The laser interferometer makes use of a chopped measurement beam to avoid spurious signals and nonlinearity caused by the imbalance between the strong local beam and the weak return beam. The main results of the design, development and test activities performed on the breadboard of the metrology system are summarized in this paper.
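    The distance-variation measurement rests on the standard Michelson relation: with a retroreflector the optical path changes by twice the displacement, so d = phi * lambda / (4*pi) for unwrapped interferometric phase phi. A small sketch of that conversion; the wavelength and phase samples are hypothetical, not NGGM parameters.

```python
# Sketch: converting unwrapped heterodyne-interferometer phase to displacement.
# With a retroreflector the optical path changes by twice the displacement,
# hence d = phase * wavelength / (4 * pi).
import numpy as np

WAVELENGTH = 1.064e-6  # metres (hypothetical Nd:YAG-like source)

phase = np.unwrap([0.0, 1.2, 2.9, 5.8, 7.1])    # measured phase samples (rad)
displacement = phase * WAVELENGTH / (4 * np.pi)
print(displacement * 1e9, "nm")                 # distance variation per sample
```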

  2. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set protocol), which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently

  3. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates a direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  4. COMSY - A Software Tool for Aging and Plant Life Management

    International Nuclear Information System (INIS)

    Zander, Andre; Nopper, Helmut

    2012-01-01

    A plant-wide and systematic Aging and Plant Life Management is essential for the safe operation and/or availability of nuclear power plants. Aging Management (AM) has the objective of monitoring and controlling degradation effects for safety-relevant Systems, Structures and Components (SSCs) which may compromise safety functions of the plant. The Plant Life Management (PLM) methodology also includes aging surveillance for availability-relevant SSCs. AM and PLM cover mechanical components, electrical and I&C systems, and civil structures. All Aging and Plant Life Management rules call for a comprehensive approach, requiring the systematic collection of various aging- and safety-relevant data on a plant-wide basis. This data needs to be serviced and periodically evaluated. Due to the complexity of the process, this activity needs to be supported by a qualified software tool for the management of aging-relevant data and associated documents (approx. 30 000 SSCs). In order to support power plant operators, AREVA NP has developed the software tool COMSY. The COMSY software with its integrated AM modules enables the design and setup of a knowledge-based power plant model compatible with the requirements of international and national rules (e.g. IAEA Safety Guide NS-G-2.12, KTA 1403). In this process, a key task is to identify and monitor degradation mechanisms. For this purpose the COMSY tool provides prognosis and trending functions, which are based on more than 30 years of experience in the evaluation of degradation effects and numerous experimental studies. Since 1998 COMSY has been applied successfully in more than fifty reactor units in this field. The current version 3.0 was revised completely and offers additional AM functions. All aging-relevant component data are compiled and allocated via an integrated power plant model. Owing to existing interfaces to other software solutions and flexible import functions, COMSY is highly compatible with already existing data

  5. Joint Research on Scatterometry and AFM Wafer Metrology

    NARCIS (Netherlands)

    Bodermann, B.; Buhr, E.; Danzebrink, H.U.; Bär, M.; Scholze, F.; Krumrey, M.; Wurm, M.; Klapetek, P.; Hansen, P.E.; Korpelainen, V.; Van Veghel, M.; Yacoot, A.; Siitonen, S.; El Gawhary, O.; Burger, S.; Saastamoinen, T.

    2011-01-01

    Supported by the European Commission and EURAMET, a consortium of 10 participants from national metrology institutes, universities and companies has started a joint research project with the aim of overcoming current challenges in optical scatterometry for traceable linewidth metrology. Both

  6. HITCal: a software tool for analysis of video head impulse test responses.

    Science.gov (United States)

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool in the analysis and measurement of the saccadic video head impulse test (vHIT) responses; with the experience obtained during its use, the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The aim was to develop a (software) method to analyze and explore the vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed, extended head impulse exploration and measurement tools were created, and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also computed with HITCal. The authors have successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccade algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
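    Cohen's kappa, the agreement statistic quoted above, corrects raw agreement between two raters for the agreement expected by chance. A minimal sketch with hypothetical saccade/no-saccade calls, not HITCal data:

```python
# Sketch: Cohen's kappa between two binary raters (human vs algorithm).
from collections import Counter

def cohens_kappa(a, b):
    n = len(a)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # chance agreement: both raters pick the same label independently
    p_chance = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

human  = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # hypothetical saccade calls
hitcal = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0]
print(round(cohens_kappa(human, hitcal), 2))  # -> 0.58
```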

  7. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    Science.gov (United States)

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with completely automated software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.

  8. Metrology of human-based and other qualitative measurements

    Science.gov (United States)

    Pendrill, Leslie; Petersson, Niclas

    2016-09-01

    The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory with the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for knowledge challenge, for example, a particular concept in understanding physics or for product quality of a certain health care service. Multivariate methods, such as Principal Component

  9. Metrology of human-based and other qualitative measurements

    International Nuclear Information System (INIS)

    Pendrill, Leslie; Petersson, Niclas

    2016-01-01

    The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory with the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for knowledge challenge, for example, a particular concept in understanding physics or for product quality of a certain health care service. Multivariate methods, such as Principal Component
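    The dichotomous Rasch model invoked in the two records above places probe ability theta and item difficulty b on one common scale, with P(success) = exp(theta - b) / (1 + exp(theta - b)); the separation of probe and item parameters is what makes the measure invariant. A minimal sketch with hypothetical values:

```python
# Sketch: dichotomous Rasch model. theta is the probe (e.g. person) ability and
# b the item difficulty, both expressed in logits on a common scale.
import math

def rasch_p(theta, b):
    """Probability that a probe of ability theta succeeds on an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

theta = 0.8  # hypothetical probe ability (logits)
for b in (-1.0, 0.0, 1.0, 2.0):
    print(f"item difficulty {b:+.1f}: P(success) = {rasch_p(theta, b):.2f}")
```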

  10. La metrología en nuestras vidas

    OpenAIRE

    Jaramillo, Zaira

    2010-01-01

    At first glance, the word "Metrology" brings to mind the idea of weather conditions. Nothing could be further from the truth: Meteorology is the discipline that studies weather conditions, while Metrology is the one that studies measurements.

  11. Software tools for manipulating fe mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    This paper describes a set of software tools which we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the Tetgen tetrahedral mesh generator, as well as a mesh re-generator that we have developed for conversion of tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. By modifying the 2D mesh (the boundary of the cardiovascular organ) it is possible to perform virtual surgery; so, in the case of an aorta with an aneurysm, which we had received from the University Clinical Center in Heidelberg as multi-slice 64-CT scanner data, we removed the aneurysm and afterwards ran calculations on both geometrical models. The main idea of this methodology is creating a system that could be used in clinics.

  12. Hardware and software and machine-tool simulation with parallel structures mechanisms

    Directory of Open Access Journals (Sweden)

    Keba P.V.

    2016-12-01

    The range of applications of mechanisms with parallel structure is expanding all the time. The mechanisms of machine tools and manipulators are becoming more complicated, and it is necessary to improve their program-controlled modules. Closed-chain mechanisms are most widespread in robotic complexes, where a manipulator performs complicated spatial movements along a given trajectory. The range of uses is very wide, the most popular being sorting, welding and assembly, among others. However, the problem of designing the operating programs is still present even today, because the available post-processors are created only for the equipment that exists now, while new machine tool constructions appear every day and there is a need to control them. The problems associated with the use of hardware and software for mechanisms with parallel structure in computer-aided simulation are considered. A program for solving the inverse kinematics problem is designed. A new method of designing the control programs is found. The options for kinematic analysis methods and the calculated data obtained with computer mathematics systems are shown, with the «Tools Glide» software taken as an example.
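    For a parallel mechanism driven by linear actuators, the inverse kinematics mentioned above is direct: each actuator length is the distance from its base joint to the corresponding platform joint at the commanded pose. A minimal planar sketch, with hypothetical joint coordinates rather than the geometry of any machine in the paper:

```python
# Sketch: inverse kinematics of a planar parallel mechanism (3-RPR-like):
# given the platform pose, each actuator length is the base-to-platform
# joint distance. All coordinates are hypothetical.
import math

base = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.9)]                # fixed base joints (m)
platform_local = [(-0.1, -0.1), (0.1, -0.1), (0.0, 0.1)]   # joints in platform frame

def leg_lengths(x, y, phi):
    """Actuator lengths for platform position (x, y) and orientation phi (rad)."""
    c, s = math.cos(phi), math.sin(phi)
    lengths = []
    for (bx, by), (px, py) in zip(base, platform_local):
        wx = x + c * px - s * py   # platform joint in the world frame
        wy = y + s * px + c * py
        lengths.append(math.hypot(wx - bx, wy - by))
    return lengths

print(leg_lengths(0.5, 0.4, math.radians(5)))  # one commanded length per leg
```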

  13. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  14. Differential Evolution for Many-Particle Adaptive Quantum Metrology

    NARCIS (Netherlands)

    Lovett, N.B.; Crosnier, C.; Perarnau- Llobet, M.; Sanders, B.

    2013-01-01

    We devise powerful algorithms based on differential evolution for adaptive many-particle quantum metrology. Our new approach delivers adaptive quantum metrology policies for feedback control that are orders-of-magnitude more efficient and surpass the few-dozen-particle limitation arising in methods
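    A minimal sketch of the core differential evolution loop (the classic DE/rand/1/bin variant) that such policy searches build on; here it minimises a stand-in objective on R^3 rather than simulating an adaptive quantum measurement policy:

```python
# Sketch: differential evolution (DE/rand/1/bin) minimising a toy objective.
import random

def de_minimize(f, dim, pop_size=20, F=0.8, CR=0.9, generations=200):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            jrand = random.randrange(dim)  # ensure at least one mutated gene
            trial = [a[k] + F * (b[k] - c[k])
                     if (random.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            if f(trial) <= f(pop[i]):      # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

sphere = lambda v: sum(x * x for x in v)   # stand-in figure of merit
print(de_minimize(sphere, dim=3))          # should approach the origin
```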

  15. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis...

  16. Metrological issues in molecular radiotherapy

    International Nuclear Information System (INIS)

    D'Arienzo, Marco; Capogni, Marco; Smyth, Vere; Cox, Maurice; Johansson, Lena; Bobin, Christophe

    2014-01-01

    The therapeutic effect from molecular radiation therapy (MRT), on both tumour and normal tissue, is determined by the radiation absorbed dose. Recent research indicates that as a consequence of biological variation across patients the absorbed dose can vary, for the same administered activity, by as much as two orders of magnitude. The international collaborative EURAMET-EMRP project Metrology for molecular radiotherapy (MetroMRT) is addressing this problem. The overall aim of the project is to develop methods of calibrating and verifying clinical dosimetry in MRT. In the present paper an overview of the metrological issues in molecular radiotherapy is provided. (authors)

  17. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
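    Package-level build parallelism of the kind described above only requires knowing which packages are mutually independent. A minimal sketch that groups a dependency graph into topological levels whose members can be built concurrently; the package names are hypothetical and this is not CMT's implementation:

```python
# Sketch: group packages into topological levels; packages within one level
# have all their dependencies already built and can be compiled in parallel.
deps = {
    "Core": [],
    "Event": ["Core"],
    "Tracking": ["Core"],
    "Reco": ["Event", "Tracking"],
}

def build_levels(deps):
    levels, done = [], set()
    while len(done) < len(deps):
        ready = [p for p in deps
                 if p not in done and all(d in done for d in deps[p])]
        if not ready:
            raise ValueError("dependency cycle detected")
        levels.append(ready)
        done.update(ready)
    return levels

for i, level in enumerate(build_levels(deps)):
    print(f"level {i}: build in parallel -> {sorted(level)}")
```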

  18. Metrology and properties of engineering surfaces

    CERN Document Server

    Greenwood, J; Chetwynd, D

    2001-01-01

    Metrology and Properties of Engineering Surfaces provides in a single volume a comprehensive and authoritative treatment of the crucial topics involved in the metrology and properties of engineering surfaces. The subject matter is a central issue in manufacturing technology, since the quality and reliability of manufactured components depend greatly upon the selection and qualities of the appropriate materials as ascertained through measurement. The book can in broad terms be split into two parts; the first deals with the metrology of engineering surfaces and covers the important issues relating to the measurement and characterization of surfaces in both two and three dimensions. This covers topics such as filtering, power spectral densities, autocorrelation functions and the use of Fractals in topography. A significant proportion is dedicated to the calibration of scanning probe microscopes using the latest techniques. The remainder of the book deals with the properties of engineering surfaces and covers a w...
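    Two of the characterization tools named above, the autocorrelation function and the power spectral density, are easy to illustrate on a synthetic profile. A minimal sketch; the profile is synthetic and the normalisation is one simple choice among several, not necessarily the book's:

```python
# Sketch: autocorrelation and power spectral density of a 1D rough profile.
import numpy as np

n, dx = 1024, 1e-6                     # number of samples, sampling step (m)
rng = np.random.default_rng(0)
z = np.convolve(rng.normal(size=n), np.ones(16) / 16, mode="same")  # profile

z -= z.mean()
acf = np.correlate(z, z, mode="full")[n - 1:] / n   # one-sided autocorrelation
psd = np.abs(np.fft.rfft(z)) ** 2 * dx / n          # power spectral density
freqs = np.fft.rfftfreq(n, d=dx)                    # spatial frequencies (1/m)

print("Rq (rms roughness):", np.sqrt(np.mean(z ** 2)))
print("correlation length (samples):", np.argmax(acf < acf[0] / np.e))
```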

  19. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    A software tool for prediction of the surface topography of ball nose end milled surfaces was developed. The software tool is based on a simplified model of the ideal tool motion and neglects the effects due to run-out, static and dynamic deflections and error motions, but has the merit of generating as output a file in a format readable by a surface processor software (SPIP [2]), for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.
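    Under the same ideal-motion assumption, the dominant topography feature is the scallop left between adjacent passes: a ball nose cutter of radius R at stepover s leaves a cusp of height h = R - sqrt(R^2 - (s/2)^2). A small sketch with hypothetical cutter values:

```python
# Sketch: scallop (cusp) height between parallel passes of a ball nose cutter
# on a flat surface, ignoring run-out and deflections as the model above does.
import math

def scallop_height(R, s):
    """Cusp height for cutter radius R and stepover s (same units)."""
    return R - math.sqrt(R * R - (s / 2.0) ** 2)

R = 3.0e-3  # 6 mm ball nose cutter: radius 3 mm (hypothetical)
for s in (0.1e-3, 0.2e-3, 0.4e-3):
    print(f"stepover {s * 1e3:.1f} mm -> scallop {scallop_height(R, s) * 1e6:.2f} um")
```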

  20. Deep sub-wavelength metrology for advanced defect classification

    Science.gov (United States)

    van der Walle, P.; Kramer, E.; van der Donck, J. C. J.; Mulckhuyse, W.; Nijsten, L.; Bernal Arango, F. A.; de Jong, A.; van Zeijl, E.; Spruit, H. E. T.; van den Berg, J. H.; Nanda, G.; van Langen-Suurling, A. K.; Alkemade, P. F. A.; Pereira, S. F.; Maas, D. J.

    2017-06-01

    Particle defects are important contributors to yield loss in semi-conductor manufacturing. Particles need to be detected and characterized in order to determine and eliminate their root cause. We have conceived a process flow for advanced defect classification (ADC) that distinguishes three consecutive steps; detection, review and classification. For defect detection, TNO has developed the Rapid Nano (RN3) particle scanner, which illuminates the sample from nine azimuth angles. The RN3 is capable of detecting 42 nm Latex Sphere Equivalent (LSE) particles on XXX-flat Silicon wafers. For each sample, the lower detection limit (LDL) can be verified by an analysis of the speckle signal, which originates from the surface roughness of the substrate. In detection-mode (RN3.1), the signal from all illumination angles is added. In review-mode (RN3.9), the signals from all nine arms are recorded individually and analyzed in order to retrieve additional information on the shape and size of deep sub-wavelength defects. This paper presents experimental and modelling results on the extraction of shape information from the RN3.9 multi-azimuth signal such as aspect ratio, skewness, and orientation of test defects. Both modeling and experimental work confirm that the RN3.9 signal contains detailed defect shape information. After review by RN3.9, defects are coarsely classified, yielding a purified Defect-of-Interest (DoI) list for further analysis on slower metrology tools, such as SEM, AFM or HIM, that provide more detailed review data and further classification. Purifying the DoI list via optical metrology with RN3.9 will make inspection time on slower review tools more efficient.

  1. Metrology Department - DEMET

    International Nuclear Information System (INIS)

    1989-01-01

    This report presents the activities and purposes of the Metrology Department of the Institute of Radioprotection and Dosimetry of the Brazilian CNEN. Also presented are a list of services rendered by that department, the projects in progress, personnel and publications. (J.A.M.M.)

  2. EINSTEIN - Expert system for an Intelligent Supply of Thermal Energy in Industry. Audit methodology and software tool

    Energy Technology Data Exchange (ETDEWEB)

    Schweiger, Hans; Danov, Stoyan (energyXperts.NET (Spain)); Vannoni, Claudia; Facci, Enrico (Sapienza Univ. of Rome, Dept. of Mechanics and Aeronautics, Rome (Italy)); Brunner, Christoph; Slawitsch, Bettina (Joanneum Research, Inst. of Sustainable Techniques and Systems - JOINTS, Graz (Austria))

    2009-07-01

    For optimising thermal energy supply in industry, a holistic integral approach is required that includes possibilities of demand reduction by heat recovery and process integration, and an intelligent combination of efficient heat and cold supply technologies. EINSTEIN is a tool-kit for fast and high quality thermal energy audits in industry, composed of an audit guide describing the methodology and a software tool that guides the auditor through all the audit steps. The main features of EINSTEIN are: (1) a basic questionnaire helps with systematic collection of the necessary information, with the possibility to acquire data remotely; (2) special tools allow for fast consistency checking and estimation of missing data, so that already with very few data some first predictions can be made; (3) the data processing is based on standardised models for industrial processes and industrial heat supply systems; (4) semi-automation: the software tool supports decision making for the generation of alternative heat and cold supply proposals, carries out automatically all the necessary calculations, including dynamic simulation of the heat supply system, and creates a standard audit report. The software tool includes modules for benchmarking, automatic design of heat exchanger networks, and design assistants for the heat and cold supply system. The core of the expert system software tool is available for free, as an open source software project. This type of software development has proven very efficient for dissemination of knowledge and for continuous maintenance and improvement thanks to user contributions.

  3. Metrological Array of Cyber-Physical Systems. Part 3. Smart Energy-Efficient House

    Directory of Open Access Journals (Sweden)

    Ihor HNES

    2015-04-01

    Smart energy-efficient houses, as components of Cyber-Physical Systems, are being developed intensively. The main stream of progress lies in research on the energy supply of Smart houses. In this regard, the mentioned objects are advancing from passive houses through net-zero-energy houses to active houses that are capable of sharing their own accumulated energy with other components of Cyber-Physical Systems. We consider the problems of studying the metrology models and measuring the heat dissipation in such houses, trying to apply network and software achievements as well as new types of devices with improved characteristics.

  4. National Laboratory of Ionizing Radiation Metrology - Brazilian CNEN

    International Nuclear Information System (INIS)

    1992-01-01

    The activities of the Brazilian National Laboratory of Ionizing Radiations Metrology are described. They include research and development of metrological techniques and procedures, the calibration of area radiation monitors, clinical dosemeters and other instruments and the preparation and standardization of reference radioactive sources. 4 figs., 13 tabs

  5. 64nm pitch metal1 double patterning metrology: CD and OVL control by SEMCD, image based overlay and diffraction based overlay

    Science.gov (United States)

    Ducoté, Julien; Dettoni, Florent; Bouyssou, Régis; Le-Gratiet, Bertrand; Carau, Damien; Dezauzier, Christophe

    2015-03-01

    Patterning process control of advanced nodes has required major changes over the last few years. Process control needs for critical patterning levels since the 28nm technology node are extremely aggressive, showing that metrology accuracy/sensitivity must be finely tuned. The introduction of pitch splitting (Litho-Etch-Litho-Etch) at the 14nm FDSOI node requires the development of specific metrologies to adopt advanced process control (for CD, overlay and focus corrections). The pitch splitting process leads to final line CD uniformities that are a combination of the CD uniformities of the two exposures, while the space CD uniformities depend on both CD and OVL variability. In this paper, investigations of CD and OVL process control of the 64nm minimum pitch at the Metal1 level of the 14nm FDSOI technology, within the double patterning process flow (litho, hard mask etch, line etch), are presented. Various measurements with SEMCD tools (Hitachi) and overlay tools (KT for Image Based Overlay - IBO, and ASML for Diffraction Based Overlay - DBO) are compared. Metrology targets are embedded within a block instanced several times within the field to characterize intra-field process variations. Specific SEMCD targets were designed for independent measurement of both line CD (A and B) and space CD (A to B and B to A) for each exposure within a single measurement during the DP flow. Based on those measurements, the correlation between overlay determined with SEMCD and with standard overlay tools can be evaluated. Such correlation at different steps through the DP flow is investigated with respect to the metrology type. Process correction models are evaluated with respect to the measurement type and the intra-field sampling.
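    The statement that space CD mixes CD and overlay variability can be made concrete from the nominal line positions of the two exposures. A minimal sketch, assuming exposure B is nominally centred between A lines at line-to-line pitch p and that ovl is the signed overlay error of B (all numbers hypothetical):

```python
# Sketch: in Litho-Etch-Litho-Etch pitch splitting the two space CDs depend on
# both line CDs and the A-to-B overlay. p: line-to-line pitch, cd_a/cd_b: line
# widths, ovl: signed overlay error of exposure B (all in nm).
def space_cds(p, cd_a, cd_b, ovl):
    s = p - (cd_a + cd_b) / 2.0
    return s + ovl, s - ovl          # space A-to-B, space B-to-A

print(space_cds(64, 32.0, 32.0, 0.0))   # ideal: (32.0, 32.0)
print(space_cds(64, 33.0, 31.0, 1.5))   # CD + OVL errors split the two spaces
```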

  6. Using Case Study Videos as an Effective Active Learning Tool to Teach Software Development Best Practices (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Sushil Acharya

    2017-06-01

    The fundamental challenge in improving software quality lies in the people and processes that develop software products. Imparting real-world experiences in software development best practices to undergraduate students is often a challenge due to the lack of effective learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards. Certain best practices are difficult to comprehend through course lectures alone and are enhanced with supplemental learning tools. Realizing the necessity of such teaching tools, we designed and developed six delivery hours of case study videos for use in courses that impart knowledge on Software Verification & Validation (SV&V) topics, viz. requirements engineering and software reviews. We see case study videos as an effective active learning tool in our flipped classroom approach. We present our design of the case study video in its generic components, envisioning how it may be used in general. To evaluate our active learning tools we mapped the learning objectives of the case study videos to the expected learning outcomes for ABET accreditation of an undergraduate engineering program. Our implementation has been disseminated to partner institutions. Results of delivery in a faculty workshop and in two different university courses are shared.

  7. SafetyAnalyst : software tools for safety management of specific highway sites

    Science.gov (United States)

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  8. Fast and accurate: high-speed metrological large-range AFM for surface and nanometrology

    Science.gov (United States)

    Dai, Gaoliang; Koenders, Ludger; Fluegge, Jens; Hemmleb, Matthias

    2018-05-01

    Low measurement speed remains a major shortcoming of the scanning probe microscopic technique. It leads not only to low measurement throughput, but also to significant measurement drift over the long measurement time needed (up to hours or even days). To overcome this challenge, PTB, the national metrology institute of Germany, has developed a high-speed metrological large-range atomic force microscope (HS Met. LR-AFM) capable of measurement speeds up to 1 mm s‑1. This paper introduces the design concept in detail. After modelling scanning probe microscopic measurements, our results suggest that the signal spectrum of the surface to be measured is the spatial spectrum of the surface scaled by the scanning speed: the higher the scanning speed, the broader the spectrum to be measured. To realise an accurate HS Met. LR-AFM, our solution is to combine different stages/sensors synchronously in measurements, which provides a much larger spectrum area for high-speed measurement capability. Two application examples have been demonstrated. The first is a new concept called reference areal surface metrology. Using the developed HS Met. LR-AFM, surfaces are measured accurately and traceably at a speed of 500 µm s‑1 and the results are applied as a reference 3D data map of the surfaces. By correlating the reference 3D data sets and 3D data sets of tools under calibration, which are measured at the same surface, it has the potential to comprehensively characterise the tools, for instance, the spectrum properties of the tools. The investigation results of two commercial confocal microscopes are demonstrated, indicating very promising results. The second example is the calibration of a kind of 3D nano standard, which has spatially distributed landmarks, i.e. special unique features defined by 3D-coordinates. Experimental investigations confirmed that the calibration accuracy is maintained at a measurement speed of 100 µm s‑1, which improves the calibration efficiency by a
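    The scaling claim above is simply f_t = v * f_x: a surface wavelength lambda scanned at speed v appears in the detector signal at temporal frequency v / lambda. A small numeric sketch; the speed and wavelengths are illustrative:

```python
# Sketch: temporal signal frequency = scan speed x spatial frequency, so faster
# scanning of the same surface demands proportionally more detection bandwidth.
v = 500e-6                            # scan speed: 500 um/s
wavelengths = [10e-6, 1e-6, 100e-9]   # surface wavelengths of interest (m)

for lam in wavelengths:
    f_temporal = v / lam              # = v * f_spatial with f_spatial = 1/lam
    print(f"wavelength {lam * 1e9:8.0f} nm -> signal frequency {f_temporal:8.1f} Hz")
```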

  9. Process and system - A dual definition, revisited with consequences in metrology

    Science.gov (United States)

    Ruhm, K. H.

    2010-07-01

    Let us assert that metrological life could be easier, scientifically as well as technologically, if we intentionally made an explicit distinction between two outstanding domains: the given, really existent domain of processes, and the only virtually existent domain of systems, the latter of which is designed and used by the human mind. The abstract domain of models, by which we map the manifold reality of processes, is itself part of the domain of systems. Models support comprehension and communication, although they are normally extreme simplifications of the properties and behaviour of a concrete reality. So, systems and signals represent processes and quantities, which are described by means of Signal and System Theory as well as by Stochastics and Statistics. The following presentation of this new, demanding and somewhat irritating definition of the terms process and system as a dual pair is unusual indeed, but it opens the door widely to a better and more consistent discussion and understanding of manifold scientific tools in many areas. Metrology [4] is one of the important fields of concern, for many reasons: one group of the soft and hard links between the domain of processes and the domain of systems is realised by concepts of measurement science on the one hand and by instrumental tools of measurement technology on the other hand.

  10. Exploiting Software Tool Towards Easier Use And Higher Efficiency

    Science.gov (United States)

    Lin, G. H.; Su, J. T.; Deng, Y. Y.

    2006-08-01

    In developing countries, it is very important to make maximum use of data from instruments built in-country. This relates not only to maximizing the science return on earlier investment - deep accumulation in every aspect - but also to science output. Based on this idea, we are developing a software tool (called THDP: Tool of Huairou Data Processing). It is used for handling a series of issues that are commonly met in data processing. This paper discusses its design purpose, functions, methods and special features. The primary vehicle for general data interpretation is through various techniques of data visualization and interaction. In the software we employed an object-oriented approach, which is appropriate to this vehicle; it is imperative that the approach provide not only functionality, but do so in as convenient a fashion as possible. As a result, the software not only makes data processing easier to learn for beginners and further improvement more convenient for experienced users, but also greatly increases efficiency in every phase, including analysis, parameter adjustment and result display. Within the framework of the virtual observatory, developing countries should study more new related technologies that can advance the capability and efficiency of scientific research, like the software we are developing

  11. Metrology at Philip Morris Europe

    Directory of Open Access Journals (Sweden)

    Gualandris R

    2014-12-01

    The importance of the metrology function at Philip Morris Europe (PME), a multinational organisation producing at over 40 sites in the European, Middle Eastern and African regions, is presented. Standardisation of test methods and equipment, as well as the traceability of calibration gauges to the same reference gauge, are essential in order to obtain comparable results among the various production centres. The metrology function, as well as the qualification of instruments and the drafting of test and calibration operating procedures for this region, are conducted or co-ordinated by the Research and Development Department in Neuchatel, Switzerland. In this paper the metrology function within PME is presented based on the measurement of the resistance to draw, for which the PME R&D laboratory is accredited (ISO/CEI 17025) as both a calibration and a testing laboratory. The following topics are addressed in this paper: traceability of calibration standards to national standards; comparison of results among manufacturing centres; and the choice, budgeting and computation of uncertainties. Furthermore, some practical aspects related to the calibration and use of the glass multicapillary gauges are discussed.

  12. Metrology and quality control handbook

    International Nuclear Information System (INIS)

    Hofmann, D.

    1983-01-01

    This book tries to present the fundamentals of metrology and quality control in brief surveys. Compromises had to be made in order to reduce the material available to a sensible volume for the sake of clarity. This becomes evident in the following two restrictions which had to be made: First, in dealing with the theoretical principles of metrology and quality control, mere reference had to be made in many cases to the great variety of special literature, without discussing it in further detail. Second, in dealing with the application of metrology and quality control techniques in practice, only the basic quantities of the International System of Units (SI) could be taken into account as a rule. Some readers will note that many special measuring methods and equipment known to them are not included in this book. I do hope, however, that this shortcoming will prove to have a positive effect, too. This book will show the reader how to find the basic quantities and units from the derived quantities and units, and the steps that are necessary to solve any kind of measuring task. (orig./RW) [de]

  13. Flexible resources for quantum metrology

    Science.gov (United States)

    Friis, Nicolai; Orsucci, Davide; Skotiniotis, Michalis; Sekatski, Pavel; Dunjko, Vedran; Briegel, Hans J.; Dür, Wolfgang

    2017-06-01

    Quantum metrology offers a quadratic advantage over classical approaches to parameter estimation problems by utilising entanglement and nonclassicality. However, the hurdle of actually implementing the necessary quantum probe states and measurements, which vary drastically for different metrological scenarios, is usually not taken into account. We show that for a wide range of tasks in metrology, 2D cluster states (a particular family of states useful for measurement-based quantum computation) can serve as flexible resources that allow one to efficiently prepare any required state for sensing, and perform appropriate (entangled) measurements using only single qubit operations. Crucially, the overhead in the number of qubits is less than quadratic, thus preserving the quantum scaling advantage. This is ensured by using a compression to a logarithmically sized space that contains all relevant information for sensing. We specifically demonstrate how our method can be used to obtain optimal scaling for phase and frequency estimation in local estimation problems, as well as for the Bayesian equivalents with Gaussian priors of varying widths. Furthermore, we show that in the paradigmatic case of local phase estimation 1D cluster states are sufficient for optimal state preparation and measurement.

  14. Flexible resources for quantum metrology

    International Nuclear Information System (INIS)

    Friis, Nicolai; Orsucci, Davide; Skotiniotis, Michalis; Sekatski, Pavel; Dunjko, Vedran; Briegel, Hans J; Dür, Wolfgang

    2017-01-01

    Quantum metrology offers a quadratic advantage over classical approaches to parameter estimation problems by utilising entanglement and nonclassicality. However, the hurdle of actually implementing the necessary quantum probe states and measurements, which vary drastically for different metrological scenarios, is usually not taken into account. We show that for a wide range of tasks in metrology, 2D cluster states (a particular family of states useful for measurement-based quantum computation) can serve as flexible resources that allow one to efficiently prepare any required state for sensing, and perform appropriate (entangled) measurements using only single qubit operations. Crucially, the overhead in the number of qubits is less than quadratic, thus preserving the quantum scaling advantage. This is ensured by using a compression to a logarithmically sized space that contains all relevant information for sensing. We specifically demonstrate how our method can be used to obtain optimal scaling for phase and frequency estimation in local estimation problems, as well as for the Bayesian equivalents with Gaussian priors of varying widths. Furthermore, we show that in the paradigmatic case of local phase estimation 1D cluster states are sufficient for optimal state preparation and measurement. (paper)

  15. Metrology in electricity and magnetism: EURAMET activities today and tomorrow

    Science.gov (United States)

    Piquemal, F.; Jeckelmann, B.; Callegaro, L.; Hällström, J.; Janssen, T. J. B. M.; Melcher, J.; Rietveld, G.; Siegner, U.; Wright, P.; Zeier, M.

    2017-10-01

    Metrology dedicated to electricity and magnetism has changed considerably in recent years. It encompasses almost all modern scientific, industrial, and societal challenges, e.g. the revision of the International System of Units, the profound transformation of industry, changes in energy use and generation, health, and environment, as well as nanotechnologies (including graphene and 2D materials) and quantum engineering. Over the same period, driven by the globalization of worldwide trade, the Mutual Recognition Arrangement (referred to as the CIPM MRA) was set up. As a result, the regional metrology organizations (RMOs) of national metrology institutes have grown in significance. EURAMET is the European RMO and has been very prominent in developing a strategic research agenda (SRA) and has established a comprehensive research programme. This paper reviews the highlights of EURAMET in electrical metrology within the European Metrology Research Programme and its main contributions to the CIPM MRA. In 2012 EURAMET undertook an extensive roadmapping exercise for proposed activities for the next decade which will also be discussed in this paper. This work has resulted in a new SRA of the second largest European funding programme: European Metrology Programme for Innovation and Research.

  16. Optical vortex metrology for non-destructive testing

    DEFF Research Database (Denmark)

    Wang, W.; Hanson, Steen Grüner

    2009-01-01

    Based on the phase singularities in optical fields, we introduce a new technique, referred to as Optical Vortex Metrology, and demonstrate its application to nano-displacement, flow measurements and biological kinematic analysis.

  17. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on building software takes a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  18. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  19. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  20. Legal control scenario applied to embedded software in measuring instruments

    International Nuclear Information System (INIS)

    Castro, C.G. de; Brandao, P.C.; Leitao, F.O.

    2013-01-01

    This paper presents a scenario for the legal control of software in measuring instruments. Such control is hampered by intrinsic problems related to software analysis and verification. To circumvent these difficulties, several projects are being developed to attack different stages of legal control, such as type approval of models, periodic verifications and metrological expertise. The proposals that will arise from these projects will be discussed among the parties and may be incorporated into the measuring instruments. (author)

  1. PACMAN: PRIMA astrometric instrument software

    Science.gov (United States)

    Abuter, Roberto; Sahlmann, Johannes; Pozna, Eszter

    2010-07-01

    The dual-feed astrometric instrument software of PRIMA (PACMAN), currently being integrated at the VLTI, will use two spatially modulated fringe sensor units and a laser metrology system to carry out differential astrometry. Its software and hardware comprise a distributed system involving many real-time computers and workstations operating in a synchronized manner. Its architecture has been designed to allow the construction of efficient and flexible calibration and observation procedures. In parallel, a novel scheme for integrating M-code (MATLAB/OCTAVE) with standard VLT (Very Large Telescope) control software applications had to be devised in order to support numerically intensive operations and to adapt to fast-varying strategies and algorithms. This paper presents the instrument software, including the current operational sequences for laboratory calibration and sky calibration. Finally, a detailed description of the algorithms and their implementation, in both M and C code, is given together with a comparative analysis of their performance and maintainability.

  2. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
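
    As one concrete illustration of automated outlier flagging (the Data Quality Office's actual algorithms are not described in the record), a rolling-median screen with a MAD-based threshold is a common choice for time-series data streams:

```python
import numpy as np

def flag_outliers(values, window=25, nsigma=4.0):
    """Flag points far from a rolling median -- one simple way to screen
    a time series such as an ARM data stream (illustrative only, not the
    Data Quality Office's actual algorithm)."""
    values = np.asarray(values, dtype=float)
    flags = np.zeros(values.size, dtype=bool)
    half = window // 2
    for i in range(values.size):
        seg = values[max(0, i - half):i + half + 1]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) or 1e-12  # robust spread estimate
        # 1.4826 scales the MAD to a standard deviation for Gaussian data.
        flags[i] = abs(values[i] - med) > nsigma * 1.4826 * mad
    return flags
```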

  3. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. LC-MS/MS bioanalytical support for drug discovery, especially early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide the activity and liability data needed to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement, which depends on the particular experiments being conducted at this stage and is usually not as stringent as that required in bioanalysis supporting drug development. These attributes make HT discovery bioanalysis an ideal candidate for software and automation tools that eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  4. Integration of mask and silicon metrology in DFM

    Science.gov (United States)

    Matsuoka, Ryoichi; Mito, Hiroaki; Sugiyama, Akiyuki; Toyoda, Yasutaka

    2009-03-01

    We have developed a highly integrated method of mask and silicon metrology. The method adopts a metrology management system based on DBM (Design Based Metrology), which uses highly accurate contouring created by the edge detection algorithms of mask CD-SEM and silicon CD-SEM. We have verified the accuracy, stability and reproducibility of the integration experimentally; the accuracy is comparable with that of mask and silicon CD-SEM metrology. In this report, we introduce the experimental results and the application. As design rules for semiconductor devices shrink, OPC (Optical Proximity Correction) becomes aggressively dense in RET (Resolution Enhancement Technology). However, from the viewpoint of DFM (Design for Manufacturability), the cost of data processing for advanced MDP (Mask Data Preparation) and of mask production is a problem. This trade-off between RET and mask production is a major issue in the semiconductor market, especially in the mask business. In the silicon device production process, information sharing between the design and production sections is not fully organized: design data created with OPC and MDP should be linked to process control in production, but design data and process control data are optimized independently. We therefore provide a DFM solution: advanced integration of mask metrology and silicon metrology. The proposed system is composed of the following steps. 1) Design-based recipe creation: patterns to be measured are specified on the design data; this step is fully automated since it is interfaced with hot-spot coordinate information detected by various verification methods. 2) Design-based image acquisition: images of mask and silicon are acquired automatically by a CD-SEM recipe based on the pattern design; this step is robust because a wide range of design data is used for the image acquisition. 3) Contour profiling and GDS data generation: an image profiling process is applied to the acquired image based

  5. Radionuclide metrology: traceability and response to a radiological accident

    Energy Technology Data Exchange (ETDEWEB)

    Tauhata, L.; Cruz, P.A.L. da; Silva, C.J. da; Delgado, J.U.; Oliveira, A.E. de; Oliveira, E.M. de; Poledna, R.; Loureiro, J. dos S.; Ferreira Filho, A.L.; Silva, R.L. da; Filho, O. L.T.; Santos, A.R.L. dos; Veras, E.V. de; Rangel, J. de A.; Quadros, A.L.L.; Araújo, M.T.F. de; Souza, P.S. de; Ruzzarim, A.; Conceição, D.A. da; Iwahara, A., E-mail: palcruz@ird.gov.br [Instituto de Radioproteção e Dosimetria (LNMRI/IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiações Ionizantes

    2017-07-01

    In the case of a radiological accident, there are characteristic phases: discovery and initial assistance with first aid; triage and monitoring of the affected population; release of the affected people; referral of the victims to medical care; and preparation of the report on the accident, in addition to the associated studies and research performed in the later period. Monitors, dosimeters and measuring systems should be calibrated with standards of the contaminating radionuclides, and the radioactive sources used must be metrologically reliable. In Brazil, this function is performed by LNMRI/IRD/CNEN, designated by INMETRO, whose Radionuclide Metrology Laboratory is responsible for the standardization and supply of radioactive sources in diverse geometries and matrices. This laboratory keeps a stock of radionuclide solutions under controlled environmental conditions for the preparation of sources, which are calibrated and standardized by means of primary and secondary systems. It is also responsible for the dissemination of standards and, in order to establish the metrological traceability of national standards, participates in international key comparisons promoted by the BIPM and by regional metrology organizations. Internally, it promotes the National Comparison Programs for laboratories analysing environmental samples and provides traceability to production centers of radiopharmaceuticals and to Nuclear Medicine Services in the country. The paper presents the demand for {sup 137}Cs related to the radioactive accident in Goiania/Brazil and the significant results for the main radionuclides standardized by the Radionuclide Metrology Laboratory in international key comparisons and national comparisons to provide metrological traceability. With these results, the LNMRI of Brazil is integrated into the international BIPM metrology network and fulfills its function of supplying, with about a hundred radioactive standards, the country's needs in different applications.

  6. Radionuclide metrology: traceability and response to a radiological accident

    International Nuclear Information System (INIS)

    Tauhata, L.; Cruz, P.A.L. da; Silva, C.J. da; Delgado, J.U.; Oliveira, A.E. de; Oliveira, E.M. de; Poledna, R.; Loureiro, J. dos S.; Ferreira Filho, A.L.; Silva, R.L. da; Filho, O. L.T.; Santos, A.R.L. dos; Veras, E.V. de; Rangel, J. de A.; Quadros, A.L.L.; Araújo, M.T.F. de; Souza, P.S. de; Ruzzarim, A.; Conceição, D.A. da; Iwahara, A.

    2017-01-01

    In the case of a radiological accident, there are characteristic phases: discovery and initial assistance with first aid; triage and monitoring of the affected population; release of the affected people; referral of the victims to medical care; and preparation of the report on the accident, in addition to the associated studies and research performed in the later period. Monitors, dosimeters and measuring systems should be calibrated with standards of the contaminating radionuclides, and the radioactive sources used must be metrologically reliable. In Brazil, this function is performed by LNMRI/IRD/CNEN, designated by INMETRO, whose Radionuclide Metrology Laboratory is responsible for the standardization and supply of radioactive sources in diverse geometries and matrices. This laboratory keeps a stock of radionuclide solutions under controlled environmental conditions for the preparation of sources, which are calibrated and standardized by means of primary and secondary systems. It is also responsible for the dissemination of standards and, in order to establish the metrological traceability of national standards, participates in international key comparisons promoted by the BIPM and by regional metrology organizations. Internally, it promotes the National Comparison Programs for laboratories analysing environmental samples and provides traceability to production centers of radiopharmaceuticals and to Nuclear Medicine Services in the country. The paper presents the demand for 137Cs related to the radioactive accident in Goiania/Brazil and the significant results for the main radionuclides standardized by the Radionuclide Metrology Laboratory in international key comparisons and national comparisons to provide metrological traceability. With these results, the LNMRI of Brazil is integrated into the international BIPM metrology network and fulfills its function of supplying, with about a hundred radioactive standards, the country's needs in different applications.

  7. Design and implementation of a software tool intended for simulation and test of real time codes

    International Nuclear Information System (INIS)

    Le Louarn, C.

    1986-09-01

    The objective of real-time software testing is to reveal processing errors, unmet functional requirements and violated timing constraints in a code. In the perspective of safety analysis of nuclear power plant equipment, testing should be carried out independently of the physical process (which is generally not available), and random hardware failures must be considered. We propose here a simulation and test tool, entirely in software, with extensive interactive possibilities for testing assembly code running on a microprocessor. The OST (outil d'aide a la simulation et au Test de logiciels temps reel) simulates code execution and the behaviour of the hardware or software environment. Test execution is closely monitored and much useful information is automatically saved. After reviewing methods and tools dedicated to real-time software, this thesis describes the OST system in detail. We show the internal mechanisms and objects of the system, particularly ''events'' (which describe evolutions of the system under test) and mnemonics (which describe the variables). We then detail the interactive means available to the user for constructing the test data and the environment of the tested software. Finally, a prototype implementation is presented along with the results of the tests carried out, demonstrating the many advantages of an automatic tool over manual investigation. In conclusion, further developments necessary to complete the final tool are reviewed [fr

  8. Experimental Demonstration of Higher Precision Weak-Value-Based Metrology Using Power Recycling

    Science.gov (United States)

    Wang, Yi-Tao; Tang, Jian-Shun; Hu, Gang; Wang, Jian; Yu, Shang; Zhou, Zong-Quan; Cheng, Ze-Di; Xu, Jin-Shi; Fang, Sen-Zhi; Wu, Qing-Lin; Li, Chuan-Feng; Guo, Guang-Can

    2016-12-01

    Weak-value-based metrology is very promising and has attracted a lot of attention in recent years because of its remarkable signal-amplification ability. However, it has been suggested that the upper limit of the precision of this metrology cannot exceed that of classical metrology, because of the low sample size caused by probe loss during postselection. Nevertheless, a recent proposal shows that this probe loss can be reduced by the power-recycling technique, thus enhancing the precision of weak-value-based metrology. Here we experimentally realize the power-recycled interferometric weak-value-based beam-deflection measurement and obtain the amplitude of the detected signal and the white noise by discrete Fourier transform. Our results show that the detected signal can be strengthened by power recycling, and that the power-recycled weak-value-based signal-to-noise ratio can surpass the upper limit of the classical scheme, corresponding to the shot-noise limit. This work sheds light on higher-precision metrology and explores the real advantage of weak-value-based metrology over classical metrology.
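
    For background (standard textbook material, not taken from the record): the amplification exploited here stems from the weak value of an observable, which can lie far outside the eigenvalue spectrum when the pre- and postselected states are nearly orthogonal — at the price of a small postselection probability, i.e. the probe loss that power recycling mitigates.

```latex
% Weak value of observable A with preselected state |psi> and
% postselected state |phi> (standard definition); the amplification
% grows as the overlap <phi|psi> shrinks, while the postselection
% probability |<phi|psi>|^2 shrinks with it.
A_w = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle}
```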

  9. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
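
    The gamma index combines the DTA and DD criteria tested above into a single pass/fail quantity per point. A minimal one-dimensional sketch follows; real implementations add interpolation, low-dose thresholds and search-radius limits.

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta=3.0, dd=0.03):
    """Simplified 1-D global gamma index. dta is the distance-to-agreement
    criterion (mm) and dd the dose-difference criterion as a fraction of
    the global maximum reference dose."""
    ref_pos, ref_dose = np.asarray(ref_pos), np.asarray(ref_dose)
    dd_abs = dd * ref_dose.max()
    gammas = []
    for xe, de in zip(np.asarray(eval_pos), np.asarray(eval_dose)):
        dist2 = ((xe - ref_pos) / dta) ** 2    # normalized spatial term
        dose2 = ((de - ref_dose) / dd_abs) ** 2  # normalized dose term
        gammas.append(np.sqrt((dist2 + dose2).min()))
    return np.array(gammas)  # pass rate = fraction of points with gamma <= 1
```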

  10. Mirror surface metrology and polishing for AXAF/TMA

    International Nuclear Information System (INIS)

    Slomba, A.; Babish, R.; Glenn, P.

    1985-01-01

    The achievement of the derived goals for mirror surface quality on the Advanced X-ray Astrophysics Facility (AXAF), Technology Mirror Assembly (TMA) required a combination of state-of-the-art metrology and polishing techniques. In this paper, the authors summarize the derived goals and cover the main facets of the various metrology instruments employed, as well as the philosophy and technique used in the polishing work. In addition, they show how progress was measured against the goals, using the detailed error budget for surface errors and a mathematical model for performance prediction. The metrology instruments represented a considerable advance on the state-of-the-art and fully satisfied the error budget goals for the various surface errors. They were capable of measuring the surface errors over a large range of spatial periods, from low-frequency figure errors to microroughness. The polishing was accomplished with a computer-controlled process, guided by the combined data from various metrology instruments. This process was also tailored to reduce the surface errors over the full range of spatial periods

  11. A review of electronic engineering design free software tools

    OpenAIRE

    Medrano Sánchez, Carlos; Plaza García, Inmaculada; Castro Gil, Manuel Alonso; García Sevilla, Francisco; Martínez Calero, J.D.; Pou Félix, Josep; Corbalán Fuertes, Montserrat

    2010-01-01

    In this paper, we review electronic design free software tools. We have searched open source programs that help with several tasks of the electronic design flow: analog and digital simulation, schematic capture, printed circuit board design and hardware description language compilation and simulation. Using some rapid criteria for verifying their availability, we have selected some of them which are worth working with. This work intends to perform a deeper analysis of fre...

  12. Joint Research on Scatterometry and AFM Wafer Metrology

    Science.gov (United States)

    Bodermann, Bernd; Buhr, Egbert; Danzebrink, Hans-Ulrich; Bär, Markus; Scholze, Frank; Krumrey, Michael; Wurm, Matthias; Klapetek, Petr; Hansen, Poul-Erik; Korpelainen, Virpi; van Veghel, Marijn; Yacoot, Andrew; Siitonen, Samuli; El Gawhary, Omar; Burger, Sven; Saastamoinen, Toni

    2011-11-01

    Supported by the European Commission and EURAMET, a consortium of 10 participants from national metrology institutes, universities and companies has started a joint research project with the aim of overcoming current challenges in optical scatterometry for traceable linewidth metrology. Both experimental and modelling methods will be enhanced, and the different methods will be compared with each other and with specially adapted atomic force microscopy (AFM) and scanning electron microscopy (SEM) measurement systems in measurement comparisons. Additionally, novel methods for sophisticated data analysis will be developed and investigated to reach significant reductions of the measurement uncertainties in critical dimension (CD) metrology. One final goal will be the realisation of a wafer-based reference standard material for the calibration of scatterometers.

  13. European research project 'Metrology for radioactive waste management'

    International Nuclear Information System (INIS)

    Suran, J.

    2014-01-01

    The three-year European research project 'Metrology for Radioactive Waste Management' was launched in October 2011 under the EMRP (European Metrology Research Programme). It involves 13 European national metrology institutes, and its total budget exceeds four million Euros. The project is coordinated by the Czech Metrology Institute and is divided into five working groups. This poster presents the impact, excellence, relevance to EMRP objectives, and implementation and management of this project. (author)

  14. New software tool for dynamic radiological characterisation and monitoring in nuclear sites

    International Nuclear Information System (INIS)

    Szoeke, Istvan; Louka, Michael N.; Mark, Niels K.; Bryntesen, Tom R.; Bratteli, Joachim; Edvardsen, Svein T.; Gustavsen, Morten A.; Toppe, Aleksander L.; Johnsen, Terje; Rindahl, Grete

    2012-01-01

    The Halden Reactor Project (HRP) is a jointly sponsored international cooperation under the aegis of the Organisation for Economic Co-operation and Development - Nuclear Energy Agency. Extensive and valuable guidance and tools connected to the safe and reliable operation of nuclear facilities have been elaborated over the years within the frame of this programme. The HRP has achieved particularly strong results in virtual-reality-based tools for real-time area and personal monitoring. The techniques developed earlier are now being supplemented to enhance the planning and monitoring capabilities and to support general radiological characterisation of nuclear sites and facilities. Due to the complexity and abundance of the input information required, software tools dedicated to the radiological characterization of contaminated materials, buildings, land and groundwater are applied to review, evaluate and visualize the data. Characterisation of the radiation situation in a realistic environment can be very complex, and efficient visualisation of the data to the user is not straightforward. The monitoring and planning tools elaborated in the frame of the HRP feature sophisticated three-dimensional (3D) high-definition visualisation and user interfaces to promote easy interpretation of the input data. The visualisation tools permit dynamic visualisation of radiation fields in virtual or augmented reality by various techniques, and real-time personal monitoring of humanoid models. In addition, new techniques are being elaborated to visualise the 3D distribution of activities in structures and materials. The dosimetric algorithms feeding information to the visualisation and user interface of these planning tools include deterministic radiation transport techniques suitable for fast photon dose estimates when the physical, radiometric and spectrometric characteristics of the gamma sources are known. The basic deterministic model, implemented in earlier
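
    To make the idea of a fast deterministic photon dose estimate concrete, the sketch below uses a simple point-kernel model (inverse-square law with exponential attenuation and a buildup factor). This is generic textbook physics, not the HRP implementation, and all numerical values are illustrative.

```python
import math

def gamma_dose_rate(activity_bq, gamma_const, r_m, mu_per_m=0.0, t_m=0.0, buildup=1.0):
    """Point-kernel photon dose-rate estimate: inverse-square law with
    optional exponential shielding attenuation and a buildup factor.
    gamma_const is the nuclide's dose-rate constant in uSv*m^2/(h*Bq)."""
    return buildup * gamma_const * activity_bq * math.exp(-mu_per_m * t_m) / r_m ** 2

# Illustrative numbers only: a 3.7 GBq source 2 m away behind 5 cm of
# concrete (gamma_const of the order of Cs-137; buildup factor assumed).
rate = gamma_dose_rate(activity_bq=3.7e9, gamma_const=9.3e-8,
                       r_m=2.0, mu_per_m=18.0, t_m=0.05, buildup=1.3)
print(f"~{rate:.0f} uSv/h")
```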

  15. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    International Nuclear Information System (INIS)

    Tuura, L.A.; Taylor, L.

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. The authors describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. The authors also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects
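
    Ignominy's own scanner and metrics are not reproduced in the record; the flavour of the numerical metrics it reports can be illustrated with two classic coupling measures, fan-in and fan-out, computed over a hypothetical package dependency map:

```python
from collections import defaultdict

# Hypothetical dependency list (package -> packages it uses), standing in
# for what a scanner like Ignominy's would distill from a real source tree.
uses = {
    "Reco":     ["Geometry", "Utils"],
    "Geometry": ["Utils"],
    "Display":  ["Reco", "Utils"],
    "Utils":    [],
}

def fan_metrics(dep_map):
    """Compute fan-out (packages used) and fan-in (packages depending on it),
    two classic coupling metrics for spotting structural problems."""
    fan_in = defaultdict(int)
    for pkg, targets in dep_map.items():
        for t in targets:
            fan_in[t] += 1
    return {pkg: (len(targets), fan_in[pkg]) for pkg, targets in dep_map.items()}

for pkg, (out_deg, in_deg) in fan_metrics(uses).items():
    print(f"{pkg}: fan-out={out_deg}, fan-in={in_deg}")
```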

  16. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design...

  17. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
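
    As a minimal illustration of the pixel-wise curve fitting such a tool automates (a sketch, not MRmap's actual code), a mono-exponential T2 fit over a multi-echo image series can be written as:

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2):
    """Mono-exponential signal decay S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

def t2_map(echo_times, images):
    """Pixel-wise T2 fit for a multi-echo series shaped (n_echoes, ny, nx)."""
    n_echo, ny, nx = images.shape
    t2 = np.zeros((ny, nx))
    for y in range(ny):
        for x in range(nx):
            sig = images[:, y, x]
            if sig.max() <= 0:
                continue  # skip background pixels
            try:
                popt, _ = curve_fit(mono_exp, echo_times, sig,
                                    p0=(sig[0], 50.0), maxfev=2000)
                t2[y, x] = popt[1]
            except RuntimeError:
                pass  # leave non-converging pixels at zero
    return t2
```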

  18. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  19. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool: combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst becomes aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  20. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS, by enabling 4D visualization (3D space plus time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible: ParaView is used for the 4D visualization of the results, whereas die-away plots are analysed with the ROOT toolkit using a tool named "diana". To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed for converting the data to a format ParaView can read and for easing the visualization. (author)

  1. XPS and angle resolved XPS, in the semiconductor industry: Characterization and metrology control of ultra-thin films

    International Nuclear Information System (INIS)

    Brundle, C.R.; Conti, Giuseppina; Mack, Paul

    2010-01-01

    This review discusses the development of X-ray photoelectron spectroscopy, XPS, used as a characterization and metrology method for ultra-thin films in the semiconductor wafer processing industry. After a brief explanation of how the relative roles of XPS and Auger electron spectroscopy, AES, have changed over the last 15 years or so in the semiconductor industry, we go into some detail as to what is implied by metrology, as opposed to characterization, for thin films in the industry, and then describe how XPS, and particularly angle-resolved XPS, ARXPS, have been implemented as a metrology 'tool' for thickness, chemical composition, and non-destructive depth profiling of transistor gate oxide material, a key requirement in front-end processing. We take a historical approach, dealing first with the early use for SiO2 films on Si(100), then moving to silicon oxynitride, SiOxNy, in detail, and finally and briefly to HfO2-based material, which is used today in the most advanced devices (32 nm node).

  2. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open-source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which focuses on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
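
    The basic single-hazard building block behind such a risk computation — convolving a hazard curve with a fragility curve and exposure — can be sketched as follows. This is a generic illustration, not BYMUR's Bayesian machinery, and every input value below is hypothetical.

```python
import numpy as np

def expected_annual_loss(hazard_probs, intensities, fragility, loss_full, exposure):
    """Single-hazard risk integration: sum over intensity bins of
    P(intensity) * P(damage | intensity) * loss * exposure."""
    damage_prob = fragility(np.asarray(intensities))
    return float(np.sum(np.asarray(hazard_probs) * damage_prob * loss_full * exposure))

# Hypothetical inputs: three shaking-intensity bins and a toy fragility curve.
eal = expected_annual_loss(
    hazard_probs=[0.10, 0.02, 0.005],                   # annual bin probabilities
    intensities=[0.1, 0.3, 0.6],                        # peak ground acceleration [g]
    fragility=lambda i: np.clip(2.0 * i - 0.1, 0.0, 1.0),
    loss_full=1.0e6,                                    # cost of full damage [EUR]
    exposure=250,                                       # exposed buildings
)
print(f"expected annual loss ~ {eal:,.0f} EUR")
```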

  3. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    Science.gov (United States)

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development. © 2013 by The International Union of Biochemistry and Molecular Biology.

  4. Conceptual design finalisation of the ITER In-Vessel Viewing and Metrology System (IVVS)

    Energy Technology Data Exchange (ETDEWEB)

    Dubus, Gregory, E-mail: gregory.dubus@f4e.europa.eu [Fusion for Energy, c/ Josep Pla, n°2 - Torres Diagonal Litoral - Edificio B3, 08019 Barcelona (Spain); Puiu, Adrian; Damiani, Carlo; Van Uffelen, Marco; Lo Bue, Alessandro; Izquierdo, Jesus; Semeraro, Luigi [Fusion for Energy, c/ Josep Pla, n°2 - Torres Diagonal Litoral - Edificio B3, 08019 Barcelona (Spain); Martins, Jean-Pierre; Palmer, Jim [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    The In-Vessel Viewing and Metrology System (IVVS) is a fundamental tool for ITER machine operations, aimed at performing inspections and providing information on the erosion of in-vessel components. Periodically or on request, the IVVS probes will be deployed into the Vacuum Vessel from their storage positions (still within the ITER primary confinement) in order to perform both viewing and metrology on plasma-facing components (blanket, divertor, heating/diagnostic plugs, test blanket modules) and, more generically, to provide information on the status of the in-vessel components. In 2011, the IO proposed to simplify and strengthen the six IVVS port extensions situated at the divertor level. Among other important consequences, such as the relocation of the Glow Discharge Cleaning (GDC) electrodes to other levels of the machine, this major design change implied the need for a substantial redesign of the IVVS plug, which is part of an on-going effort to bring the integrated IVVS concept – including the scanning probe and its deployment system – to a level of maturity suitable for the Conceptual Design Review. This paper gives an overview of the various design and R and D activities in progress: plug design integration, probe concept validation under environmental conditions and development of a metrology strategy, all supported by nuclear analysis.

  5. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in single- or multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.

  6. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    Science.gov (United States)

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined
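
    For reference, the generalized equivalent uniform dose used above has a standard closed form (textbook background; the paper's exact notation is not shown in the record), where v_i is the fractional volume receiving dose d_i and a is a tissue-specific parameter:

```latex
% a = 1 gives the mean dose; large negative a approaches the minimum
% dose (relevant for tumors), large positive a the maximum dose
% (relevant for serial organs at risk).
\mathrm{gEUD} = \Bigl( \sum_i v_i \, d_i^{\,a} \Bigr)^{1/a}
```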

  7. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  8. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

    Full Text Available The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values) and XLS (Microsoft Excel 97-2003 Worksheet) files, applies transformations to those sources, and finally loads the data into the target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); such systems are called ETL systems.
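
    MyETL itself is a Java tool; as a language-neutral illustration of the extract-transform-load pattern it implements, the following sketch moves rows from a hypothetical CSV file into SQLite (the column names and cleanup rules are invented for the example):

```python
import csv
import sqlite3

def extract(path):
    """Read raw rows from the source CSV file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Apply per-row cleanup rules (examples invented for illustration)."""
    for row in rows:
        yield (row["name"].strip().title(), float(row["amount"]))

def load(rows, db_path="target.db"):
    """Insert the transformed rows into the target database."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    con.commit()
    con.close()

load(transform(extract("sales.csv")))  # hypothetical input file
```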

  9. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  10. Color and appearance metrology facility

    Data.gov (United States)

    Federal Laboratory Consortium — The NIST Physical Measurement Laboratory has established the color and appearance metrology facility to support calibration services for 0°/45° colored samples, 20°,...

  11. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2014-08-01

    Full Text Available The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks and on which the teacher can rely when teaching mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks from a limited number of tools, and fast automated checking are specified. Some methodological comments on the application of computer tools and methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the teacher's need to rethink forms and methods of training, the search for creative problems, the rational choice of environment, the checking of e-solutions, and common mistakes in the use of computer tools.

  12. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software systems, provides the foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on state-of-the-art results from the literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to the SRA design of two types of TSPACE: software architecting TSPACE and software implementation TSPACE...

  13. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    Full Text Available In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  14. Chemical metrology, strategic job for the Chilean Nuclear Energy Commission

    International Nuclear Information System (INIS)

    Gras, Nuri; Munoz, Luis; Cortes, Eduardo

    2001-01-01

    The National Standardization Institute's (INN) Metrology unit prepared a study in 1996 to evaluate the impact of metrological activity in Chile. This study was based on a survey of the supply and demand of metrological services and on studies of the behavior of the production system and technological services in Chile during the period 1990-1996. With the information obtained in this study, the economic impact resulting from the lack of a national metrology system could be evaluated. This impact was estimated to be a 5% loss in gross national product, equal to 125-500 million dollars, because of direct product rejection in the mining, fisheries, agricultural and manufacturing sectors. Chemical measurements are responsible for 50% of these losses. In response to this need, and coordinated by the INN, a metrological network of reference laboratories began to operate in 1997 for the principal physical quantities (mass, temperature, length and force), and a CORFO-FDI project that includes the chemical quantities began in 2001. The Chilean Nuclear Energy Commission, aware of the problem's importance and of the economic damage the country may suffer as a result of these deficiencies, has formed a Chemical Metrology Unit to provide technical support. It aims to raise the standards of local analytical laboratories by providing international recognition to the export sector. Nuclear analytical techniques are used as reference methods. This work describes the laboratories included in this Chemical Metrology Unit and their historical contribution to the development of local analytical chemistry. The national and international projects are described together with the publications they have generated. The quality assurance program applied in the laboratories, which has led to the accreditation of the analytical chemical assays, is described as well. The procedures used for the validation and uncertainty estimation of the nuclear methodologies are described together with

  15. Utilization of the research and measurement reactor Braunschweig for neutron metrology

    International Nuclear Information System (INIS)

    Alberts, W.G.

    1982-01-01

    The objectives of the Physikalisch-Technische Bundesanstalt (PTB) with regard to neutron metrology are briefly described. The use of the PTB's Research and Measuring Reactor as a neutron source for metrological purposes is discussed. Reference neutron beams are described which serve as irradiation facilities for the calibration of detectors for radiation protection purposes within the framework of the PTB's legal metrology work. (orig.) [de

  16. At-wavelength Optical Metrology Development at the ALS

    International Nuclear Information System (INIS)

    Yuan, Sheng Sam; Goldberg, Kenneth A.; Yashchuk, Valeriy V.; Celestre, Richard; Mochi, Iacopo; Macdougall, James; Morrison, Gregory Y.; Smith, Brian V.; Domning, Edward E.; McKinney, Wayne R.; Warwick, Tony

    2010-01-01

    Nano-focusing and brightness preservation for ever-brighter synchrotron radiation and free-electron laser beamlines require surface slope tolerances of x-ray optics on the order of 100 nrad. While the accuracy of fabrication and ex situ metrology of x-ray mirrors has improved over time, the in situ beamline performance of the optics is often limited by application-specific factors such as x-ray beam heat loading, temperature drift, alignment, vibration, etc. In the present work, we discuss recent results from the Advanced Light Source on developing high-accuracy, in situ, at-wavelength wavefront measurement techniques aimed at surpassing 100-nrad accuracy in surface slope measurements of reflecting x-ray optics. The techniques will ultimately allow closed-loop feedback systems to be implemented for x-ray nano-focusing. In addition, we present a dedicated metrology beamline endstation applicable to a wide range of in situ metrology and test experiments. The design and performance of a bendable Kirkpatrick-Baez (KB) mirror with active temperature stabilization are also presented. The mirror is currently used to study, refine, and optimize the in situ mirror alignment, bending and metrology methods essential for nano-focusing applications.

  17. Advanced applications of scatterometry based optical metrology

    Science.gov (United States)

    Dixit, Dhairya; Keller, Nick; Kagalwala, Taher; Recchia, Fiona; Lifshitz, Yevgeny; Elia, Alexander; Todi, Vinit; Fronheiser, Jody; Vaid, Alok

    2017-03-01

    The semiconductor industry continues to drive patterning solutions that enable devices with higher memory storage capacity, faster computing performance, and lower cost per transistor. These developments in semiconductor manufacturing, along with the overall minimization of transistor size, require continuous development of the metrology tools used to characterize these complex 3D device architectures. Optical scatterometry, or optical critical dimension (OCD) metrology, is one of the most prevalent inline metrology techniques in semiconductor manufacturing because it is quick, precise and non-destructive. However, at present OCD is predominantly used to measure feature dimensions such as line-width, height, side-wall angle, etc. of patterned nanostructures. Use of optical scatterometry for characterizing defects such as pitch-walking, overlay, line edge roughness, etc. is fairly limited. Inspection of process-induced abnormalities is a fundamental part of process yield improvement: it provides process engineers with important information about process errors and consequently helps optimize materials and process parameters. Scatterometry is an averaging technique, and extending it to measure the position of local process-induced defectivity and feature-to-feature variation is extremely challenging. This report is an overview of the applications and benefits of using optical scatterometry for characterizing defects such as pitch-walking, overlay and fin bending for advanced technology nodes beyond 7 nm. Currently, optical scatterometry is based on conventional spectroscopic ellipsometry and spectroscopic reflectometry measurements, but generalized ellipsometry or Mueller matrix spectroscopic ellipsometry data provide important additional information about complex structures that exhibit anisotropy and depolarization effects. In addition, the symmetry-antisymmetry properties associated with Mueller matrix (MM) elements

  18. DLP-based 3D metrology by structured light or projected fringe technology for life sciences and industrial metrology

    Science.gov (United States)

    Frankowski, G.; Hainich, R.

    2009-02-01

    Since the mid-eighties, a fundamental idea for achieving measuring accuracy in projected fringe technology has been to consider the projected fringe pattern as an interferogram and to evaluate it on the basis of advanced algorithms widely used for phase measurement in real-time interferometry. A fundamental requirement for obtaining a sufficiently high degree of measuring accuracy with this so-called "phase measuring projected fringe technology" is that the projected fringes, analogous to interference fringes, must have a cos²-shaped intensity distribution. Until the mid-nineties, this requirement presented a basic handicap for the wide application of projected fringe measurement in 3D metrology. This situation changed abruptly when Texas Instruments introduced to the market advanced digital light projection based on micro-mirror projection systems, so-called DLP technology, which also facilitated the generation and projection of cos²-shaped intensity and/or fringe patterns. With this DLP technology, which in its original approach was actually oriented towards completely different applications such as multimedia projection, Texas Instruments boosted phase-measuring fringe projection in optical 3D metrology to a worldwide breakthrough for both medical and industrial applications. The lecture presents the fundamental principles and the resulting advantages of optical 3D metrology based on phase-measuring fringe projection using DLP technology, and discusses applications of the measurement technology in medical engineering and industrial metrology.
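
    The phase evaluation itself follows standard phase-shifting interferometry. As a minimal sketch (the classic four-step variant; commercial systems differ in step count and calibration), four fringe images shifted by 90° each yield the wrapped phase pixel-wise:

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four fringe images captured at 0, 90, 180 and
    270 degree phase offsets: phi = atan2(I4 - I2, I1 - I3)."""
    return np.arctan2(i4 - i2, i1 - i3)

# Toy example: the projector displays four phase-shifted cosine patterns;
# the camera frames are combined pixel-wise into a wrapped phase map,
# which unwrapping and calibration then turn into surface height.
x = np.linspace(0.0, 4.0 * np.pi, 640)
surface_phase = 0.5 * np.sin(x / 3.0)                    # invented test surface
frames = [1.0 + np.cos(surface_phase + k * np.pi / 2) for k in range(4)]
phi = four_step_phase(*frames)                           # wrapped to (-pi, pi]
```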

  19. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available AMIDE's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's ability to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed, with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.
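
    The on-demand reslicing AMIDE performs can be approximated by a single resampling call; the sketch below uses SciPy's affine_transform with trilinear interpolation and is a generic stand-in, not AMIDE's actual API. Shapes, angles and shifts are illustrative.

```python
import numpy as np
from scipy.ndimage import affine_transform

def reslice(volume, rotation, shift, out_shape):
    """Resample a volume onto a rotated/shifted viewing grid.

    affine_transform maps output coords to input coords: x_in = R @ x_out + t.
    order=1 selects trilinear interpolation.
    """
    return affine_transform(volume, rotation, offset=shift,
                            output_shape=out_shape, order=1)

vol = np.random.rand(64, 64, 64)          # stand-in for a PET/CT/MRI volume
theta = np.deg2rad(10.0)
rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0, 0.0, 1.0]])
resliced = reslice(vol, rot_z, shift=(2.0, -1.5, 0.0), out_shape=vol.shape)
```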

  20. Slovak Institute of Metrology. Annual Report 2001

    International Nuclear Information System (INIS)

    Bily, M.

    2002-03-01

    A brief account of activities carried out by the Slovak Institute of Metrology (SMU) in 2001 is presented. These activities are reported under the headings: (1) Organisation identification; (2) Mission and medium-term perspectives; (3) Contract with the Slovak Office of Standards, Metrology and Testing of the Slovak Republic; (4) SMU activities; (5) Economic results; (6) Personnel management; (7) Aims and results of their fulfilment; (8) Evaluation and analysis of SMU development in 2001; (9) Main groups of output users; (10) Conclusion

  1. Objectives and functions of ionizing radiation metrology

    International Nuclear Information System (INIS)

    Rothe, H.

    1981-01-01

    Proceeding from the fundamental objectives of ionizing radiation metrology, the main tasks of metrological research and assurances of accurate measurements in dosimetry and activity determination are summarized. With a view to the technical performance of these tasks the state-of-the-art and the trends in reproduction and dissemination of dosimetric and activity units are outlined. Problems are derived that should be solved within the framework of the CMEA Standing Commissions on Standardization and on the Peaceful Uses of Atomic Energy. (author)

  2. Investigation into the use of smartphone as a machine vision device for engineering metrology and flaw detection, with focus on drilling

    Science.gov (United States)

    Razdan, Vikram; Bateman, Richard

    2015-05-01

    This study investigates the use of a Smartphone and its camera vision capabilities in Engineering metrology and flaw detection, with a view to developing a low-cost alternative to Machine vision systems, which are out of reach for small-scale manufacturers. A Smartphone has to provide a similar level of accuracy as Machine vision devices like Smart cameras. The objective set out was to develop an App on an Android Smartphone, incorporating advanced Computer vision algorithms written in Java code. The App could then be used for recording measurements of Twist Drill bits and hole geometry, and analysing the results for accuracy. A detailed literature review was carried out for an in-depth study of Machine vision systems and their capabilities, including a comparison between the HTC One X Android Smartphone and the Teledyne Dalsa BOA Smart camera. A review of the existing metrology Apps in the market was also undertaken. In addition, the drilling operation was evaluated to establish key measurement parameters of a Twist Drill bit, especially flank wear and diameter. The methodology covers software development of the Android App, including the use of image processing algorithms like Gaussian Blur, Sobel and Canny available from the OpenCV software library, as well as designing and developing the experimental set-up for carrying out the measurements. The results obtained from the experimental set-up were analysed for geometry of Twist Drill bits and holes, including diametrical measurements and flaw detection. The results show that Smartphones like the HTC One X have the processing power and the camera capability to carry out metrological tasks, although the dimensional accuracy achievable from the Smartphone App is below the level provided by Machine vision devices like Smart cameras. A Smartphone with mechanical attachments, capable of image processing and having a reasonable level of accuracy in dimensional measurement, has the potential to become a handy low-cost Machine vision
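
    The OpenCV pipeline named above (Gaussian Blur, Canny, contour analysis) reduces to a few library calls. A hedged sketch of a diameter measurement; the image file name and the mm-per-pixel calibration factor are assumptions:

```python
import cv2
import numpy as np

MM_PER_PX = 0.01  # hypothetical calibration factor from a reference target

# Blur to suppress noise, detect edges, then take the largest contour as
# the drill bit cross-section and estimate its diameter in millimetres.
img = cv2.imread("drill_bit.png", cv2.IMREAD_GRAYSCALE)
blur = cv2.GaussianBlur(img, (5, 5), 0)
edges = cv2.Canny(blur, 50, 150)
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)
(_, _), radius_px = cv2.minEnclosingCircle(largest)
print(f"estimated diameter: {2 * radius_px * MM_PER_PX:.3f} mm")
```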

  3. IT Security Standards and Legal Metrology - Transfer and Validation

    Science.gov (United States)

    Thiel, F.; Hartmann, V.; Grottker, U.; Richter, D.

    2014-08-01

    Legal Metrology's requirements can be transferred into the IT security domain by applying a generic set of standardized rules provided by the Common Criteria (ISO/IEC 15408). We will outline the transfer and cross-validation of such an approach. The integration of Legal Metrology's requirements into a recently developed Common Criteria based Protection Profile for a Smart Meter Gateway, designed under the leadership of Germany's Federal Office for Information Security, serves as an example. The requirements on utility meters laid down in the Measuring Instruments Directive (MID) are incorporated. A verification approach that checks whether Legal Metrology's requirements are met by their interpretation through the Common Criteria's generic requirements is also presented.

  4. Real cell overlay measurement through design based metrology

    Science.gov (United States)

    Yoo, Gyun; Kim, Jungchan; Park, Chanha; Lee, Taehyeong; Ji, Sunkeun; Jo, Gyoyeon; Yang, Hyunjo; Yim, Donggyu; Yamamoto, Masahiro; Maruyama, Kotaro; Park, Byungjun

    2014-04-01

    Until recent device nodes, lithography has been struggling to improve its resolution limit. Even though next generation lithography technology is now facing various difficulties, several innovative resolution enhancement technologies, based on the 193nm wavelength, were introduced and implemented to keep up the trend of device scaling. Scanner makers keep developing state-of-the-art exposure systems which guarantee higher productivity and meet a more aggressive overlay specification. "The scaling reduction of the overlay error has been a simple matter of the capability of exposure tools. However, it is clear that the scanner contributions may no longer be the majority component in total overlay performance. The ability to control correctable overlay components is paramount to achieve the desired performance.(2)" In a manufacturing fab, the overlay error, determined by a conventional overlay measurement using an overlay mark, based on IBO and DBO, often does not represent the physical placement error in the cell area of a memory device. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion, caused by etching or CMP, can also be a source of the mismatch. Therefore, the requirement for a direct overlay measurement in the cell pattern gradually increases in the manufacturing field, and also at the development level. In order to overcome the mismatch between the conventional overlay measurement and the real placement error of layer to layer in the cell area of a memory device, we suggest an alternative overlay measurement method utilizing a design-based metrology tool. A basic concept of this method is shown in figure 1. A CD-SEM measurement of the overlay error between layers 1 and 2 could be the ideal method, but it takes too long to extract a lot of data at wafer level. An E-beam based DBM tool provides high speed to cover the whole wafer with high repeatability. It is enabled by using the design as a

  5. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    Science.gov (United States)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U.S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user-supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June, 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points, including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  6. CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C

    2013-08-30

    A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
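
    To make the Bayesian workflow concrete: with a linear-quadratic dicentric yield curve and Poisson-distributed aberration counts, a posterior over dose can be computed on a grid. The coefficients and counts below are invented for illustration and are not CytoBayesJ's calibration data:

```python
import numpy as np
from scipy.stats import poisson

# Assumed linear-quadratic yield (aberrations per cell): Y(D) = c + a*D + b*D^2
c, alpha, beta = 0.001, 0.03, 0.06
n_cells, n_dics = 500, 42            # observed cells and dicentrics (invented)

dose = np.linspace(0, 6, 601)        # dose grid, Gy
expected = n_cells * (c + alpha * dose + beta * dose**2)
prior = np.ones_like(dose)           # flat prior on dose
likelihood = poisson.pmf(n_dics, expected)
posterior = prior * likelihood
posterior /= np.trapz(posterior, dose)   # normalize to a density over dose

map_dose = dose[np.argmax(posterior)]
print(f"MAP dose estimate: {map_dose:.2f} Gy")
```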

  7. Joint Research on Scatterometry and AFM Wafer Metrology

    OpenAIRE

    Bodermann, B.; Buhr, E.; Danzebrink, H.U.; Bär, M.; Scholze, F.; Krumrey, M.; Wurm, M.; Klapetek, P.; Hansen, P.E.; Korpelainen, V.; Van Veghel, M.; Yacoot, A.; Siitonen, S.; El Gawhary, O.; Burger, S.

    2011-01-01

    Supported by the European Commission and EURAMET, a consortium of 10 participants from national metrology institutes, universities and companies has started a joint research project with the aim of overcoming current challenges in optical scatterometry for traceable linewidth metrology. Both experimental and modelling methods will be enhanced and different methods will be compared with each other and with specially adapted atomic force microscopy (AFM) and scanning electron microscopy (SEM) m...

  8. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  9. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  10. SOFTWARE TOOL FOR LASER CUTTING PROCESS CONTROL – SOLVING REAL INDUSTRIAL CASE STUDIES

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2016-08-01

    Full Text Available Laser cutting is one of the leading non-conventional machining technologies with a wide spectrum of application in modern industry. In order to exploit a number of advantages that this technology offers for contour cutting of materials, it is necessary to carefully select laser cutting conditions for each given workpiece material, thickness and desired cut quality. In other words, there is a need for process control of laser cutting. After a comprehensive analysis of the main laser cutting parameters and process performance characteristics, the application of the developed software tool “BRUTOMIZER” for off-line control of the CO2 laser cutting process of three different workpiece materials (mild steel, stainless steel and aluminum) is illustrated. Advantages and abilities of the developed software tool are also illustrated.

  11. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  12. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  13. A Software module for pointwise calibration of free form objevts and for uncertainty representation

    DEFF Research Database (Denmark)

    Savio, Enrico; Farmer, A.F.; De Chiffre, Leonardo

    The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free form measurements, in collaboration with DIMEG, University of Padova, Italy. The present report describes a software module, MULTICAL, to be used for the calibration of free form objects. The purpose of the software is to calculate the uncertainty of free form measurements according to the method described in the draft standard ISO/WD 15530-6.
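
    A minimal sketch of the uncertainty-budget arithmetic underlying such a module: independent standard-uncertainty contributions are combined in quadrature and expanded with a coverage factor of k = 2. The component values are assumed, and ISO/WD 15530-6 prescribes considerably more detail than this:

```python
import math

# Illustrative standard-uncertainty components (mm); values are assumptions.
u_cal = 0.8e-3    # calibration of the reference object
u_proc = 1.2e-3   # measurement procedure repeatability
u_temp = 0.5e-3   # thermal effects

u_combined = math.sqrt(u_cal**2 + u_proc**2 + u_temp**2)  # quadrature sum
U_expanded = 2.0 * u_combined                             # k = 2, ~95 % coverage
print(f"U = {U_expanded * 1e3:.2f} um")
```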

  14. SAMPA: A free software tool for skin and membrane permeation data analysis.

    Science.gov (United States)

    Bezrouk, Aleš; Fiala, Zdeněk; Kotingová, Lenka; Krulichová, Iva Selke; Kopečná, Monika; Vávrová, Kateřina

    2017-10-01

    Skin and membrane permeation experiments comprise an important step in the development of a transdermal or topical formulation or in toxicological risk assessment. The standard method for analyzing these data relies on the linear part of a permeation profile. However, it is difficult to objectively determine when the profile becomes linear, or the experiment duration may be insufficient to reach a maximum or steady state. Here, we present a software tool for Skin And Membrane Permeation data Analysis, SAMPA, that is easy to use and overcomes several of these difficulties. The SAMPA method and software have been validated on in vitro and in vivo permeation data on human, pig and rat skin and model stratum corneum lipid membranes, using compounds that range from highly lipophilic polycyclic aromatic hydrocarbons to a highly hydrophilic antiviral drug, with and without two permeation enhancers. The SAMPA performance was compared with the standard method using the linear part of the permeation profile and with a complex mathematical model. SAMPA is a user-friendly, open-source software tool for analyzing the data obtained from skin and membrane permeation experiments. It runs on a Microsoft Windows platform and is freely available as a Supporting file to this article. Copyright © 2017 Elsevier Ltd. All rights reserved.
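
    For contrast with SAMPA's approach, the standard method the abstract refers to amounts to a linear fit of the late part of the cumulative permeation profile, from which steady-state flux and lag time follow. A sketch with invented data and an assumed choice of linear region:

```python
import numpy as np

t = np.array([2, 4, 6, 8, 10, 12.0])            # sampling times, h
q = np.array([1.1, 3.0, 5.2, 7.1, 9.0, 11.1])   # cumulative amount, ug/cm^2

# Assume the profile is linear after 6 h and fit that portion only.
slope, intercept = np.polyfit(t[2:], q[2:], 1)
flux = slope                     # steady-state flux, ug/cm^2/h
lag_time = -intercept / slope    # x-intercept of the linear fit, h
print(f"flux = {flux:.2f} ug/cm2/h, lag time = {lag_time:.2f} h")
```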

  15. Laser metrology and optic active control system for GAIA

    Science.gov (United States)

    D'Angelo, F.; Bonino, L.; Cesare, S.; Castorina, G.; Mottini, S.; Bertinetto, F.; Bisi, M.; Canuto, E.; Musso, F.

    2017-11-01

    The Laser Metrology and Optic Active Control (LM&OAC) program has been carried out under ESA contract with the purpose of designing and validating a laser metrology system and an actuation mechanism to monitor and control, at the microarcsec level, the stability of the Basic Angle (the angle between the lines of sight of the two telescopes) of the GAIA satellite. As part of the program, a breadboard (including some EQM elements) of the laser metrology and control system has been built and submitted to functional, performance and environmental tests. In the following we describe the mission requirements, the system architecture, the breadboard design, and finally the performed validation tests. Conclusions and appraisals from this experience are also reported.

  16. Advanced software tool for the creation of a typical meteorological year

    International Nuclear Information System (INIS)

    Skeiker, Kamal; Ghani, Bashar Abdul

    2008-01-01

    The generation of a typical meteorological year is of great importance for calculations concerning many applications in the field of thermal engineering. In this context, the method proposed by Hall et al. is selected for generating typical data, and an improved criterion for the final selection of the typical meteorological month (TMM) is demonstrated. The final selection of the most representative year was done by examining a composite score S. The composite score was calculated as the weighted sum of the scores of the four meteorological parameters used. These parameters are air dry bulb temperature, relative humidity, wind velocity and global solar radiation intensity. Moreover, a new software tool using Delphi 6.0 has been developed, utilizing the Finkelstein-Schafer statistical method for the creation of a typical meteorological year for any site of concern, employing the improved criterion for the final selection of the typical meteorological month. The tool allows the user to perform this task without an intimate knowledge of all of the computational details. The final alphanumerical and graphical results are presented on screen, and can be saved to a file or printed as a hard copy. Using this software tool, a typical meteorological year was generated for Damascus, capital of Syria, as a test run example. The data used were obtained from the Department of Meteorology and cover a period of 10 years (1991-2000)
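
    The Finkelstein-Schafer statistic at the heart of this method compares a candidate month's empirical CDF with the long-term CDF of the same parameter; the composite score S is then the weighted sum of the per-parameter FS values. A sketch with synthetic temperature data (binning conventions vary between implementations):

```python
import numpy as np

def fs_statistic(long_term: np.ndarray, candidate: np.ndarray) -> float:
    """Finkelstein-Schafer statistic: mean absolute difference between the
    long-term empirical CDF and the candidate month's empirical CDF,
    evaluated at the candidate's daily values."""
    long_sorted = np.sort(long_term)
    cdf_long = np.searchsorted(long_sorted, candidate, side="right") / long_sorted.size
    cand_sorted = np.sort(candidate)
    cdf_cand = np.searchsorted(cand_sorted, candidate, side="right") / cand_sorted.size
    return float(np.mean(np.abs(cdf_long - cdf_cand)))

# Example: ten Januaries of daily mean temperature vs one candidate January.
rng = np.random.default_rng(0)
ten_years = rng.normal(8.0, 3.0, 310)
january_1995 = rng.normal(8.5, 2.5, 31)
print(fs_statistic(ten_years, january_1995))
```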

  17. Report of AAPM Task Group 162: Software for planar image quality metrology.

    Science.gov (United States)

    Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J

    2018-02-01

    The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, the modulation transfer function (MTF) using an edge test object, the DQE, and the effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built under the Macintosh OS X operating system. The software package contains all the source code to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of the inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
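
    As orientation to the NPS component, the usual estimator averages squared FFT magnitudes of mean-subtracted uniform-exposure ROIs; the sketch below omits the detrending and normalization details specified by TG162, and the pixel size and ROI data are invented:

```python
import numpy as np

def nps_2d(rois: np.ndarray, px: float) -> np.ndarray:
    """2D noise power spectrum from a stack of uniform-exposure ROIs.

    Follows the common definition NPS = <|FFT(I - mean)|^2> * px^2 / (Nx*Ny).
    """
    n, ny, nx = rois.shape
    spectra = []
    for roi in rois:
        fluct = roi - roi.mean()                 # remove the mean (DC) level
        f = np.fft.fftshift(np.fft.fft2(fluct))
        spectra.append(np.abs(f) ** 2)
    return np.mean(spectra, axis=0) * px**2 / (nx * ny)

rois = np.random.normal(1000, 10, size=(32, 128, 128))  # synthetic flat fields
nps = nps_2d(rois, px=0.1)  # assumed 0.1 mm pixel pitch
```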

  18. Advances in speckle metrology and related techniques

    CERN Document Server

    Kaufmann, Guillermo H

    2010-01-01

    Speckle metrology includes various optical techniques that are based on the speckle fields generated by reflection from a rough surface or by transmission through a rough diffuser. These techniques have proven to be very useful in testing different materials in a non-destructive way. They have changed dramatically during the last years due to the development of modern optical components, with faster and more powerful digital computers, and novel data processing approaches. This most up-to-date overview of the topic describes new techniques developed in the field of speckle metrology over the l

  19. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  20. New tools for digital medical image processing implemented in DIP software

    International Nuclear Information System (INIS)

    Araujo, Erica A.C.; Santana, Ivan E.; Lima, Fernando R.A.; Viera, Jose W.

    2011-01-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, stacking two-dimensional (2D) images to form three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among others. The computational dosimetry researcher rarely finds all these capabilities in a single software package, and this often slows the development of their research or leads to inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational exposure model led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computer image and do conversions. When the task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). The following paper presents the third version of the DIP software and emphasizes the new tools it implements. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally require an image as input and produce an image or an attribute as output. (author)
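
    Two of the listed tasks, quantization and threshold segmentation, are simple to sketch generically; the functions below are illustrative stand-ins, not DIP's implementation, and the SGI binary layout is not reproduced here:

```python
import numpy as np

def quantize(img: np.ndarray, levels: int) -> np.ndarray:
    """Map a float image in [0, 1] to `levels` discrete gray values."""
    return np.floor(img * (levels - 1) + 0.5) / (levels - 1)

def segment(stack: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Label voxels whose value falls inside [lo, hi] (e.g., one tissue)."""
    return ((stack >= lo) & (stack <= hi)).astype(np.uint8)

stack = np.random.rand(16, 64, 64)   # stand-in for a stack of CT/MRI slices
labels = segment(quantize(stack, 64), 0.4, 0.6)
```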

  1. On in-vivo skin topography metrology and replication techniques

    International Nuclear Information System (INIS)

    Rosen, B-G; Blunt, L; Thomas, T R

    2005-01-01

    Human skin metrology is an area of growing interest for many disciplines, both in research and for commercial purposes. Changes in skin topography are an early-stage diagnostic tool, not only for diseases, but also give an indication of the response to medical and cosmetic treatment. This paper focuses on the evaluation of in vivo and in vitro methodologies for accurate measurements of skin and outlines the quantitative characterisation of skin topography. The study shows the applicability of in-vivo skin topography characterisation, as well as its advantages and limitations compared to conventional replication techniques. Finally, aspects of stripe projection methodology and 3D characterisation are discussed as a background to the methodology proposed in this paper

  2. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Marshall, N.H.; Marwil, E.S.; Matthews, S.D.; Stacey, B.J.

    1990-01-01

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes to determine software quality metrics and to improve the code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential for individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce the complexity, achieve higher functional independence, and improve the code vectorization. (orig.)

  3. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    Science.gov (United States)

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.

  4. Comparison of asphere measurements by tactile and optical metrological instruments

    NARCIS (Netherlands)

    Bergmans, R.H.; Nieuwenkamp, H.J.; Kok, G.J.P.; Blobel, G.; Nouira, H.; Küng, A.; Baas, M.; Voert, M.J.A. te; Baer, G.; Stuerwald, S.

    2015-01-01

    A comparison of topography measurements of aspherical surfaces was carried out by European metrology institutes, other research institutes and a company as part of a European metrology research project. In this paper the results of this comparison are presented. Two artefacts were circulated, a

  5. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system which extracted several measurable characteristics of a program to a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs is raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  6. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a pr...... in building supporting infrastructure for GSE, and describe a proof of concept prototype.

  7. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
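
    Disproportionality scoring of the kind the tool generates can be illustrated with the proportional reporting ratio (PRR) over a 2x2 contingency table of drug-event report counts; the counts below are invented:

```python
# 2x2 contingency table of report counts (all values invented):
a = 40     # drug X reported with event Y
b = 960    # drug X, other events
c = 120    # other drugs with event Y
d = 98_880 # other drugs, other events

# PRR: rate of event Y among drug X reports relative to all other drugs.
prr = (a / (a + b)) / (c / (c + d))
print(f"PRR = {prr:.1f}")  # values well above 1 suggest a possible signal
```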

  8. Methods and tools used at the IPSN for the safety assessment of critical software

    International Nuclear Information System (INIS)

    Regnier, P.; Henry, J.Y.

    1998-01-01

    A significant feature of EDF's latest 1400MWe ''N4'' generation of pressurized water reactor (PWR) is the extensive use of computerized instrumentation and control, including a fully digital system for the reactor protection function. For the safety assessment of the software driving the operation of this digital reactor protection called SPIN, IPSN has developed and implemented a set of methods and tools. Using the lessons learned from this experience, IPSN has worked at improving those methods and tools, mainly trying to make them more automatic to use, and has participated in an international assessment exercise to test some other methods and tools, either new products on the market or self-developed products. As a result of these works, this paper presents an up to date overview of the IPSN methods and tools used for the assessment of safety critical software. This assessment, which consists of an analysis of all the documentation associated with the technical specifications and of a representative set of functions, is usually carried out in five steps: (1) critical examination of the documents, (2) evaluation of the quality of the code, (3) determination of the critical software components, (4) development of test cases and choice of testing strategy, (5) dynamic analysis (consistency and robustness). This paper also presents methods and tools developed or implemented by IPSN in order to: evaluate the completeness and consistency of specification and design documents written in natural language; build a model and simulate specification or design items; evaluate the quality of the source code; carry out FMEA analysis; run the binary code and perform tests (CLAIRE); perform random or mutational tests. (author)

  9. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    Science.gov (United States)

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. This data needs to be inspected for plausibility before data evaluation to detect putative sources of error e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple
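
    Two of the quality parameters QCScreen inspects, mass accuracy and retention time drift, reduce to simple per-feature arithmetic across a measurement sequence. A sketch with invented values:

```python
import numpy as np

# Mass accuracy: ppm deviation of measured m/z from a target feature's
# theoretical value, one entry per sample (values invented).
mz_theoretical = 445.1200
mz_measured = np.array([445.1196, 445.1203, 445.1210, 445.1187])
ppm_error = (mz_measured - mz_theoretical) / mz_theoretical * 1e6

# Retention time drift relative to the first sample in the sequence.
rt_measured = np.array([12.31, 12.33, 12.36, 12.41])  # min
rt_drift = rt_measured - rt_measured[0]

print(np.round(ppm_error, 2), np.round(rt_drift, 2))
```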

  10. Magnetic nanoparticles. Metrological aspects

    International Nuclear Information System (INIS)

    Nikiforov, V N; Nikiforov, A V; Oxengendler, B L; Turaeva, N N; Sredin, V G

    2011-01-01

    Experiments on the influence of iron oxide cluster size on the specific magnetic moment are performed. Both free and covered clusters are investigated. The experiments are interpreted on the basis of a core-shell model, by analogy to the Weizsaecker formula in nuclear physics. Metrological parameters for the investigation of cluster size are obtained.

  11. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules are guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal
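
    A toy decomposition in the spirit of the text separates per-die means (wafer-level variation), the average within-die pattern (systematic component), and the residual (random component). The layout and magnitudes below are invented, not from the record:

```python
import numpy as np

# Synthetic ILD thickness data: 25 identical dies of 8x8 sites each,
# composed of a wafer-level trend, a within-die pattern, and noise (nm).
rng = np.random.default_rng(1)
n_die, ny, nx = 25, 8, 8
pattern = np.outer(np.linspace(0, 30, ny), np.ones(nx))
wafer_trend = np.linspace(-20, 20, n_die)[:, None, None]
data = 800 + wafer_trend + pattern + rng.normal(0, 5, (n_die, ny, nx))

wafer_level = data.mean(axis=(1, 2), keepdims=True)  # per-die mean
within_die = (data - wafer_level).mean(axis=0)       # systematic die pattern
residual = data - wafer_level - within_die           # random component
print(np.ptp(within_die), residual.std())            # pattern range, noise
```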

  12. TSOM method for semiconductor metrology

    Science.gov (United States)

    Attota, Ravikiran; Dixson, Ronald G.; Kramar, John A.; Potzick, James E.; Vladár, András E.; Bunday, Benjamin; Novak, Erik; Rudack, Andrew

    2011-03-01

    Through-focus scanning optical microscopy (TSOM) is a new metrology method that achieves 3D nanoscale measurement sensitivity using conventional optical microscopes; measurement sensitivities are comparable to what is typical when using scatterometry, scanning electron microscopy (SEM), and atomic force microscopy (AFM). TSOM can be used in both reflection and transmission modes and is applicable to a variety of target materials and shapes. Nanometrology applications that have been demonstrated by experiments or simulations include defect analysis, inspection and process control; critical dimension, photomask, overlay, nanoparticle, thin film, and 3D interconnect metrologies; line-edge roughness measurements; and nanoscale movements of parts in MEMS/NEMS. Industries that could benefit include semiconductor, data storage, photonics, biotechnology, and nanomanufacturing. TSOM is relatively simple and inexpensive, has a high throughput, and provides nanoscale sensitivity for 3D measurements with potentially significant savings and yield improvements in manufacturing.

  13. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    Science.gov (United States)

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  14. Present status of metrology of electro-optical surveillance systems

    Science.gov (United States)

    Chrzanowski, K.

    2017-10-01

    There has been significant progress in equipment for testing electro-optical surveillance systems over the last decade. Modern test systems are increasingly computerized, employ advanced image processing and offer software support in the measurement process. However, one great challenge, in the form of relatively low accuracy, remains unsolved. It is quite common that different test stations, when testing the same device, produce different results. It can even happen that two testing teams, while working on the same test station with the same tested device, produce different results. Rapid growth of electro-optical technology, poor standardization, limited metrology infrastructure, the subjective nature of some measurements, fundamental limitations from the laws of physics, tendering rules and advances in artificial intelligence are major factors responsible for this situation. Regardless, the next decade should bring significant improvements, since improvement in measurement accuracy is needed to sustain the fast growth of electro-optical surveillance technology.

  15. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, and management of artifacts developed...... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS).

  16. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    Science.gov (United States)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  17. Neutron metrology in the HFR

    International Nuclear Information System (INIS)

    Kraakman, R.; Voorbraak, W.P.

    1993-04-01

    Additional to the in-core EXOTIC experiments, six irradiations of ceramic material, R212-001 to R212-006, have been performed in the PSF of the HFR. This note presents the neutron metrology results for these irradiations. (orig.)

  18. Metrology/viewing system for next generation fusion reactors

    International Nuclear Information System (INIS)

    Spampinato, P.T.; Barry, R.E.; Chesser, J.B.; Menon, M.M.; Dagher, M.A.

    1997-01-01

    Next generation fusion reactors require accurate measuring systems to verify sub-millimeter alignment of plasma-facing components in the reactor vessel. A metrology system capable of achieving such accuracy must be compatible with the vessel environment of high gamma radiation, high vacuum, elevated temperature, and magnetic field. This environment requires that the system must be remotely deployed. A coherent, frequency modulated laser radar system is being integrated with a remotely operated deployment system to meet these requirements. The metrology/viewing system consists of a compact laser transceiver optics module which is linked through fiber optics to the laser source and imaging units that are located outside of the harsh environment. The deployment mechanism is a telescopic-mast positioning system. This paper identifies the requirements for the International Thermonuclear Experimental Reactor metrology and viewing system, and describes a remotely operated precision ranging and surface mapping system
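
    The coherent, frequency-modulated laser radar mentioned here infers range from the beat frequency between the outgoing chirp and its return. A sketch of the underlying range equation, with all chirp parameters assumed for illustration:

```python
# FMCW range equation: a linear chirp of slope S (Hz/s) reflected from
# range R returns with beat frequency f_b = 2*R*S/c, so R = c*f_b/(2*S).
C = 299_792_458.0      # speed of light, m/s
chirp_slope = 2.0e12   # Hz/s, e.g. a 100 GHz sweep over 50 ms (assumed)
f_beat = 66_700.0      # measured beat frequency, Hz (assumed)

range_m = C * f_beat / (2 * chirp_slope)
print(f"range = {range_m:.4f} m")   # ~5 m with these numbers
```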

  19. Development of an analysis methodology applied to 4πβ-γ software coincidence data acquisition system

    International Nuclear Information System (INIS)

    Brancaccio, Franco; Dias, Mauro da Silva; Toledo, Fabio de

    2009-01-01

    The present work describes the new software methodology under development at the IPEN Nuclear Metrology Laboratory for radionuclide standardizations with the 4πβ-γ coincidence technique. The software includes the Coincidence Graphic User Interface (GUI) and the Coincidence Analysis Program. The first results for a 60Co sample measurement are discussed and compared to the results obtained with two different conventional coincidence systems. (author)
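
    The coincidence principle behind such software is compact enough to state in code: with beta efficiency ε_β and gamma efficiency ε_γ, the rates N_β = N0·ε_β, N_γ = N0·ε_γ and N_c = N0·ε_β·ε_γ give the activity N0 = N_β·N_γ/N_c, independent of the efficiencies. The counts below are invented:

```python
# Observed rates in counts/s (invented values for illustration).
n_beta, n_gamma, n_coinc = 95_000.0, 42_000.0, 39_900.0

activity = n_beta * n_gamma / n_coinc   # source activity, Bq
beta_efficiency = n_coinc / n_gamma     # inferred beta-channel efficiency
print(f"activity ~ {activity:.0f} Bq, beta efficiency ~ {beta_efficiency:.3f}")
```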

  20. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institute of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each of the tools are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and integration with other popular image processing tools are also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Interoperability: linking design and tolerancing with metrology.

    Science.gov (United States)

    Morse, Edward; Heysiattalab, Saeed; Barnard-Feeney, Allison; Hedberg, Thomas

    2016-01-01

    On October 30, 2014 the American National Standards Institute (ANSI) approved QIF v 2.0 (Quality Information Framework, version 2.0) as an American National Standard. Subsequently in early 2016 QIF version 2.1 was approved. This paper describes how the QIF standard models the information necessary for quality workflow across the full metrology enterprise. After a brief description of the XML 'language' used in the standard, the paper reports on how the standard enables information exchange among four major activities in the metrology enterprise (product definition; measurement planning; measurement execution; and the analysis and reporting of the quality data).

  2. Calibration of an interfacial force microscope for MEMS metrology : FY08-09 activities.

    Energy Technology Data Exchange (ETDEWEB)

    Houston, Jack E.; Baker, Michael Sean; Crowson, Douglas A.; Mitchell, John Anthony; Moore, Nathan W.

    2009-10-01

    Progress in MEMS fabrication has enabled a wide variety of force and displacement sensing devices to be constructed. One device under intense development at Sandia is a passive shock switch, described elsewhere (Mitchell 2008). A goal of all MEMS devices, including the shock switch, is to achieve a high degree of reliability. This, in turn, requires systematic methods for validating device performance during each iteration of design. Once a design is finalized, suitable tools are needed to provide quality assurance for manufactured devices. To ensure device performance, measurements on these devices must be traceable to NIST standards. In addition, accurate metrology of MEMS components is needed to validate mechanical models that are used to design devices to accelerate development and meet emerging needs. Progress towards a NIST-traceable calibration method is described for a next-generation, 2D Interfacial Force Microscope (IFM) for applications in MEMS metrology and qualification. Discussed are the results of screening several suitable calibration methods and the known sources of uncertainty in each method.

  3. Genomic Islands: an overview of current software tools and future improvements

    Directory of Open Access Journals (Sweden)

    Soares Siomar de Castro

    2016-03-01

    Full Text Available Microbes are highly diverse and widely distributed organisms. They account for ~60% of Earth's biomass, and new predictions point to the existence of 10¹¹ to 10¹² species, which are constantly sharing genes through several different mechanisms. Genomic Islands (GI) are critical in this context, as they are large regions acquired through horizontal gene transfer. Also, they present common features like genomic signature deviation, transposase genes, flanking tRNAs and insertion sequences. GIs carry large numbers of genes related to specific lifestyles and are commonly classified as Pathogenicity, Resistance, Metabolic or Symbiotic Islands. With the advent of next-generation sequencing technologies and the deluge of genomic data, many software tools have been developed that aim to tackle the problem of GI prediction, and they are all based on the prediction of common GI features. However, there is still room for the development of new software tools that implement new approaches, such as machine learning and pangenomics based analyses. Finally, GIs will always hold a potential application in every newly invented genomic approach, as they are directly responsible for much of the genomic plasticity of bacteria.

  4. Genomic Islands: an overview of current software tools and future improvements.

    Science.gov (United States)

    Soares, Siomar de Castro; Oliveira, Letícia de Castro; Jaiswal, Arun Kumar; Azevedo, Vasco

    2016-03-01

    Microbes are highly diverse and widely distributed organisms. They account for ~60% of Earth's biomass, and new predictions point to the existence of 10¹¹ to 10¹² species, which are constantly sharing genes through several different mechanisms. Genomic Islands (GI) are critical in this context, as they are large regions acquired through horizontal gene transfer. Also, they present common features like genomic signature deviation, transposase genes, flanking tRNAs and insertion sequences. GIs carry large numbers of genes related to specific lifestyles and are commonly classified as Pathogenicity, Resistance, Metabolic or Symbiotic Islands. With the advent of next-generation sequencing technologies and the deluge of genomic data, many software tools have been developed that aim to tackle the problem of GI prediction, and they are all based on the prediction of common GI features. However, there is still room for the development of new software tools that implement new approaches, such as machine learning and pangenomics based analyses. Finally, GIs will always hold a potential application in every newly invented genomic approach, as they are directly responsible for much of the genomic plasticity of bacteria.
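
    The genomic-signature-deviation feature common to these tools can be sketched as a sliding-window GC-content scan; real predictors add k-mer bias, codon usage, and the tRNA/transposase features noted above. The window size, step and threshold are assumptions:

```python
import random

def gc(seq: str) -> float:
    """Fraction of G and C bases in an uppercase DNA string."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def flag_windows(genome: str, win: int = 10_000, step: int = 5_000,
                 threshold: float = 0.05):
    """Yield windows whose GC content deviates from the genome mean."""
    mean_gc = gc(genome)
    for start in range(0, len(genome) - win + 1, step):
        window = genome[start:start + win]
        if abs(gc(window) - mean_gc) > threshold:
            yield start, start + win, gc(window)

# Demo on a synthetic genome (a real run would read a FASTA sequence).
random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(100_000))
print(sum(1 for _ in flag_windows(genome)))  # uniform genome: few/no flags
```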

  5. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than stated above; some examples are supporting a joint application development (JAD) group to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  6. The Need for Systematic Naming Software Tools for Exchange of Chemical Information

    Directory of Open Access Journals (Sweden)

    Andrey Yerin

    1999-09-01

    Full Text Available The availability of systematic names can enable the simple textual exchange of chemical structure information. The exchange of molecular structures in graphical format or connection tables has become well established in the field of cheminformatics and many structure drawing tools exist to enable this exchange. However, even with the availability of systematic naming rules, software tools to allow the generation of names from structures, and hopefully the reversal of these systematic names back to the original chemical structure, have been sorely lacking in capability and quality. Here we review the need for systematic naming as well as some of the tools and approaches being taken today in this area.

  7. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon them. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.
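
    A sketch of the kind of emittance analysis such a toolset automates: a thin-lens quadrupole-scan fit in which the measured beam size squared is a parabola in the integrated quad strength, and the rms emittance follows from the fitted beam-matrix elements. The drift length and synthetic beam matrix are illustrative assumptions, not CEBAF parameters or qsUtility internals.

        # Sketch of a thin-lens quadrupole-scan emittance fit. The beam size
        # squared at the screen is quadratic in the integrated quad strength K:
        #   sigma^2(K) = (1 - L*K)^2 s11 + 2 (1 - L*K) L s12 + L^2 s22
        import numpy as np

        L = 2.0                                     # quad-to-screen drift [m] (assumed)
        K = np.linspace(-2.0, 2.0, 9)               # integrated quad strengths [1/m]
        s11_t, s12_t, s22_t = 4e-8, -1e-8, 1e-8     # assumed "true" beam matrix
        sigma2 = (1 - L*K)**2 * s11_t + 2*(1 - L*K)*L*s12_t + L**2 * s22_t

        a, b, c = np.polyfit(K, sigma2, 2)          # parabola coefficients
        s11 = a / L**2
        s12 = (-b / (2 * L) - s11) / L
        s22 = (c - s11 - 2 * L * s12) / L**2
        emittance = np.sqrt(s11 * s22 - s12**2)     # geometric rms emittance [m rad]
        print(f"rms emittance = {emittance:.3e} m rad")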

  8. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Schreiber Stefan

    2009-03-01

    Full Text Available Abstract Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform.
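
    A minimal sketch of the GMFilter idea, removing wells whose overall signal intensity falls below a cutoff before genotype calling; the well names, intensities and cutoff are illustrative assumptions, not the program's actual criteria.

        # Sketch: drop low-signal wells before downstream genotype analysis.
        # Values are illustrative assumptions.
        wells = {"A1": 5300.0, "A2": 180.0, "B1": 4700.0, "B2": 90.0}
        CUTOFF = 500.0

        passed = {w: s for w, s in wells.items() if s >= CUTOFF}
        failed = sorted(set(wells) - set(passed))
        print(f"kept {len(passed)} wells, removed low-signal wells: {failed}")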

  9. Metrology for Grayscale Lithography

    International Nuclear Information System (INIS)

    Murali, Raghunath

    2007-01-01

    Three-dimensional microstructures find applications in diffractive optical elements, photonic elements, etc., and can be efficiently fabricated by grayscale lithography. Good process control is important for achieving the desired structures. Metrology methods for grayscale lithography are discussed. Process optimization for grayscale e-beam lithography is explored, and various process parameters that affect the grayscale process are discussed.

  10. A parallel and sensitive software tool for methylation analysis on multicore platforms.

    Science.gov (United States)

    Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín

    2015-10-01

    DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As the sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads onto DNA and analyzes DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is employed exclusively to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. The software is available in the form of C libraries and functions, together with instructions to compile and execute it, by sftp to anonymous@clariano.uv.es (password 'anonymous'). Contact: juan.orduna@uv.es or jdopazo@cipf.es.
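
    For illustration, a minimal Smith-Waterman local-alignment scorer of the kind HPG-Methyl reserves for the shortest, most ambiguous reads; the scoring values are assumptions, and real bisulphite mapping additionally treats C-to-T conversions asymmetrically, which this sketch omits.

        # Minimal Smith-Waterman local alignment score (no traceback).
        # Scoring parameters are illustrative assumptions.
        def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
            rows, cols = len(a) + 1, len(b) + 1
            H = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                    H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
                    best = max(best, H[i][j])
            return best

        print(smith_waterman("ACGTTTGC", "ACGTTGC"))  # high score: near-identical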

  11. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae [KEPCO E and C, Daejeon (Korea, Republic of)

    2011-08-15

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes, comprising a development process, a V/V process, a requirements traceability process, an automated document generation process, and a target importing process to the Programmable Logic Controller (PLC) platform. It provides critical safety software developers with a certified, domain-optimized, model-based development environment and the associated services to reduce the time and effort needed to develop software, covering debugging, simulation, code generation and document generation. It also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source code of ISODE is imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  12. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae

    2011-01-01

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes, comprising a development process, a V/V process, a requirements traceability process, an automated document generation process, and a target importing process to the Programmable Logic Controller (PLC) platform. It provides critical safety software developers with a certified, domain-optimized, model-based development environment and the associated services to reduce the time and effort needed to develop software, covering debugging, simulation, code generation and document generation. It also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source code of ISODE is imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  13. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-01-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the tools' existence and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
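
    A minimal sketch of the GUM-style propagation such toolboxes support: the combined standard uncertainty of R = V/I obtained from first-order sensitivity coefficients; all numbers are illustrative assumptions.

        # Sketch: GUM first-order propagation for R = V / I.
        # Values are illustrative assumptions.
        V, u_V = 10.0, 0.02     # volts, standard uncertainty
        I, u_I = 2.0, 0.005     # amperes, standard uncertainty

        R = V / I
        c_V = 1.0 / I            # dR/dV, sensitivity coefficient
        c_I = -V / I**2          # dR/dI, sensitivity coefficient
        u_R = ((c_V * u_V)**2 + (c_I * u_I)**2) ** 0.5

        print(f"R = {R:.4f} ohm, u(R) = {u_R:.4f} ohm (k=1)")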

  14. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
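
    A sketch of a global index in the spirit of the SQ model, weighting the accomplishment level of each SERVQUAL dimension by its importance and aggregating to one number; the weights and scores are illustrative assumptions, not the published SQ formulation.

        # Sketch: weighted global quality index over the five SERVQUAL
        # dimensions. Weights and accomplishment levels are assumptions.
        requirements = {            # (weight, accomplishment level 0..1)
            "tangibles":      (0.15, 0.80),
            "reliability":    (0.30, 0.70),
            "responsiveness": (0.20, 0.90),
            "assurance":      (0.20, 0.85),
            "empathy":        (0.15, 0.75),
        }

        total_w = sum(w for w, _ in requirements.values())
        global_index = sum(w * s for w, s in requirements.values()) / total_w
        print(f"global quality index = {global_index:.3f}")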

  15. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    Ding, Y.

    1996-01-01

    The paper describes two software modules serving as analysis tools for digital loose part monitoring systems. The first, the acoustic module, utilizes the multimedia features of modern personal computers to replay the digitally stored short-time bursts at sufficient length and in good quality; this is possible thanks to the so-called puzzle technique developed at ISTec. The second, the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the back-propagation algorithm. (author). 7 refs, 7 figs
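
    A sketch of the classification module's approach, training a multi-layer perceptron on burst parameters to assign events to pre-defined classes; the feature set, synthetic data and scikit-learn model are illustrative assumptions, not the ISTec implementation.

        # Sketch: MLP classification of acoustic bursts from a few burst
        # parameters (e.g. rise time, peak amplitude, duration). Data are
        # synthetic illustrative assumptions.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))                    # 3 burst parameters per event
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # two synthetic classes

        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        clf.fit(X[:150], y[:150])
        print("holdout accuracy:", clf.score(X[150:], y[150:]))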

  16. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    Full Text Available In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel) computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.
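
    A sketch of the object-composition idea using SciPy: a preconditioner is wrapped as a LinearOperator object and handed to a Krylov solver. A simple Jacobi preconditioner is substituted here for brevity; the paper's family is based on fast transforms, which this sketch does not implement.

        # Sketch: preconditioner as a composable object (LinearOperator)
        # passed to conjugate gradients. Jacobi stands in for the paper's
        # fast-transform preconditioners.
        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import cg, LinearOperator

        n = 100
        A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")  # 1D Laplacian
        b = np.ones(n)

        inv_diag = 1.0 / A.diagonal()
        M = LinearOperator((n, n), matvec=lambda x: inv_diag * x)  # Jacobi object

        x, info = cg(A, b, M=M, atol=1e-10)
        print("converged" if info == 0 else f"info={info}")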

  17. Advanced software tools for digital loose part monitoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Y [Institute for Safety Technology (ISTec) GmbH, Garching (Germany)

    1997-12-31

    The paper describes two software modules serving as analysis tools for digital loose part monitoring systems. The first, the acoustic module, utilizes the multimedia features of modern personal computers to replay the digitally stored short-time bursts at sufficient length and in good quality; this is possible thanks to the so-called puzzle technique developed at ISTec. The second, the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the back-propagation algorithm. (author). 7 refs, 7 figs.

  18. Coordinate metrology accuracy of systems and measurements

    CERN Document Server

    Sładek, Jerzy A

    2016-01-01

    This book focuses on effective methods for assessing the accuracy of both coordinate measuring systems and coordinate measurements. It mainly reports on original research work conducted by Sladek’s team at Cracow University of Technology’s Laboratory of Coordinate Metrology. The book describes the implementation of different methods, including artificial neural networks, the Matrix Method, the Monte Carlo method and the virtual CMM (Coordinate Measuring Machine), and demonstrates how these methods can be effectively used in practice to gauge the accuracy of coordinate measurements. Moreover, the book includes an introduction to the theory of measurement uncertainty and to key techniques for assessing measurement accuracy. All methods and tools are presented in detail, using suitable mathematical formulations and illustrated with numerous examples. The book fills an important gap in the literature, providing readers with an advanced text on a topic that has been rapidly developing in recent years. The book...

  19. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    Science.gov (United States)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models, but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict the dimensions of EUV resist patterns down to 18 nm half pitch by exploiting resist shrinkage patterns; these patterns could not be measured directly and accurately due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As a wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can therefore be very valuable in enabling timely actionable decisions such as rework, scrap, or feeding forward and back the predicted information, or information derived from it, to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
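
    A sketch of the predictive-metrology pattern described above: learning a mapping from a measured (shrunk) resist CD and process context to a reference CD with a regression model; the features, synthetic data and model choice are illustrative assumptions, not the paper's implementation.

        # Sketch: regression from measured (shrunk) CD plus context features
        # to a reference CD. Data are synthetic illustrative assumptions.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(1)
        measured_cd = rng.uniform(14.0, 22.0, 300)            # nm, after shrinkage
        dose = rng.uniform(18.0, 22.0, 300)                   # mJ/cm^2
        true_cd = 1.1 * measured_cd + 0.2 * dose + rng.normal(0, 0.2, 300)

        X = np.column_stack([measured_cd, dose])
        model = GradientBoostingRegressor().fit(X[:250], true_cd[:250])
        pred = model.predict(X[250:])
        print("RMSE [nm]:", float(np.sqrt(np.mean((pred - true_cd[250:])**2))))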

  20. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  1. Enhanced methodology of focus control and monitoring on scanner tool

    Science.gov (United States)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As technology nodes shrink from 14 nm to 7 nm, the reliability of tool monitoring techniques in advanced semiconductor fabs becomes more critical to achieving high yield and quality. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect particles, defects, and tool instability. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring includes running BaseLiner on a periodic basis to ensure proper tool stability. Focus measurement on YIELDSTAR, by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA), has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user processes. To further improve metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimal dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching control and monitoring under multiple illumination conditions opens an avenue to significantly reduce the number of Focus-Exposure Matrix (FEM) wafer exposures needed for best focus (BF) setup of a new product/layer.
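
    As background to the FEM-reduction claim, a sketch of how best focus is conventionally extracted from focus-exposure data: fit the parabolic CD-versus-focus response and take the vertex; the numbers are illustrative assumptions.

        # Sketch: best focus from a parabolic (Bossung-like) CD vs focus fit.
        # Values are illustrative assumptions.
        import numpy as np

        focus = np.linspace(-60, 60, 7)            # focus offsets [nm]
        cd = 20.0 - 4e-4 * (focus - 10.0)**2       # synthetic response, BF at +10 nm

        a, b, c = np.polyfit(focus, cd, 2)
        best_focus = -b / (2 * a)                  # vertex of the parabola
        print(f"best focus = {best_focus:.1f} nm")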

  2. Academic software development tools and techniques (Report on the 1st Workshop WASDeTT at ECOOP 2008)

    NARCIS (Netherlands)

    Wuyts, R.; Kienle, H.M.; Mens, K.; Brand, van den M.G.J.; Kuhn, A.; Eugster, P.

    2009-01-01

    The objective of the 1st International Workshop on Advanced Software Development Tools and Techniques (WASDeTT-1) was to provide interested researchers with a forum to share their tool building experiences and to explore how tools can be built more effectively and efficiently. The theme for this

  3. A software tool for evaluation of hydrogen ingress in CANDU pressure tubes

    International Nuclear Information System (INIS)

    Mihalache, Maria; Vasile, Radu; Deaconu, Mariea

    2009-01-01

    The prediction of hydrogen isotope concentrations in the body and in the rolled joints of operating pressure tubes, as a function of reactor hot hours, is very important in many fitness-for-service assessments and end-of-life estimates. The rolled joints are high-stress zones with potential for delayed hydride cracking. Predictive models for assessing the long-term deuterium ingress in both the body and the rolled joints of the pressure tubes have been implemented in a software tool, ROHID, developed at INR-Pitesti. ROHID is a PC-based Windows application with a user-friendly interface that predicts the equivalent hydrogen ingress for Zr-2.5Nb pressure tubes. It uses colour-coded reactor core maps to display the predicted deuterium concentration as a function of time for selected axial locations. Plots of deuterium versus axial location and time for individual pressure tubes are also available. In addition, the software tool can predict when the hydrogen terminal solid solubility (HTSS) is exceeded during hydride precipitation and dissolution, as a function of time and axial location. (authors)
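
    A sketch of the HTSS check such a tool performs per axial location, comparing a predicted equivalent hydrogen concentration against an Arrhenius-form solubility limit; the coefficients A and Q are illustrative assumptions, not the correlations used in ROHID.

        # Sketch: Arrhenius-form terminal solid solubility check.
        # Coefficients are illustrative assumptions.
        import math

        def htss(temp_k, A=1.0e5, Q=34500.0, R=8.314):   # [ppm]
            return A * math.exp(-Q / (R * temp_k))

        def exceeds_htss(conc_ppm, temp_k):
            return conc_ppm > htss(temp_k)

        print(exceeds_htss(conc_ppm=60.0, temp_k=560.0))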

  4. A software tool for soil clean-up technology selection

    International Nuclear Information System (INIS)

    Vranes, S.; Gonzalez-Valencia, E.; Lodolo, A.; Miertus, S.

    2002-01-01

    Soil remediation is a difficult, time-consuming and expensive operation. A variety of mature and emerging soil remediation technologies is available, and future trends in remediation will include continued competition among environmental service companies and technology developers, which will certainly further increase the number of clean-up options. Consequently, demand has grown for decision support tools that could help decision makers select the most appropriate technology for a specific contaminated site, before costly remedial actions are taken. Therefore, a software tool for soil clean-up technology selection is currently being developed with the aim of working closely with human decision makers (site owners, local community representatives, environmentalists, regulators, etc.) to assess the available technologies and preliminarily select the preferred remedial options. The analysis for the identification of the best remedial options is based on technical, financial, environmental, and social criteria. These criteria are ranked by all involved parties to determine their relative importance for a particular project. (author)
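
    A sketch of the multi-criteria ranking behind such a selection tool: stakeholder weights on the four criteria groups are applied to normalized technology scores, and the options are ranked by weighted sum; the technologies, scores and weights are illustrative assumptions.

        # Sketch: weighted-sum multi-criteria ranking of remediation options.
        # All values are illustrative assumptions.
        weights = {"technical": 0.35, "financial": 0.25,
                   "environmental": 0.25, "social": 0.15}

        scores = {   # normalized 0..1 per criterion
            "bioremediation":     {"technical": 0.6, "financial": 0.9,
                                   "environmental": 0.9, "social": 0.8},
            "soil_washing":       {"technical": 0.8, "financial": 0.6,
                                   "environmental": 0.6, "social": 0.6},
            "thermal_desorption": {"technical": 0.9, "financial": 0.4,
                                   "environmental": 0.5, "social": 0.5},
        }

        ranked = sorted(scores, key=lambda t: sum(weights[c] * scores[t][c]
                                                  for c in weights), reverse=True)
        print("preferred remedial options:", ranked)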

  5. Metrology network: a case study on the metrology network of defense and security from SIBRATEC

    International Nuclear Information System (INIS)

    Pereira, Marisa Ferraz Figueira

    2016-01-01

    This study focuses on understanding the effects of the infrastructure improvement of the network laboratories and the role of network management in offering support and metrological services to defense and security sector enterprises, within the project's purposes. It also aims to identify gaps in the offering of calibration and/or testing services against the demands of the defense and security industries, and to analyze the adequacy of the RDS project to those demands, with the purpose of contributing information for future actions. The research is qualitative, with exploratory characteristics, and based on a case study. It was structured in two parts, involving primary and secondary data collection. To collect the primary data, two questionnaires were prepared: one (Questionnaire A) for representatives of the five RDS laboratories, and the other (Questionnaire B) for contacts at 63 defense and security enterprises that need calibration and testing services and are possible customers of the RDS laboratories. Answers were obtained from four representatives of RDS laboratories and from 26 defense and security enterprises. The secondary data were obtained from documentary research. The analysis was based on five dimensions defined to organize and improve the understanding of the research setting: RDS project coverage, regional aspects, network management, metrological traceability, and the importance and visibility of RDS. The results indicated that, at that time, the performance of RDS did not interfere with the metrological traceability of the products of the defense and security enterprises that participated in the research. (author)

  6. DASAO: software tool for the management of safeguards, waste and decommissioning

    International Nuclear Information System (INIS)

    Noynaert, Luc; Verwaest, Isi; Libon, Henri; Cuchet, Jean-Marie

    2013-01-01

    Decommissioning of nuclear facilities is a complex process involving operations such as detailed surveys, decontamination and dismantling of equipment, demolition of buildings, and management of the resulting waste and nuclear materials, if any. This process takes place in a well-developed legal framework and is controlled and followed up by stakeholders such as the Safety Authority, the radwaste management agency and the safeguards organism. In the framework of its nuclear waste and decommissioning programme, and more specifically the decommissioning of the BR3 reactor, SCK-CEN has developed different software tools to secure waste and material traceability, to support the sound management of the decommissioning project, and to facilitate control and follow-up by the stakeholders. In the case of Belgium, these are the Federal Agency for Nuclear Control, the National Agency for radioactive waste management and fissile material, EURATOM and the IAEA. In 2005, Belgonucleaire decided to shut down its Dessel MOX fuel fabrication plant, and production stopped in 2006. According to the final decommissioning plan ('PDF') approved by NIRAS, the decommissioning works were to start in 2008 at the earliest. In 2006, the management of Belgonucleaire identified the need for an integrated database and decided to entrust SCK-CEN with its development, because SCK-CEN could rely on previous experience with comparable applications already approved by authorities such as NIRAS, FANC and EURATOM. The main objectives of this integrated software tool are: - simplified and updated safeguards; - waste and material traceability; - computerized documentation; - support to project management; - periodic and final reporting to waste and safety authorities. The software, called DASAO (Database for Safeguards, Waste and Decommissioning), was successfully commissioned in 2008 and extensively used from 2009, to the satisfaction of Belgonucleaire and the stakeholders. SCK-CEN is

  7. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

    Full Text Available A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
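
    A minimal sketch of the optimization Mesa automates: precompute an expensive elementary function on a grid and answer calls by linear interpolation, trading a bounded approximation error for speed; the domain, table size and target function are illustrative assumptions.

        # Sketch: software lookup table with linear interpolation for sin(x)
        # on [0, 2*pi]. Parameters are illustrative assumptions.
        import math

        LO, HI, N = 0.0, 2.0 * math.pi, 4096
        STEP = (HI - LO) / N
        TABLE = [math.sin(LO + i * STEP) for i in range(N + 1)]

        def sin_lut(x):
            i, frac = divmod((x - LO) / STEP, 1.0)
            i = int(i)
            return TABLE[i] + frac * (TABLE[i + 1] - TABLE[i])

        # Empirical error check over the domain (excludes the right endpoint).
        worst = max(abs(sin_lut(LO + k * 1e-3) - math.sin(LO + k * 1e-3))
                    for k in range(int((HI - LO) / 1e-3)))
        print(f"max observed error: {worst:.2e}")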

  8. Establishment of a computer-controlled retroreflection measurement system at the National Metrology Institute of Turkey (UME)

    International Nuclear Information System (INIS)

    Samedov, Farhad; Celikel, Oguz; Bazkir, Ozcan

    2005-01-01

    In order to characterize the photometric properties of retroreflectors, a fully automated retroreflector measurement system was designed at the National Metrology Institute of Turkey (UME). The system is composed of a lighting projector, a goniometer, filter radiometers, 100 dB transimpedance amplifiers, and a 24-bit resolution analog-digital converter card with special software. The established system provides a new calibration capability for determining the luminous intensity and retroreflection coefficients of retroreflective materials, with expanded uncertainties of 1.07% and 1.13% (k=2), respectively. Traceability in retroreflection measurements was linked to the detector-based photometric scale of UME.

  9. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin's Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, four-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case-study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  10. Adjustment method for embedded metrology engine in an EM773 series microcontroller.

    Science.gov (United States)

    Blazinšek, Iztok; Kotnik, Bojan; Chowdhury, Amor; Kačič, Zdravko

    2015-09-01

    This paper presents the problems of implementation and adjustment (calibration) of a metrology engine embedded in NXP's EM773 series microcontroller. The metrology engine is used in a smart metering application to collect data about energy utilization and is controlled with the use of metrology engine adjustment (calibration) parameters. The aim of this research is to develop a method that enables operators to find and verify the optimum parameters ensuring the best possible accuracy. Properly adjusted (calibrated) metrology engines can then be used as a base for a variety of products used in smart and intelligent environments. This paper focuses on the problems encountered in the development, partial automation, implementation and verification of this method.

  11. Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium

    Energy Technology Data Exchange (ETDEWEB)

    Arpaia, P., E-mail: pasquale.arpaia@unina.it [Department of Electrical Engineering and Information Technology, University of Napoli Federico II, Naples (Italy); Technology Department, European Organization for Nuclear Research (CERN), Geneva (Switzerland); Girone, M., E-mail: mario.girone@cern.ch [Technology Department, European Organization for Nuclear Research (CERN), Geneva (Switzerland); Department of Engineering, University of Sannio, Benevento (Italy); Liccardo, A., E-mail: annalisa.liccardo@unina.it [Department of Electrical Engineering and Information Technology, University of Napoli Federico II, Naples (Italy); Pezzetti, M., E-mail: marco.pezzetti@cern.ch [Technology Department, European Organization for Nuclear Research (CERN), Geneva (Switzerland); Piccinelli, F., E-mail: fabio.piccinelli@cern.ch [Department of Mechanical Engineering, University of Brescia, Brescia (Italy)

    2015-12-15

    The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this aim, an uncertainty model of the transducer is presented, mainly comprising a valve model exploiting a finite-element approach and a virtual flowmeter model based on the Sereg-Schlumberger method. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the whole input range, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation, because the metrological design of the transducer is validated, and the components and working conditions with critical specifications for future improvements are identified.

  12. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    handling, having an accessible language, supporting the software as an education tool capable of facilitating the learning of the fundamental concepts of the theme. Other workshops are planned with participants from different educational institutions of the city of Sao Carlos, with the goal of broadening our sample.

  13. Advanced overlay analysis through design based metrology

    Science.gov (United States)

    Ji, Sunkeun; Yoo, Gyun; Jo, Gyoyeon; Kang, Hyunwoo; Park, Minwoo; Kim, Jungchan; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Maruyama, Kotaro; Park, Byungjun; Yamamoto, Masahiro

    2015-03-01

    As design rules shrink, overlay has become a critical factor for semiconductor manufacturing. However, the overlay error determined by a conventional measurement with an overlay mark, based on IBO or DBO, often does not represent the physical placement error in the cell area. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP can also be a source of the mismatch. In 2014, we demonstrated that measuring overlay in the cell area with a DBM (Design Based Metrology) tool gives more accurate overlay values than the conventional method using an overlay mark. We verified the reproducibility by measuring repeatable patterns in the cell area, and also demonstrated the reliability by comparing with CD-SEM data. Having focused until now on the overlay mismatch between the overlay mark and the cell area, we are now also concerned with cell areas having different pattern density and etch loading: different overlay values appear in cells with diverse patterning environments. In this paper, the overlay error was investigated from cell edge to center. For this experiment, we examined several critical layers in DRAM by using an improved DBM tool (with better resolution and speed), the NGR3520.

  14. Optical metrology tools for the Virgo project

    Science.gov (United States)

    Loriette, V.

    For more than thirty years the search for gravitational waves, predicted by Einstein's relativistic theory of gravitation, has been an intense research field in experimental as well as theoretical physics. Today, with the constant advance of technology in optics, lasers, data analysis and processing, a promising way of directly detecting gravitational waves with earth-based instruments is optical interferometry. Before the end of this century many experiments will be carried out in Australia, Europe, Japan and the United States to detect the passage of a gravitational wave through giant Michelson-type interferometers. The predicted effects are so small (a gravitational wave changes the length of three-kilometer-long arms by one thousandth of a fermi) that the need for "perfect" optical components is a key to the success of these experiments. Still a few years ago it would have been impossible to make optical components satisfying the specifications required for such interferometric detectors. Nearly ten years of constant R&D efforts in optical coating manufacturing, optical material fabrication and optical metrology allow us today to make such components. This text is intended to describe the field of optical metrology as it is needed for the testing of optical parts having performances far beyond anything previously made. The first chapter is an introduction to gravitational waves, their sources and their effects on detectors. Starting from Newtonian mechanics, we jump rapidly to the general theory of relativity and describe particular solutions of Einstein's equations in the case of weak gravitational fields, which are periodic perturbations of the space-time metric in the form of plane waves, the so-called gravitational waves. We present various candidate sources, terrestrial and extra-terrestrial, and give a short description of the two families of detectors: resonant bars and optical interferometers. The second part of this chapter

  15. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures, which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  16. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.

    2011-11-01

    Shrinking process windows in very large scale integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and the results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems.
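
    A sketch of model-based site prediction in this spirit: principal components of historical full-wafer maps are fitted to the few measured sites of a new wafer by least squares, and all remaining sites are predicted; the synthetic data and fixed site subset are illustrative assumptions, not the paper's selection algorithm.

        # Sketch: PCA model of wafer spatial variation, then prediction of
        # all sites from a small measured subset. Synthetic assumptions.
        import numpy as np

        rng = np.random.default_rng(2)
        n_wafers, n_sites, k = 200, 49, 3
        basis = rng.normal(size=(k, n_sites))            # latent spatial modes
        train = (rng.normal(size=(n_wafers, k)) @ basis
                 + 0.05 * rng.normal(size=(n_wafers, n_sites)))

        mean = train.mean(axis=0)
        U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
        P = Vt[:k]                                       # top-k components

        subset = [0, 12, 24, 36, 48]                     # measured site indices
        new_wafer = rng.normal(size=k) @ basis
        scores, *_ = np.linalg.lstsq(P[:, subset].T,
                                     (new_wafer - mean)[subset], rcond=None)
        pred = mean + scores @ P
        print("max prediction error:", float(np.max(np.abs(pred - new_wafer))))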

  17. TSOM Method for Nanoelectronics Dimensional Metrology

    International Nuclear Information System (INIS)

    Attota, Ravikiran

    2011-01-01

    Through-focus scanning optical microscopy (TSOM) is a relatively new method that transforms conventional optical microscopes into truly three-dimensional metrology tools for nanoscale-to-microscale dimensional analysis. TSOM achieves this by acquiring and analyzing a set of optical images collected at various focus positions going through focus (from above focus to under focus). The measurement resolution is comparable to what is possible with typical light scatterometry, scanning electron microscopy (SEM) and atomic force microscopy (AFM). The TSOM method is able to identify nanometer-scale differences between two nano/microscale targets, along with the type and magnitude of the difference, using a conventional optical microscope with visible-wavelength illumination. Numerous industries could benefit from the TSOM method, such as the semiconductor industry, MEMS, NEMS, biotechnology, nanomanufacturing, data storage, and photonics. The method is relatively simple and inexpensive, has a high throughput, provides nanoscale sensitivity for 3D measurements, and could enable significant savings and yield improvements in nanometrology and nanomanufacturing. Potential applications are demonstrated using experiments and simulations.
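
    A sketch of the TSOM construction: through-focus intensity profiles are stacked into a position-by-focus image, and the difference of two such images reveals a nanometer-scale change between targets; the Gaussian profile model and all parameters are illustrative assumptions.

        # Sketch: build two synthetic TSOM images and take their difference.
        # The profile model and numbers are illustrative assumptions.
        import numpy as np

        x = np.linspace(-1.0, 1.0, 201)          # lateral position [um]
        z = np.linspace(-2.0, 2.0, 41)           # focus positions [um]

        def tsom_image(width_um):
            # blur grows away from best focus; rows are through-focus profiles
            sigma = width_um + 0.2 * np.abs(z)[:, None]
            img = np.exp(-x[None, :] ** 2 / (2 * sigma ** 2))
            return img / img.mean()              # simple normalization

        diff = tsom_image(0.105) - tsom_image(0.100)   # 5 nm width difference
        print("peak differential signal:", float(np.abs(diff).max()))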

  18. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    Science.gov (United States)

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  19. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  20. Coordinate Metrology by Traceable Computed Tomography

    DEFF Research Database (Denmark)

    Müller, Pavel

    is an important factor for decision making about manufactured parts. However, due to many influences in CT, estimation of the uncertainty is a challenge, also because standardized procedures and guidelines are not yet available. In this thesis, several methods for uncertainty estimation were applied in connection with ... characterization and correction of measurement errors in the CT volume. Their application appeared to be suitable for this task. Because the two objects consist of ruby spheres and carbon fibre, CT scans did not produce image artifacts, and evaluation of sphere-to-sphere distances was robust. Several methods ... metrology and coordinate metrology, and is currently becoming a more and more important measuring technique for dimensional measurements. This is mainly due to the fact that with CT, a complete three-dimensional model of the scanned part is visualized using a computer in a relatively short time

  1. Metrology and analytical chemistry: Bridging the cultural gap

    International Nuclear Information System (INIS)

    King, Bernard

    2002-01-01

    Metrology in general and issues such as traceability and measurement uncertainty in particular are new to most analytical chemists and many remain to be convinced of their value. There is a danger of the cultural gap between metrologists and analytical chemists widening with unhelpful consequences and it is important that greater collaboration and cross-fertilisation is encouraged. This paper discusses some of the similarities and differences in the approaches adopted by metrologists and analytical chemists and indicates how these approaches can be combined to establish a unique metrology of chemical measurement which could be accepted by both cultures. (author)

  2. Improved capacity in ionizing radiation metrology at SANAEM

    International Nuclear Information System (INIS)

    Yucel, U.

    2014-01-01

    Full text: Turkey is planning to build nuclear power plants on its southern and northern coasts to supply its ever-increasing energy demand. The nuclear power plants based on old Soviet technology in Armenia and Bulgaria, close to Turkey's borders, also make constant monitoring of environmental radioactivity extremely important, due to public health and environmental contamination concerns. The Radiation Metrology Division at SANAEM was established in 2012 to provide uniformity and reliability of measurements in the field of ionizing radiation metrology, through R and D studies and by establishing, developing, maintaining and extending internationally accepted reference measurement standards and techniques

  3. Development of Software to Model AXAF-I Image Quality

    Science.gov (United States)

    Ahmad, Anees; Hawkins, Lamar

    1996-01-01

    This draft final report describes the work performed under delivery order number 145 from May 1995 through August 1996. The scope of work included a number of software development tasks for the performance modeling of AXAF-I. A number of new capabilities and functions have been added to the GT software, which is the command-mode version of the GRAZTRACE software originally developed by MSFC. A structural data interface has been developed for the EAL (formerly SPAR) finite element analysis (FEA) program, which is being used by the MSFC Structural Analysis group for the analysis of AXAF-I. This interface utility can read the structural deformation file from EAL and other finite element analysis programs such as NASTRAN and COSMOS/M, and convert the data to a format suitable for deformation ray tracing to predict the image quality for a distorted mirror. The utility includes a provision to expand the data from finite element models assuming 180-degree symmetry. It has been used to predict image characteristics for the AXAF-I HRMA when subjected to gravity effects in the horizontal x-ray ground test configuration. The development of the metrology data processing interface software has also been completed. It can read the HDOS FITS-format surface map files, manipulate and filter the metrology data, and produce a deformation file that can be used by GT for ray tracing of the mirror surface figure errors. This utility has been used to determine the optimum alignment (axial spacing and clocking) for the four pairs of AXAF-I mirrors. Based on this optimized alignment, the geometric images and effective focal lengths for the as-built mirrors were predicted to cross-check the results obtained by Kodak.
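
    A sketch of one step such a metrology interface performs: load a FITS surface map, remove piston and tilt with a least-squares plane fit, and report the residual rms figure error; the file name is a hypothetical placeholder, and HDOS-specific headers are not handled here.

        # Sketch: read a FITS surface map, remove piston/tilt, report rms.
        # "surface_map.fits" is a hypothetical placeholder file name.
        import numpy as np
        from astropy.io import fits

        surface = fits.getdata("surface_map.fits").astype(float)
        ny, nx = surface.shape
        yy, xx = np.mgrid[0:ny, 0:nx]

        A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(surface.size)])
        coef, *_ = np.linalg.lstsq(A, surface.ravel(), rcond=None)
        residual = surface.ravel() - A @ coef       # piston and tilt removed

        print(f"rms figure error: {residual.std():.3e} (map units)")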

  4. Sub-50 nm metrology on extreme ultra violet chemically amplified resist—A systematic assessment

    International Nuclear Information System (INIS)

    Maas, D. J.; Herfst, R.; Veldhoven, E. van; Fliervoet, T.; Meessen, J.; Vaenkatesan, V.; Sadeghian, H.

    2015-01-01

    With lithographic patterning dimensions decreasing well below 50 nm, it is highly important to understand metrology at such small scales. This paper presents results obtained from dense arrays of contact holes (CHs) with critical dimensions (CD) between 15 and 50 nm, patterned in a chemically amplified resist using an ASML EUV scanner and measured at ASML and TNO. To determine the differences between various (local) CD metrology techniques, we conducted an experiment using optical scatterometry, CD Scanning Electron Microscopy (CD-SEM), Helium Ion Microscopy (HIM), and Atomic Force Microscopy (AFM). CD-SEM requires advanced beam scan strategies to mitigate sample charging; the other tools did not need them. We discuss the main similarities and differences observed between the various techniques. To this end, we assessed the spatial frequency content of the raw images for SEM, HIM, and AFM. HIM and AFM resolve the highest spatial frequencies, which is attributed to the more localized probe-sample interaction of these techniques. Furthermore, the SEM, HIM, and AFM waveforms are analyzed in detail. All techniques show good mutual correlation, although the reported CD values differ significantly and systematically. HIM systematically reports a 25% higher CD uniformity number than CD-SEM for the same arrays of CHs, probably because HIM has a higher resolution than the CD-SEM used in this assessment. A significant speed boost for HIM and AFM is required before these techniques can serve demanding industrial metrology applications the way optical critical dimension (OCD) metrology and CD-SEM do today

  5. Automation of testing the metrological reliability of nondestructive control systems

    International Nuclear Information System (INIS)

    Zhukov, Yu.A.; Isakov, V.B.; Karlov, Yu.K.; Kovalevskij, Yu.A.

    1987-01-01

    The capabilities of microcomputers are used to solve the problem of testing control and measuring systems. Besides the main program, a data processing program for characterizing nondestructive control systems is written for the microcomputer. The program includes two modules. The first module contains test programs that determine the accuracy of the functional elements of the microcomputer and of the interface elements, issuing a message to the operator on the readiness of the elements for operation or the failure of a particular element. The second module includes calculational programs for determining the metrological reliability of the measuring channels, and a calculational subprogram for the random statistical measuring error, time instability and ''dead time''. Automation of testing the metrological reliability of nondestructive control systems increases the reliability of determining metrological parameters and reduces system testing time

  6. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed, and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), automatically registers and visualizes all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning is performed based on these segmentation data in the subsequent MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success, by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  7. Claire, a tool used for the simulation of events in software tests

    International Nuclear Information System (INIS)

    Henry, J.Y.; Boulc'h, J.; Raguideau, J.; Schoen, D.

    1994-06-01

    CLAIRE provides a purely software-based system that makes it possible to validate on-line applications at the level of either the specifications or the code. This tool offers easy graphic design of the application and of its environment. It carries out quite efficiently the simulation of any loaded model and controls its evolution either dynamically or with prerecorded timing. (TEC)

  8. In-line CD metrology with combined use of scatterometry and CD-SEM

    Science.gov (United States)

    Asano, Masafumi; Ikeda, Takahiro; Koike, Toru; Abe, Hideaki

    2006-03-01

    Measurement characteristics of scatterometry and CD-SEM for lot acceptance sampling in in-line critical dimension (CD) metrology were investigated using a statistical approach with Monte Carlo simulation. By operating characteristic curve analysis, the producer's risk and consumer's risk arising from sampling were clarified. Use of scatterometry alone involves a higher risk, particularly acute in the case of large intra-chip CD variation, because it cannot sufficiently monitor intra-chip CD variation, including local CD errors. Substituting scatterometry for conventional SEM metrology is therefore accompanied by risks, resulting in unnecessary additional cost. The combined use of scatterometry and SEM metrology, in which the sampling plan for SEM is controlled by scatterometry, is a promising approach from the viewpoint of suppressing both risk and cost, because CD errors lying in the distribution tails are efficiently caught.
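
    A minimal sketch of the kind of statistical analysis described above, written in Python with entirely hypothetical CD targets, spec limits and variation components (not the authors' simulation code): it estimates by Monte Carlo the probability that a lot is accepted under a given sampling plan, from which an operating characteristic curve and the producer's and consumer's risks can be read off.

```python
import numpy as np

rng = np.random.default_rng(0)

def acceptance_probability(true_mean_nm, n_sites, spec_lo, spec_hi,
                           inter_chip_sd, intra_chip_sd, n_trials=20000):
    """Monte Carlo estimate of P(lot accepted): n_sites CDs are sampled
    and the lot is accepted if the sample mean lies within spec."""
    accepted = 0
    for _ in range(n_trials):
        chip_offsets = rng.normal(0.0, inter_chip_sd, n_sites)
        local_errors = rng.normal(0.0, intra_chip_sd, n_sites)
        if spec_lo <= (true_mean_nm + chip_offsets + local_errors).mean() <= spec_hi:
            accepted += 1
    return accepted / n_trials

# Sweeping the true mean CD traces out the operating characteristic curve:
# producer's risk = 1 - P(accept) for an in-spec lot; consumer's risk =
# P(accept) for an out-of-spec lot.
for mean in (65.0, 67.0, 69.0):          # hypothetical 65 nm target
    p = acceptance_probability(mean, n_sites=9, spec_lo=62.0, spec_hi=68.0,
                               inter_chip_sd=1.0, intra_chip_sd=1.5)
    print(f"true mean {mean:4.1f} nm -> P(accept) = {p:.3f}")
```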

  9. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
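
    The technical core named above, the Manning equation for steady flow in an open channel, is compact enough to sketch directly. The following Python function is an illustration in SI units, not the GFT implementation, and the channel geometry values in the example are hypothetical.

```python
import math

def manning_discharge(n, area_m2, wetted_perimeter_m, slope):
    """Steady open-channel discharge Q = (1/n) * A * R^(2/3) * S^(1/2)
    in SI units, where R = A / P is the hydraulic radius."""
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

# Hypothetical channel cross-section: roughness n, flow area A,
# wetted perimeter P and channel slope S.
q = manning_discharge(n=0.035, area_m2=42.0, wetted_perimeter_m=18.5, slope=0.002)
print(f"discharge = {q:.1f} m^3/s")
```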

  10. The Act of 17 March 2000 on metrology and on changes and amendments of some acts

    International Nuclear Information System (INIS)

    2000-01-01

    This act on metrology, for the organization of uniformity and correctness of measurement, regulates (a) the legal measurement units, (b) the requirements on regulated gauges and their metrological control, (c) the conditions of official measurement, (d) the requirements on consumer package articles, (e) the conditions of authorization and registration, (f) the operation of the bodies of the state administration for metrology, (g) the metrological authority, and (h) the imposition of fines. This act shall come into effect on 1 July 2000.

  11. Optical metrology techniques for dimensional stability measurements

    NARCIS (Netherlands)

    Ellis, Jonathan David

    2010-01-01

    This thesis covers optical metrology techniques for determining material stability. In addition to displacement interferometry, topics such as periodic nonlinearity, Fabry-Perot interferometry, refractometry, and laser stabilization are covered.

  12. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and presents the results visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflect the high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
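
    The two goodness-of-fit measures quoted above are easy to reproduce. The following Python sketch uses made-up observation/prediction values and assumes that RE denotes a mean relative error, which is one plausible reading; it computes the Willmott index of agreement alongside it.

```python
import numpy as np

def willmott_d(observed, predicted):
    """Willmott index of agreement:
    d = 1 - sum((P-O)^2) / sum((|P-Obar| + |O-Obar|)^2); d = 1 is perfect."""
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    o_bar = o.mean()
    return 1.0 - np.sum((p - o) ** 2) / np.sum((np.abs(p - o_bar) + np.abs(o - o_bar)) ** 2)

def mean_relative_error(observed, predicted):
    """Mean of |P - O| / |O| (one plausible definition of RE)."""
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    return np.mean(np.abs(p - o) / np.abs(o))

# Hypothetical measured vs. model-predicted values:
obs = [10.2, 11.5, 12.1, 13.0]
pred = [10.0, 11.8, 12.3, 12.7]
print(f"d = {willmott_d(obs, pred):.3f}, RE = {mean_relative_error(obs, pred):.3f}")
```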

  13. Review of free software tools for image analysis of fluorescence cell micrographs.

    Science.gov (United States)

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface. © 2014 Fraunhofer-Institute for Integrated Circuits IIS Journal of Microscopy © 2014 Royal Microscopical Society.
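
    As the review notes, figure-ground separation in fluorescence micrographs is mostly achievable with thresholding. A minimal Python sketch using scikit-image follows; a synthetic image stands in for a real micrograph, and this is not code from any of the reviewed tools. It applies Otsu thresholding and then labels connected components.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

def figure_ground(micrograph):
    """Separate fluorescent cells (figure) from background via Otsu
    thresholding, then label the connected components."""
    mask = micrograph > threshold_otsu(micrograph)
    labels = label(mask)          # each connected region gets an integer id
    return mask, labels

# Synthetic stand-in for a micrograph: two bright blobs on dark noise.
rng = np.random.default_rng(1)
img = rng.normal(10, 2, (128, 128))
img[30:50, 30:50] += 50
img[80:100, 70:95] += 60
mask, labels = figure_ground(img)
print("objects found:", labels.max())
```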

  14. Activities of IPEN Nuclear Metrology Laboratory

    International Nuclear Information System (INIS)

    Dias, M.S.; Koskinas, M.F.; Pocobi, E.; Silva, C.A.M.; Machado, R.R.

    1987-01-01

    The activities of the IPEN Nuclear Metrology Laboratory, whose principal objective is the determination of radionuclide activities for the supply of sources and radioactive solutions standardized in activity, are presented. The installed systems, the activity ranges and some of the standardized radionuclides are described. (C.G.C.) [pt]

  15. A decade of innovation with laser speckle metrology

    Science.gov (United States)

    Ettemeyer, Andreas

    2003-05-01

    Speckle Pattern Interferometry has evolved from an experimental substitute for holographic interferometry into a powerful problem-solving tool in research and industry. The rapid development of computer and digital imaging techniques, in combination with the miniaturization of the optical equipment, has led to new applications which had not been anticipated before. While classical holographic interferometry always required careful consideration of environmental conditions such as vibration, noise, light, etc., and could generally only be performed in the optical laboratory, it is now state of the art to handle portable speckle measuring equipment at almost any place. During the last decade, the change in design and technique has dramatically influenced the range of applications of speckle metrology and opened new markets. The integration of recent research results into speckle measuring equipment has led to handy instruments, simplified operation and high quality data output.

  16. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  17. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Data collection and the evaluation of those data are crucial for effective quality management, and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality, based on the local policy of the given health care facility and in relation to the nurse's actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e. categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes some educational tools to promote unified reporting of data by nurses. A description of this software and some educational and learning components of the tool is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion focuses on the usability of the online reporting tool and possible further development of the tool.

  18. Coherence enhanced quantum metrology in a nonequilibrium optical molecule

    Science.gov (United States)

    Wang, Zhihai; Wu, Wei; Cui, Guodong; Wang, Jin

    2018-03-01

    We explore the quantum metrology in an optical molecular system coupled to two environments with different temperatures, using a quantum master equation beyond secular approximation. We discover that the steady-state coherence originating from and sustained by the nonequilibrium condition can enhance quantum metrology. We also study the quantitative measures of the nonequilibrium condition in terms of the curl flux, heat current and entropy production at the steady state. They are found to grow with temperature difference. However, an apparent paradox arises considering the contrary behaviors of the steady-state coherence and the nonequilibrium measures in relation to the inter-cavity coupling strength. This paradox is resolved by decomposing the heat current into a population part and a coherence part. Only the latter, the coherence part of the heat current, is tightly connected to the steady-state coherence and behaves similarly with respect to the inter-cavity coupling strength. Interestingly, the coherence part of the heat current flows from the low-temperature reservoir to the high-temperature reservoir, opposite to the direction of the population heat current. Our work offers a viable way to enhance quantum metrology for open quantum systems through steady-state coherence sustained by the nonequilibrium condition, which can be controlled and manipulated to maximize its utility. The potential applications go beyond quantum metrology and extend to areas such as device designing, quantum computation and quantum technology in general.

  19. Speckle-based at-wavelength metrology of X-ray mirrors with super accuracy.

    Science.gov (United States)

    Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal

    2016-05-01

    X-ray active mirrors, such as bimorph and mechanically bendable mirrors, are increasingly being used on beamlines at modern synchrotron source facilities to generate either focused or "tophat" beams. In addition to optical tests in the metrology lab, it is becoming increasingly important to optimise and characterise active optics under actual beamline operating conditions. The recently developed X-ray speckle-based at-wavelength metrology technique has shown great potential here. The technique has been established and further developed at the Diamond Light Source and is increasingly being used to optimise active mirrors. Details of the X-ray speckle-based at-wavelength metrology technique and an example of its applicability in characterising and optimising a micro-focusing bimorph X-ray mirror are presented. Importantly, an unprecedented angular sensitivity in the range of two nanoradians for measuring the slope error of an optical surface has been demonstrated. Such a high-precision metrology technique will benefit the manufacturers of polished mirrors and also the optimization of beam shaping during experiments.

  20. Mycotoxin metrology: Gravimetric production of zearalenone calibration solution

    Science.gov (United States)

    Rego, E. C. P.; Simon, M. E.; Li, Xiuqin; Li, Xiaomin; Daireaux, A.; Choteau, T.; Westwood, S.; Josephs, R. D.; Wielgosz, R. I.; Cunha, V. S.

    2018-03-01

    Food safety is a major concern for countries developing metrology and quality assurance systems, including the contamination of food and feed by mycotoxins. To improve mycotoxin analysis and ensure metrological traceability, CRMs of calibration solutions should be used. The production of certified mycotoxin solutions is a major challenge due to the limited amount of standard available for conducting a proper purity study and due to the cost of standards. The CBKT project was started at the BIPM, and Inmetro gravimetrically produced one batch of zearalenone in acetonitrile (14.708 ± 0.016 μg/g, k=2) and conducted homogeneity, stability and value assignment studies.

  1. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    Science.gov (United States)

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral (C3D10) elements), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0, downloaded from www.biomedtown.org) in the calculation of the quadratic tetrahedron Jacobian and in the conversion of apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa of error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations than the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and its added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
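
    Tools of this kind map CT grey values to an apparent density and then to an elastic modulus per element. A minimal Python sketch of that mapping follows; the calibration coefficients and power-law constants are placeholders following one commonly cited form, not the values used by Bonemat or py_bonemat_abaqus.

```python
import numpy as np

def hu_to_density(hu, a=0.0, b=0.0008):
    """Linear CT calibration: apparent density [g/cm^3] = a + b * HU.
    The coefficients are hypothetical and scanner-specific."""
    return a + b * np.asarray(hu, dtype=float)

def density_to_modulus(rho, c=6850.0, d=1.49):
    """Power-law density-modulus relationship E [MPa] = c * rho^d.
    These constants follow one commonly cited form and are placeholders,
    not the coefficients of any particular tool."""
    return c * np.asarray(rho, dtype=float) ** d

# Mean HU sampled within each element -> one modulus value per element:
element_hu = np.array([250.0, 480.0, 910.0])
print(density_to_modulus(hu_to_density(element_hu)))
```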

  2. Characteristics and evolution of the ecosystem of software tools supporting research in molecular biology.

    Science.gov (United States)

    Pazos, Florencio; Chagoyen, Monica

    2018-01-16

    Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised because of their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of these servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects of the moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server's popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Software tool for representation and processing of experimental data on high energy interactions of elementary particles

    International Nuclear Information System (INIS)

    Cherepanov, E.O.; Skachkov, N.B.

    2002-01-01

    The software tool is developed for the detailed and clear display of information about the energy and spatial distribution of secondary particles produced in elementary particle collisions. The components of the 4-momenta of the secondary particles are used as input. These data may be obtained from different parts of a physical detector (for example, from the calorimeter or the tracker) as well as with the help of an event generator. The tool is intended for use under the Windows operating system and is developed on the basis of Borland Delphi. The mathematical architecture of the software tool allows the user to receive complete information without making additional calculations. The program automatically analyses the structure and distributions of the signals and displays the results in a transparent form which allows their quick analysis. To display the information, three-dimensional graphic methods as well as colour schemes based on intuitive associations are used. (author)
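
    One typical calculation such a tool spares the user is illustrated below: a short Python sketch (not the Delphi tool itself; the particle values are invented) that derives the invariant mass of a system of secondary particles from their 4-momentum components.

```python
import numpy as np

def invariant_mass(four_momenta):
    """Invariant mass of a system of particles from their 4-momenta
    (E, px, py, pz) in natural units (c = 1):
    m^2 = (sum E)^2 - |sum p|^2."""
    p = np.asarray(four_momenta, dtype=float)
    total = p.sum(axis=0)
    m2 = total[0] ** 2 - np.sum(total[1:] ** 2)
    return np.sqrt(max(m2, 0.0))

# Two hypothetical secondary particles (GeV):
particles = [(45.0, 20.0, 15.0, 35.0),
             (50.0, -18.0, -12.0, 42.0)]
print(f"invariant mass = {invariant_mass(particles):.2f} GeV")
```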

  4. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  5. Speckle-based at-wavelength metrology of x-ray optics at Diamond Light Source

    Science.gov (United States)

    Wang, Hongchang; Zhou, Tunhe; Kashyap, Yogesh; Sawhney, Kawal

    2017-08-01

    To achieve high resolution and sensitivity on the nanometer scale, further development of X-ray optics is required. Although ex-situ metrology provides valuable information about X-ray optics, the ultimate performance of X-ray optics is critically dependent on the exact nature of the working conditions. Therefore, it is equally important to perform in-situ metrology at the optics' operating wavelength ('at-wavelength' metrology) to optimize the performance of X-ray optics and to correct and minimize the collective distortions of the upstream beamline optics, e.g. monochromator, windows, etc. A speckle-based technique has been implemented and further improved at Diamond Light Source. We have demonstrated that the angular sensitivity for measuring the slope error of an optical surface can reach an accuracy of two nanoradians. The recent development of the speckle-based at-wavelength metrology techniques will be presented. Representative examples of the applications of the speckle-based technique will also be given, including the optimization of X-ray mirrors and the characterization of compound refractive lenses. Such a high-precision metrology technique will be extremely beneficial for the manufacture and in-situ alignment/optimization of X-ray mirrors for next-generation synchrotron beamlines.
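
    The essence of the speckle-based technique, tracking the displacement of a speckle pattern and converting it to a wavefront slope, can be sketched in a few lines of Python using scikit-image's sub-pixel phase correlation. This is an illustrative toy (a single whole-image shift, synthetic speckle, hypothetical geometry), not the Diamond analysis code, which works subregion by subregion to build a slope map.

```python
import numpy as np
from skimage.registration import phase_cross_correlation

def speckle_slope(reference, distorted, pixel_size_m, propagation_dist_m):
    """Recover a wavefront slope from the speckle displacement between a
    reference image and an image taken with the optic in the beam:
    slope ~ displacement * pixel_size / propagation distance."""
    shift, _, _ = phase_cross_correlation(reference, distorted,
                                          upsample_factor=100)  # sub-pixel
    return shift * pixel_size_m / propagation_dist_m

# Synthetic speckle pattern shifted by a known amount:
rng = np.random.default_rng(2)
ref = rng.random((256, 256))
moved = np.roll(ref, (3, 1), axis=(0, 1))
print(speckle_slope(ref, moved, pixel_size_m=6.5e-6, propagation_dist_m=1.0))
```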

  6. A practical comparison of de novo genome assembly software tools for next-generation sequencing technologies.

    Directory of Open Access Journals (Sweden)

    Wenyu Zhang

    The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide information on the adaptivity of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering computational time, maximum random access memory (RAM) occupancy, assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well-suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers are more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, among which SOAPdenovo requires a complex configuration file. Our comparison study will assist researchers in selecting a well-suited assembler and offers essential information for the improvement of existing assemblers and the development of novel assemblers.

  7. SimPhospho: a software tool enabling confident phosphosite assignment.

    Science.gov (United States)

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for the accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with the goal of improving reliable phosphoproteome identification and reporting. The presented program can easily be used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with a detailed description of data analysis using SimPhospho, as well as test data, can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  8. Development of the software dead time methodology for the 4πβ-γ software coincidence system analysis program

    International Nuclear Information System (INIS)

    Toledo, Fabio de; Brancaccio, Franco; Dias, Mauro da Silva

    2009-01-01

    The Laboratorio de Metrologia Nuclear - LMN (Nuclear Metrology Laboratory) at IPEN-CNEN/SP, Sao Paulo, Brazil, developed a new Software Coincidence System (SCS) for 4πβ-γ radioisotope standardization. SCS is composed of the data acquisition hardware, for recording the coincidence data, and the coincidence data analysis program that calculates the radioactive activity of the target sample. Due to the intrinsic signal sampling characteristics of the hardware, a single saturated pulse causes multiple undesired data recordings, and pulse pileup also leads to bad recordings. As the beta counting rates are much greater than the gamma ones, owing to the high beta detection efficiency of the 4π geometry, the beta counts increase significantly because of multiple pulse recordings, resulting in a corresponding increase in the calculated activity value. In order to minimize the effect of such bad recordings, a software dead time was introduced into the coincidence analysis program, under development at LMN, discarding multiple recordings due to pulse pileup or saturation. This work presents the methodology developed to determine the optimal software dead time value for attaining better accuracy, and discusses the results, pointing to possibilities for software improvement. (author)
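
    A minimal Python sketch of the software dead time idea follows (not the LMN analysis program; the time values are invented): after each accepted event, all events closer than the dead-time value are discarded, suppressing multiple recordings of one saturated or piled-up pulse.

```python
import numpy as np

def apply_software_dead_time(timestamps, dead_time):
    """Non-extending (non-paralysable) software dead time: after each
    accepted event, discard every event closer than `dead_time`."""
    kept = []
    last = -np.inf
    for t in np.sort(np.asarray(timestamps, dtype=float)):
        if t - last >= dead_time:
            kept.append(t)
            last = t
    return np.array(kept)

# A pulse at t = 10.0 s recorded three times within 2 microseconds:
events = [10.0, 10.0000005, 10.0000012, 10.5, 11.3]
print(apply_software_dead_time(events, dead_time=2e-6))  # -> [10., 10.5, 11.3]
```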

  9. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system: system support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) that support unit and integrated testing are also explored.

  10. Updates on resources, software tools, and databases for plant proteomics in 2016-2017.

    Science.gov (United States)

    Misra, Biswapriya B

    2018-02-08

    Proteomics data processing, annotation, and analysis can often pose major hurdles in large-scale high-throughput bottom-up proteomics experiments. Given the recent rise in protein-based big datasets being generated, efforts in in silico tool development have increased at an unprecedented rate, so much so that it has become increasingly difficult to keep track of all the advances in a particular academic year. However, these tools benefit the plant proteomics community in circumventing critical issues in data analysis and visualization, and these continually developing open-source and community-developed tools hold potential for future research efforts. This review introduces and summarizes more than 50 software tools, databases, and resources developed and published during 2016-2017 under the following categories: tools for data pre-processing and analysis, statistical analysis tools, peptide identification tools, databases and spectral libraries, and data visualization and interpretation tools. Intended for a well-informed proteomics community, efforts in data archiving and validation datasets for the community are discussed as well. Additionally, the author delineates the current and most commonly used proteomics tools in order to introduce novice readers to this -omics discovery platform. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  12. Regional metrology organisations and the JCRB

    International Nuclear Information System (INIS)

    Hetherington, Paul

    2004-01-01

    In 1999, National Metrology Institutes (NMIs) from some 39 countries signed the International Committee of Weights and Measures (CIPM) Mutual Recognition Arrangement (MRA) in Paris. The MRA, drawn up by the CIPM, under the authority given to it in the Metre Convention, was in response to requirements of Governments and Regulators to provide a sound technical foundation for trade agreements. Core objectives of the MRA are to allow for the establishment of the degree of equivalence of national measurement standards and to provide for mutual recognition of calibration certificates issued by NMIs. This presentation will detail the evolution of the MRA. Globally, NMIs are affiliated to Regional Metrology Organisations (RMOs). The key role of the RMOs in the MRA process will be discussed along with the structure and objectives of the various RMOs worldwide. The Joint Committee of the RMOs and the BIPM (JCRB) plays a central part in the effective operation of the MRA. Its tasks, membership and output will also be described

  13. PREFACE: Fundamental Constants in Physics and Metrology

    Science.gov (United States)

    Klose, Volkmar; Kramer, Bernhard

    1986-01-01

    This volume contains the papers presented at the 70th PTB Seminar, the second on the subject "Fundamental Constants in Physics and Metrology", which was held at the Physikalisch-Technische Bundesanstalt in Braunschweig from October 21 to 22, 1985. About 100 participants from the universities and various research institutes of the Federal Republic of Germany attended the meeting. Besides a number of review lectures on various broader subjects there was a poster session which contained a variety of topical contributed papers ranging from the theory of the quantum Hall effect to reports on the status of the metrological experiments at the PTB. In addition, the participants were also offered the possibility to visit the PTB laboratories during the course of the seminar. During the preparation of the meeting we noticed that even most of the general subjects which were going to be discussed in the lectures are of great importance in connection with metrological experiments and should be made accessible to the scientific community. This eventually resulted in the idea of the publication of the papers in a regular journal. We are grateful to the editor of Metrologia for providing this opportunity. We have included quite a number of papers from basic physical research. For example, certain aspects of high-energy physics and quantum optics, as well as the many-faceted role of Sommerfeld's fine-structure constant, are covered. We think that questions such as "What are the intrinsic fundamental parameters of nature?" or "What are we doing when we perform an experiment?" can shed new light on the art of metrology, and do, potentially, lead to new ideas. This appears to be especially necessary when we notice the increasing importance of the role of the fundamental constants and macroscopic quantum effects for the definition and the realization of the physical units. In some cases we have reached a point where the limitations of our knowledge of a fundamental constant and

  14. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  15. The future of 2D metrology for display manufacturing

    Science.gov (United States)

    Sandstrom, Tor; Wahlsten, Mikael; Park, Youngjin

    2016-10-01

    The race to 800 PPI and higher in mobile devices and the transition to OLED displays are driving a dramatic development of mask quality: resolution, CDU, registration, and complexity. 2D metrology for large area masks is necessary and must follow the roadmap. Driving forces in the market place point to continued development of even more dense displays. State-of-the-art metrology has proven itself capable of overlay below 40 nm and registration below 65 nm for G6 masks. Future developments include incoming and recurrent measurements of pellicalized masks at the panel maker's factory site. Standardization of coordinate systems across supplier networks is feasible. This will enable better yield and production economy for both mask and panel maker. Better distortion correction methods will give better registration on the panels and relax the flatness requirements of the mask blanks. If panels are measured together with masks and the results are used to characterize the aligners, further quality and yield improvements are possible. Possible future developments include in-cell metrology and integration with other instruments in the same platform.

  16. Metrology requirements for the serial production of ELT primary mirror segments

    Science.gov (United States)

    Rees, Paul C. T.; Gray, Caroline

    2015-08-01

    The manufacture of the next generation of large astronomical telescopes, the extremely large telescopes (ELTs), requires the rapid manufacture of more than 500 hexagonal segments of 1.44 m for the primary mirror of each telescope. Both leading projects, the Thirty Meter Telescope (TMT) and the European Extremely Large Telescope (E-ELT), have set highly demanding technical requirements for each fabricated segment. These technical requirements, when combined with the anticipated construction schedule for each telescope, suggest that more than one optical fabricator will be involved in the delivery of the primary mirror segments in order to meet the project schedule. For one supplier, the technical specification is challenging and requires highly consistent control of metrology in close coordination with the polishing technologies used in order to optimize production rates. For production using multiple suppliers, however the supply chain is structured, consistent control of metrology along the supply chain will be required. This requires a broader pattern of independent verification than is the case for a single supplier. This paper outlines the metrology requirements for a single supplier throughout all stages of the fabrication process. We identify and outline those areas where metrology accuracy and duration have a significant impact on production efficiency. We use the challenging ESO E-ELT technical specification as an example of our treatment, including actual process data. We further develop this model for the case of a supply chain consisting of multiple suppliers. Here, we emphasize the need to control metrology throughout the supply chain in order to optimize net production efficiency.

  17. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  18. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  19. Software tool for resolution of inverse problems using artificial intelligence techniques: an application in neutron spectrometry

    International Nuclear Information System (INIS)

    Castaneda M, V. H.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Leon P, A. A.; Hernandez P, C. F.; Espinoza G, J. G.; Ortiz R, J. M.; Vega C, H. R.; Mendez, R.; Gallego, E.; Sousa L, M. A.

    2016-10-01

    The Taguchi methodology has proved to be highly efficient for solving inverse problems, in which the values of some parameters of the model must be obtained from the observed data. There are intrinsic mathematical characteristics that make a problem known as inverse. Inverse problems appear in many branches of science, engineering and mathematics. To solve this type of problem, researchers have used different techniques, and recently the use of techniques based on Artificial Intelligence is being explored. This paper presents the use of a software tool based on generalized regression artificial neural networks for the solution of inverse problems with application in high energy physics, specifically the problem of neutron spectrometry. To solve this problem we use a software tool developed in the Mat Lab programming environment, which provides a friendly, intuitive and easy-to-use interface. This computational tool solves the inverse problem involved in the reconstruction of the neutron spectrum based on measurements made with a Bonner spheres spectrometric system. Given this information, the neural network is able to reconstruct the neutron spectrum with high performance and generalization capability. The tool does not require great training or technical knowledge in the development and/or use of software, so it facilitates the use of the program for the resolution of inverse problems arising in several areas of knowledge. Artificial Intelligence techniques are particularly well suited to solving inverse problems, given the characteristics of artificial neural networks and their network topology; the tool developed has therefore been very useful, since the results generated by the artificial neural network require little time in comparison to other techniques and agree with the actual data of the experiment. (Author)
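
    A generalized regression neural network is essentially a Gaussian-kernel weighted average of training targets, so its core fits in a few lines. The Python sketch below (random stand-in data; not the Mat Lab tool described above) maps Bonner-sphere count rates to neutron spectrum bins.

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.3):
    """Generalized regression neural network (Nadaraya-Watson estimator):
    the prediction is a Gaussian-kernel weighted average of the training
    targets; sigma is the smoothing parameter."""
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    x_query = np.atleast_2d(np.asarray(x_query, dtype=float))
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w @ y_train / w.sum(axis=1, keepdims=True)

# Hypothetical training set: 7 sphere count rates -> 31-bin spectra.
rng = np.random.default_rng(3)
X = rng.random((50, 7))       # normalized count rates
Y = rng.random((50, 31))      # corresponding neutron spectra
print(grnn_predict(X, Y, X[0]).shape)   # -> (1, 31)
```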

  20. Free tools to model software (Herramientas libres para modelar software)

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    An observation on free software and its implications for software development processes with 4G tools, carried out by entities or individuals without astronomical capital and without the monopolizing mentality of dominating the market with expensive products that make them multimillionaires while offering no real guarantee, not even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  1. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing the software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability, achieved by using soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and employed for raising the abstraction level of program code and for mining potential flexible components. To reconstruct the software so that it adapts to a required environment, the SQM (Software Quark Model) was used to manage the interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  2. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    Becker, Janine; Zankl, Maria; Petoussi-Henss, Nina

    2007-01-01

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and location of organs of virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers that are arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an already existing voxel model is a complicated process, leading to many problems that have to be solved. To solve those intricacies in an easy way, a new software tool was developed and is presented here. If the organs are modified, no bit of tissue, i.e. voxel, may vanish nor should an extra one appear. That means that organs cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. skin. In the software tool described here, the modifications are done by semi-automatic routines but including human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to fulfil the purpose of a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
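
    The reassignment principle described in the note can be illustrated with a small Python/NumPy sketch (a toy labelled array, not the VolumeChange tool itself): an organ is enlarged by converting adjacent tissue voxels to the organ's identification number, one dilation shell at a time, so that no voxel vanishes or appears.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def grow_organ(model, organ_id, tissue_id, n_voxels):
    """Enlarge an organ by reassigning up to n_voxels of the surrounding
    tissue to the organ; the total voxel count stays constant."""
    model = model.copy()
    while n_voxels > 0:
        # Tissue voxels directly adjacent to the organ surface:
        shell = binary_dilation(model == organ_id) & (model == tissue_id)
        idx = np.argwhere(shell)
        if idx.size == 0:          # no more adjacent tissue to take over
            break
        take = idx[:min(n_voxels, len(idx))]
        model[tuple(take.T)] = organ_id
        n_voxels -= len(take)
    return model

# Toy 3D voxel model: organ 2 embedded in tissue 1.
m = np.ones((20, 20, 20), dtype=np.int16)
m[8:12, 8:12, 8:12] = 2
grown = grow_organ(m, organ_id=2, tissue_id=1, n_voxels=50)
print((m == 2).sum(), "->", (grown == 2).sum())
```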

  3. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results

    International Nuclear Information System (INIS)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M.; Schenk, A.; Bourquain, H.

    2004-01-01

    Purpose: Computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added the volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic calculation of tumor volume and the time needed for this procedure. Materials and methods: We analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected in CT were measured with the new software tool in HepaVision (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: Our first results in 16 patients show a correlation between the automatically and the manually calculated volumes (up to a difference of 2 ml) of 96.8%. While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method requires only about 30 seconds of user interaction time. Conclusion: These preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for the accurate determination of the tumor volume and can be applied in the daily clinical routine. (orig.) [de]
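
    The volumetric step itself is simple once a lesion has been segmented. A short Python sketch follows (hypothetical voxel spacing and mask, not the HepaVision implementation): the lesion volume is the voxel count times the volume of a single voxel.

```python
import numpy as np

def lesion_volume_ml(mask, voxel_spacing_mm):
    """Volume of a segmented lesion: number of segmented voxels times the
    volume of one voxel in mm^3, converted to ml (1 ml = 1000 mm^3)."""
    voxel_volume_mm3 = float(np.prod(voxel_spacing_mm))
    return mask.astype(bool).sum() * voxel_volume_mm3 / 1000.0

# Hypothetical spiral-CT segmentation: 0.7 x 0.7 mm pixels, 3 mm slices.
mask = np.zeros((40, 512, 512), dtype=np.uint8)
mask[10:20, 200:260, 220:270] = 1
print(f"{lesion_volume_ml(mask, (3.0, 0.7, 0.7)):.1f} ml")
```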

  4. Software tools for quantification of X-ray microtomography at the UGCT

    Energy Technology Data Exchange (ETDEWEB)

    Vlassenbroeck, J. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium)], E-mail: jelle.vlassenbroeck@ugent.be; Dierick, M.; Masschaele, B. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Cnudde, V. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium); Van Hoorebeke, L. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Jacobs, P. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium)

    2007-09-21

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.

  5. Metrological traceability of holmium oxide solution

    Science.gov (United States)

    Gonçalves, D. E. F.; Gomes, J. F. S.; Alvarenga, A. P. D.; Borges, P. P.; Araujo, T. O.

    2018-03-01

    A holmium oxide solution was prepared as a candidate certified reference material for the calibration of the wavelength scale of spectrophotometers. The steps necessary for evaluating the uncertainty and establishing metrological traceability in the production of this material are presented here. Preliminary results from the first produced batch are shown.

  6. Issues of Teaching Metrology in Higher Education Institutions of Civil Engineering in Russia

    Science.gov (United States)

    Pukharenko, Yurii Vladimirovich; Norin, Veniamin Aleksandrovich

    2017-01-01

    The work analyses the state of the training process in teaching the discipline "Metrology, Standardization, Certification and Quality Control." It argues that the current educational standard regarding the instruction of the discipline "Metrology, Standardization, Certification and Quality Control" does not meet the needs of the…

  7. Quantum interference metrology at deep-UV wavelengths using phase-controlled ultrashort laser pulses

    NARCIS (Netherlands)

    Zinkstok, R. Th; Witte, S.; Ubachs, W.; Hogervorst, W.; Eikema, K. S E

    2005-01-01

    High-resolution metrology at wavelengths shorter than ultraviolet is in general hampered by a limited availability of appropriate laser sources. It is demonstrated that this limitation can be overcome by quantum-interference metrology with frequency up-converted ultrafast laser pulses. The required

  8. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
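
    The essence of the E-CAT methodology, fitting a single regression to synthetic data produced by many CGE runs, can be sketched as follows in Python; the explanatory variables, coefficients and noise level are all invented stand-ins for the real simulation output.

```python
import numpy as np

# Each row: threat characteristics and background conditions for one
# simulated scenario (all names and values hypothetical); y holds the
# CGE-model economic losses computed for those scenarios.
rng = np.random.default_rng(4)
n = 200
magnitude = rng.uniform(1.0, 10.0, n)        # e.g. severity index
duration = rng.uniform(1.0, 52.0, n)         # weeks of disruption
resilience = rng.uniform(0.0, 1.0, n)        # behavioral/resilience factor
X = np.column_stack([np.ones(n), magnitude, duration, resilience])
y = (0.5 + 2.1 * magnitude + 0.3 * duration - 4.0 * resilience
     + rng.normal(0.0, 0.5, n))              # synthetic CGE output + noise

# "Reduced form": a single regression fitted to the synthetic CGE runs,
# giving a rapid-turnaround approximation of the full model.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(beta, 2))
```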

  9. Metrology for ITER Assembly

    International Nuclear Information System (INIS)

    Bogusch, E.

    2006-01-01

    The overall dimensions of the ITER Tokamak and the particular assembly sequence preclude the use of the conventional optical metrology, mechanical jigs and traditional dimensional control equipment used for the assembly of smaller, previous-generation fusion devices. This paper describes the state of the art of the capabilities of available metrology systems, with reference to previous experience in fusion engineering and in other industries. Two complementary procedures for transferring datums from the primary datum network on the bioshield to the secondary datums inside the VV with the desired accuracy of about 0.1 mm are described: one method uses access directly through the ports, and the other uses transfer techniques developed during the co-operation with ITER/EFDA. Another important task described is the development of a method for the rapid and easy measurement of the gaps between sectors, required for the production of the customised splice plates between them. The scope of the paper includes the evaluation of the composition and cost of the systems and of the team of technical staff required to meet the requirements of the assembly procedure. The results of a practical, full-scale demonstration of the methodologies, using the proposed equipment, are described. This work has demonstrated the feasibility of achieving the necessary accuracies for the successful building of ITER. (author)

  10. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    Science.gov (United States)

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows is currently gaining popularity, and several software tools have recently been published or are commercially available. The present study evaluates three software packages (Progenesis, synapter, and ISOQuant) supporting ion-mobility-enhanced data-independent acquisition data. To benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of the applied algorithms (retention time alignment, clustering, normalization, etc.) on the quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
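
    The reproducibility metric quoted above, the median coefficient of variation (CV) of protein abundances across technical replicates, is straightforward to compute. A minimal sketch follows; the abundance matrix is synthetic, not data from the study.

```python
# Minimal sketch of the reproducibility metric used in such benchmarks:
# per-protein coefficient of variation across technical replicates,
# summarized by the median. The data and layout here are invented.
import numpy as np

# rows = proteins, columns = technical-replicate abundances (synthetic)
rng = np.random.default_rng(1)
abundances = np.abs(rng.normal(1000, 50, size=(2000, 4)))

cv = abundances.std(axis=1, ddof=1) / abundances.mean(axis=1)
print(f"median CV: {np.median(cv) * 100:.1f}%")  # the study reports <5% for the best tools
```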

  11. Metrology of radiation protection. Pt. 1. Physical requirements and terminology

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, S R

    1979-10-01

    Starting from a general consideration of the needs of radiation protection, the physical requirements of a relevant metrology are developed. The expedient physical quantities are introduced, and problems in the realization and dissemination of their units are discussed. It is shown that, owing to these difficulties, derived or operational quantities have to be developed for the construction and calibration of practical measuring instruments. Finally, the relations between the metrology of radiation protection and that of medical radiology are pointed out and commented on. (orig.)

  12. Forum metrology 2009: control of optics, targets and optical analyzers

    International Nuclear Information System (INIS)

    Desenne, D.; Andre, R.

    2010-01-01

    The first 'Forum Metrologie' of the CEA/DAM was held at the 'Institut Laser et Plasma' on December 9, 2009, close to the 'Centre d'etudes Scientifiques et Techniques d'Aquitaine'. It was organized by the 'Departement Lasers de Puissance'. The chosen theme was metrology around laser experiments, with a special focus on the metrology of the dedicated optics, targets and optical analyzers. The talks showed the progress and the difficulties in each of these fields. (authors)

  13. Software Product Manager: A Mechanism to manage software products in small and medium ISVs

    NARCIS (Netherlands)

    Katchow, R.; van de Weerd, I.; Brinkkemper, S.; Rooswinkel, A.

    2009-01-01

    In this paper, we present SP Manager as an innovative tool for managing software products in small and medium independent software vendors (ISVs). This tool incorporates the operational software product management (SPM) processes focused on requirements management and release planning. By using

  14. Speckle-based at-wavelength metrology of X-ray mirrors with super accuracy

    International Nuclear Information System (INIS)

    Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal

    2016-01-01

    X-ray active mirrors, such as bimorph and mechanically bendable mirrors, are increasingly being used on beamlines at modern synchrotron facilities to generate either focused or “top-hat” beams. In addition to optical tests in the metrology lab, it is becoming increasingly important to optimise and characterise active optics under actual beamline operating conditions. The recently developed X-ray speckle-based at-wavelength metrology technique has shown great potential here; it has been established and further developed at the Diamond Light Source and is increasingly being used to optimise active mirrors. Details of the technique and an example of its applicability in characterising and optimising a micro-focusing bimorph X-ray mirror are presented. Importantly, an unprecedented angular sensitivity of around two nanoradians for measuring the slope error of an optical surface has been demonstrated. Such a high-precision metrology technique will benefit the manufacturers of polished mirrors as well as the optimisation of beam shaping during experiments.
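
    The principle behind speckle-based at-wavelength metrology can be sketched briefly: the transverse shift of the speckle pattern between two detector images, divided by the propagation distance, yields the local wavefront slope. The 1D toy below illustrates only this tracking step; the geometry values are assumptions, not the Diamond Light Source setup, and real implementations use 2D subpixel correlation.

```python
# Toy illustration of speckle tracking: recover the speckle shift by
# cross-correlation, then convert it to a wavefront slope. The pixel
# size and propagation distance below are assumed example values.
import numpy as np
from scipy.signal import correlate

pixel_size = 0.65e-6   # detector pixel size [m] (assumed)
distance = 0.5         # diffuser-to-detector distance [m] (assumed)

rng = np.random.default_rng(0)
reference = rng.random(512)          # 1D reference speckle trace
true_shift = 3                       # pixels
distorted = np.roll(reference, true_shift)

# Locate the shift at the cross-correlation peak (subpixel refinement omitted)
corr = correlate(distorted, reference, mode="full")
shift_px = corr.argmax() - (len(reference) - 1)

slope = shift_px * pixel_size / distance   # local wavefront slope [rad]
print(f"recovered shift: {shift_px} px, slope: {slope:.2e} rad")
```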

  15. Speckle-based at-wavelength metrology of X-ray mirrors with super accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal, E-mail: kawal.sawhney@diamond.ac.uk [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom)

    2016-05-15

    X-ray active mirrors, such as bimorph and mechanically bendable mirrors, are increasingly being used on beamlines at modern synchrotron facilities to generate either focused or “top-hat” beams. In addition to optical tests in the metrology lab, it is becoming increasingly important to optimise and characterise active optics under actual beamline operating conditions. The recently developed X-ray speckle-based at-wavelength metrology technique has shown great potential here; it has been established and further developed at the Diamond Light Source and is increasingly being used to optimise active mirrors. Details of the technique and an example of its applicability in characterising and optimising a micro-focusing bimorph X-ray mirror are presented. Importantly, an unprecedented angular sensitivity of around two nanoradians for measuring the slope error of an optical surface has been demonstrated. Such a high-precision metrology technique will benefit the manufacturers of polished mirrors as well as the optimisation of beam shaping during experiments.

  16. Holistic metrology qualification extension and its application to characterize overlay targets with asymmetric effects

    Science.gov (United States)

    Dos Santos Ferreira, Olavio; Sadat Gousheh, Reza; Visser, Bart; Lie, Kenrick; Teuwen, Rachel; Izikson, Pavel; Grzela, Grzegorz; Mokaberi, Babak; Zhou, Steve; Smith, Justin; Husain, Danish; Mandoy, Ram S.; Olvera, Raul

    2018-03-01

    The ever increasing need for tighter on-product overlay (OPO), as well as for enhanced accuracy in overlay metrology and methodology, is driving the semiconductor industry's technologists to innovate new approaches to OPO measurements. In High Volume Manufacturing (HVM) fabs, it is often critical to strive for both accuracy and robustness. Robustness, in particular, can be challenging in metrology, since overlay targets can be impacted by the proximity of other structures next to the overlay target (asymmetric effects) as well as by symmetric stack changes such as photoresist height variations. Both symmetric and asymmetric contributors affect robustness. Furthermore, tweaking or optimizing wafer processing parameters for maximum yield may have an adverse effect on physical target integrity. As a result, measuring and monitoring physical changes or process abnormalities/artefacts in terms of new Key Performance Indicators (KPIs) is crucial for the end goal of minimizing true in-die overlay of the integrated circuits (ICs). IC manufacturing fabs often relied on CD-SEM in the past to capture true in-die overlay. Due to the destructive and intrusive nature of CD-SEM on certain materials, it is desirable to characterize asymmetry effects for overlay targets via inline KPIs utilizing YieldStar (YS) metrology tools. These KPIs can also be integrated as part of μDBO target evaluation and selection for the final recipe flow. In this publication, the Holistic Metrology Qualification (HMQ) flow was extended to account for process-induced (asymmetric) effects such as Grating Imbalance (GI) and Bottom Grating Asymmetry (BGA). Local GI typically contributes to the intrafield OPO, whereas BGA typically impacts the interfield OPO, predominantly at the wafer edge. Stack height variations strongly impact overlay metrology accuracy, in particular in the case of a multi-layer Litho-Etch Litho-Etch (LELE) overlay control scheme. Introducing a GI-impact-on-overlay (in nm) KPI check quantifies the
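
    As a rough illustration of what an inline KPI check of this kind might look like, the sketch below computes a normalized intensity imbalance between the two biased pads of a hypothetical μDBO target. The KPI definition, the threshold and the data here are invented for illustration and do not reproduce the HMQ or YieldStar implementation.

```python
# Heavily hedged sketch of a grating-imbalance-style KPI: compare the mean
# signal of the two biased pads of a target; a large normalized difference
# flags process-induced asymmetry. Definition and threshold are invented.
import numpy as np

def grating_imbalance_kpi(pad_plus: np.ndarray, pad_minus: np.ndarray) -> float:
    """Normalized mean-intensity difference between the +d and -d biased pads."""
    i_plus, i_minus = pad_plus.mean(), pad_minus.mean()
    return abs(i_plus - i_minus) / ((i_plus + i_minus) / 2)

rng = np.random.default_rng(7)
pad_plus = rng.normal(100.0, 1.0, size=(64, 64))   # synthetic pad images
pad_minus = rng.normal(97.0, 1.0, size=(64, 64))

kpi = grating_imbalance_kpi(pad_plus, pad_minus)
print(f"GI KPI: {kpi:.3f}", "(flag)" if kpi > 0.02 else "(ok)")  # arbitrary threshold
```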

  17. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research strives to promote the wellbeing and health status of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process, starting with target identification, validation and lead optimization; this is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization, to remove bias, and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and the electronic case report form (eCRF) is used to store the data. eClinical and Oracle Clinical are software packages used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.
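
    As a toy illustration of the randomization step mentioned above, the sketch below implements permuted-block randomization for a two-arm trial. The block size and arm labels are arbitrary choices; real trial software adds stratification, allocation concealment and audit trails.

```python
# Toy permuted-block randomization: subjects are assigned to two arms in
# shuffled blocks, keeping the 1:1 ratio balanced throughout enrollment.
import random

def block_randomize(n_subjects: int, block_size: int = 4) -> list[str]:
    """Assign subjects to 'treatment'/'control' in shuffled balanced blocks."""
    assignments: list[str] = []
    while len(assignments) < n_subjects:
        block = ["treatment"] * (block_size // 2) + ["control"] * (block_size // 2)
        random.shuffle(block)
        assignments.extend(block)
    return assignments[:n_subjects]

print(block_randomize(10))
```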

  18. Trapped atomic ions for quantum-limited metrology

    Science.gov (United States)

    Wineland, David

    2017-04-01

    Laser-beam-manipulated trapped ions are a candidate for large-scale quantum information processing and quantum simulation but the basic techniques used can also be applied to quantum-limited metrology and sensing. Some examples being explored at NIST are: 1) As charged harmonic oscillators, trapped ions can be used to sense electric fields; this can be used to characterize the electrode-surface-based noisy electric fields that compromise logic-gate fidelities and may eventually be used as a tool in surface science. 2) Since typical qubit logic gates depend on state-dependent forces, we can adapt the gate dynamics to sensitively detect additional forces. 3) We can use extensions of Bell inequality measurements to further restrict the degree of local realism possessed by Bell states. 4) We also briefly describe experiments for creation of Bell states using Hilbert space engineering. This work is a joint effort including the Ion-Storage group, the Quantum processing group, and the Computing and Communications Theory group at NIST, Boulder. Supported by IARPA, ONR, and the NIST Quantum Information Program.

  19. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  20. Optimal adaptive control for quantum metrology with time-dependent Hamiltonians

    Science.gov (United States)

    Pang, Shengshi; Jordan, Andrew N.

    2017-01-01

    Quantum metrology has been studied for a wide range of systems with time-independent Hamiltonians. For systems with time-dependent Hamiltonians, however, little is known about quantum metrology due to the complexity of the dynamics. Here we investigate quantum metrology with time-dependent Hamiltonians to bridge this gap. We obtain the optimal quantum Fisher information for parameters in time-dependent Hamiltonians, and show that proper Hamiltonian control is generally necessary to optimize the Fisher information. We derive the optimal Hamiltonian control, which is generally adaptive, and the measurement scheme that attains the optimal Fisher information. In a minimal example of a qubit in a rotating magnetic field, we find the surprising result that the fundamental limit of T^2 time scaling of the quantum Fisher information can be broken with time-dependent Hamiltonians, reaching T^4 in estimating the rotation frequency of the field. We conclude by considering level crossings in the derivatives of the Hamiltonians, and point out that additional control is necessary in that case. PMID:28276428
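
    The T^4 claim can be connected to the maximal quantum Fisher information derived in this line of work; the sketch below states the bound in notation assumed from the general literature, with the qubit example given as a plausibility argument rather than the paper's full derivation.

```latex
% Hedged sketch of the scaling argument behind the T^4 claim (notation
% assumed). For a parameter g in a time-dependent Hamiltonian H_g(t),
% the maximal quantum Fisher information achievable with control is
\[
  \mathcal{F}_{\max}(T)
  \;=\;
  \left[ \int_0^T \bigl( \mu_{\max}(t) - \mu_{\min}(t) \bigr)\,\mathrm{d}t \right]^{2},
\]
% where \mu_{\max}(t) and \mu_{\min}(t) are the extreme eigenvalues of
% \partial_g H_g(t). For a qubit in a field rotating at frequency \omega,
% \partial_\omega H grows linearly with t, so the eigenvalue gap scales as t,
% the integral as T^2, and \mathcal{F}_{\max} as T^4, beating the T^2 limit
% of time-independent Hamiltonians.
```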