WorldWideScience

Sample records for complex surfaces verification

  1. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts for such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks of the subsystems can then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
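The allocation logic this abstract describes can be illustrated with a toy sketch. Everything below (subsystem names, probabilities, the loss constant, and the greedy allocation rule) is invented for illustration and is not the paper's actual formulation: verification risk is treated as the expected loss from undetected subsystem failures, and examination effort is repeatedly allocated to the subsystem with the largest marginal risk reduction until a stop criterion is reached.

```python
# Illustrative sketch, not the paper's formulation: for a series system,
# verification risk = loss * P(at least one subsystem failure goes undetected).
# Each examination of a subsystem multiplies its undetected-failure
# probability by (1 - detection efficiency).

LOSS = 100.0  # consequence of an undetected system failure (hypothetical units)

def verification_risk(p_undetected):
    """Expected loss from undetected failures in a series system."""
    p_all_ok = 1.0
    for p in p_undetected.values():
        p_all_ok *= (1.0 - p)
    return LOSS * (1.0 - p_all_ok)

def allocate(p_fail, detect_eff, stop_risk):
    """Greedily examine the subsystem with the largest marginal risk
    reduction until the residual risk falls below the stop criterion."""
    residual = dict(p_fail)
    sequence = []
    while verification_risk(residual) > stop_risk:
        gains = {}
        for s in residual:
            trial = dict(residual)
            trial[s] = residual[s] * (1.0 - detect_eff[s])
            gains[s] = verification_risk(residual) - verification_risk(trial)
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        residual[best] *= (1.0 - detect_eff[best])
        sequence.append(best)
    return sequence, verification_risk(residual)

seq, risk = allocate(
    p_fail={"propulsion": 0.02, "steering": 0.01, "ballast": 0.005},
    detect_eff={"propulsion": 0.9, "steering": 0.8, "ballast": 0.7},
    stop_risk=0.5,
)
print(seq, round(risk, 3))
```

A greedy rule of this kind reproduces the qualitative behaviour the abstract describes: examination effort concentrates on the subsystems with the highest marginal verification risk, and the resulting sequence of examinations reaches the stop criterion.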

  2. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

Formal verification has become a recommended practice in safety-critical application areas. However, due to the complexity of practical control and safety systems, state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property-preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
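The Cone of Influence reduction mentioned above has a simple core: variables that cannot influence the property's variables, directly or transitively, are removed before model checking. A minimal sketch follows; the dependency graph is invented, and a real COI reduction on PLC code would be derived from the program's variable assignments rather than written by hand:

```python
# Hedged sketch of the COI idea: keep only the variables backward-reachable
# from the property's variables in the dependency graph.

def cone_of_influence(deps, property_vars):
    """deps[v] = set of variables whose current values v's next value reads.
    Returns the set of variables that can influence the property."""
    cone, frontier = set(property_vars), list(property_vars)
    while frontier:
        v = frontier.pop()
        for u in deps.get(v, ()):
            if u not in cone:
                cone.add(u)
                frontier.append(u)
    return cone

# invented dependency graph for illustration
deps = {
    "alarm": {"sensor", "mode"},
    "sensor": {"input"},
    "mode": {"mode"},
    "display": {"alarm", "clock"},  # display and clock are outside the cone
    "clock": {"clock"},
}
cone = cone_of_influence(deps, {"alarm"})
print(sorted(cone))  # -> ['alarm', 'input', 'mode', 'sensor']
```

For a property over `alarm`, the `display` and `clock` variables drop out of the model entirely, which is where the state-space reduction comes from.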

  3. Verification and application of beam steering Phased Array UT technique for complex structures

    International Nuclear Information System (INIS)

    Yamamoto, Setsu; Miura, Takahiro; Semboshi, Jun; Ochiai, Makoto; Mitsuhashi, Tadahiro; Adachi, Hiroyuki; Yamamoto, Satoshi

    2013-01-01

Phased Array Ultrasonic Testing (PAUT) techniques for complex geometries are progressing rapidly. We developed an immersion PAUT suitable for complex surface profiles such as nozzles and deformed welded areas. Furthermore, we have developed a shape-adaptive beam steering technique for 3D complex surface structures, using a conventional array probe and a flexible coupling gel that makes the immersion beam-forming technique usable under dry conditions. The system consists of three steps: Step 1 is surface profile measurement based on the 3D Synthetic Aperture Focusing Technique (SAFT); Step 2 is delay law calculation, which takes into account the measured 3D surface profiles and steers a shape-adjusted ultrasonic beam; Step 3 is shape-adjusted B-scope construction. In this paper, verification results for this PAUT system are presented using an R60 curved specimen and a nozzle-shaped specimen simulating an actual BWR structure. (author)
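The Step 2 delay-law idea can be sketched in a simplified 2D geometry. The coordinates, sound speeds, and the Fermat-style ray search below are illustrative assumptions, not the authors' implementation:

```python
# Illustrative delay-law sketch: for each array element, find the fastest
# ray to the focal point through the measured surface profile (Fermat's
# principle, brute-forced over sampled surface points), then set firing
# delays so all contributions arrive at the focus in phase.
import math

C_GEL, C_STEEL = 1500.0, 5900.0  # assumed sound speeds in gel / steel, m/s

def travel_time(element, focus, surface_points):
    """Minimum travel time via any sampled surface point."""
    return min(
        math.dist(element, p) / C_GEL + math.dist(p, focus) / C_STEEL
        for p in surface_points
    )

def delay_law(elements, focus, surface_points):
    """Delays (s) so pulses from all elements reach the focus together."""
    tofs = [travel_time(e, focus, surface_points) for e in elements]
    t_max = max(tofs)
    return [t_max - t for t in tofs]

# 8 elements at 1 mm pitch above a mildly curved surface (coordinates in m)
elements = [(i * 1e-3, 10e-3) for i in range(8)]
surface = [(x * 1e-4, 0.5e-3 * math.sin(40 * x * 1e-4)) for x in range(-50, 121)]
delays = delay_law(elements, focus=(3.5e-3, -15e-3), surface_points=surface)
print([round(d * 1e9) for d in delays])  # delays in nanoseconds
```

The slowest element fires first (zero delay is assigned to the longest time of flight), which is the usual convention for phased-array delay laws.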

  4. Fluence complexity for IMRT field and simplification of IMRT verification

    International Nuclear Information System (INIS)

    Hanushova, Tereza; Vondarchek, Vladimir

    2013-01-01

    Intensity Modulated Radiation Therapy (IMRT) requires dosimetric verification of each patient’s plan, which is time consuming. This work deals with the idea of minimizing the number of fields for control, or even replacing plan verification by machine quality assurance (QA). We propose methods for estimation of fluence complexity in an IMRT field based on dose gradients and investigate the relation between results of gamma analysis and this quantity. If there is a relation, it might be possible to only verify the most complex field of a plan. We determine the average fluence complexity in clinical fields and design a test fluence corresponding to this amount of complexity which might be used in daily QA and potentially replace patient-related verification. Its applicability is assessed in clinical practice. The relation between fluence complexity and results of gamma analysis has been confirmed for plans but not for single fields. There is an agreement between the suggested test fluence and clinical fields in the average gamma parameter. A critical value of average gamma has been specified for the test fluence as a criterion for distinguishing between poorly and well deliverable plans. It will not be possible to only verify the most complex field of a plan but verification of individual plans could be replaced by a morning check of the suggested test fluence, together with a well-established set of QA tests. (Author)
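A gradient-based complexity score of the kind this abstract describes might look like the following sketch. The authors' exact definition is not reproduced here; summing neighbouring-pixel differences and normalising by total fluence is an assumption made for illustration:

```python
# Hedged sketch of a fluence-complexity metric: total absolute fluence
# difference between neighbouring pixels, normalised by total fluence.
# A uniform field scores 0; heavily modulated fields score high.

def fluence_complexity(fluence):
    """fluence: 2D list of per-pixel intensities for one IMRT field."""
    rows, cols = len(fluence), len(fluence[0])
    grad = 0.0
    for i in range(rows):
        for j in range(cols):
            if i + 1 < rows:
                grad += abs(fluence[i + 1][j] - fluence[i][j])
            if j + 1 < cols:
                grad += abs(fluence[i][j + 1] - fluence[i][j])
    total = sum(map(sum, fluence))
    return grad / total if total else 0.0

flat = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]       # uniform, no modulation
modulated = [[0, 5, 0], [5, 0, 5], [0, 5, 0]]  # checkerboard-like modulation
print(fluence_complexity(flat), fluence_complexity(modulated))  # 0.0 3.0
```

Ranking fields of a plan by such a score is what would let one pick out "the most complex field" the abstract refers to.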

  5. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  6. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

Schuler, C.; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at rendering surface contamination measurements simply and safely interpretable. Essential to the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate representing the activity per unit area of a surface contamination of one guideline value can be calculated for any radionuclide using instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate.
Guideline value count rates for groups of radionuclides can be represented within the maximum
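The count-rate relation listed in this abstract (guideline value, averaging area, source efficiency, instrument efficiency) can be sketched numerically. All numbers below are invented for illustration and are not values from the Swiss Radiation Protection Ordinance:

```python
# Hedged numerical sketch: the count rate an SCM should indicate for a
# surface contaminated at exactly one guideline value, chained as
# activity -> emitted particles -> particles leaving the surface -> counts.

def guideline_value_count_rate(guideline_value_bq_cm2,
                               emission_probability,
                               source_efficiency,
                               instrument_efficiency,
                               probe_area_cm2=100.0):
    """Count rate (s^-1) for a contamination of one guideline value,
    averaged over the 100 cm^2 area the procedure prescribes."""
    activity_bq = guideline_value_bq_cm2 * probe_area_cm2
    emission_rate = activity_bq * emission_probability
    surface_emission_rate = emission_rate * source_efficiency
    return surface_emission_rate * instrument_efficiency

# hypothetical beta emitter with a guideline value of 3 Bq/cm^2
rate = guideline_value_count_rate(
    guideline_value_bq_cm2=3.0,
    emission_probability=1.0,   # betas per decay (assumed)
    source_efficiency=0.5,      # fraction emitted into the upper hemisphere
    instrument_efficiency=0.3,  # counts per particle reaching the window
)
print(rate)  # -> 45.0
```

The same chain, run in reverse against a measured count rate, is what lets an SCM reading be compared directly against the guideline value.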

  7. Uncertainty analysis of point-by-point sampling complex surfaces using touch probe CMMs: DOE for complex surfaces verification with CMM

    DEFF Research Database (Denmark)

    Barini, Emanuele Modesto; Tosello, Guido; De Chiffre, Leonardo

    2010-01-01

    The paper describes a study concerning point-by-point sampling of complex surfaces using tactile CMMs. A four factor, two level completely randomized factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, co...

  8. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation, called the Land surface Verification Toolkit (LVT), are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
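The traditional accuracy-based measures the abstract mentions are straightforward to state. A small sketch with invented soil-moisture numbers follows; this is not LVT's code, only the standard definitions of bias, RMSE, and correlation that such toolkits compute:

```python
# Standard model-vs-observation accuracy measures, with toy data.
import math

def bias(model, obs):
    """Mean model-minus-observation difference."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def correlation(model, obs):
    """Pearson correlation coefficient."""
    n = len(obs)
    mm, mo = sum(model) / n, sum(obs) / n
    num = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    den = math.sqrt(sum((m - mm) ** 2 for m in model) *
                    sum((o - mo) ** 2 for o in obs))
    return num / den

# invented daily surface soil-moisture values (volumetric fraction)
model = [0.30, 0.28, 0.25, 0.27, 0.33]
obs   = [0.28, 0.27, 0.24, 0.28, 0.30]
print(round(bias(model, obs), 3), round(rmse(model, obs), 3),
      round(correlation(model, obs), 2))
```

The uncertainty, ensemble, and information-theory diagnostics the abstract lists build on exactly this kind of paired model/observation time series.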

  9. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation, called the Land surface Verification Toolkit (LVT), are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  10. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    Science.gov (United States)

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm2, which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.
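The SERS-plus-PCA identification step this abstract combines can be sketched with synthetic spectra. The tiny four-channel vectors and compound names below are invented; real SERS spectra span hundreds of wavenumber channels, and this is not the authors' analysis code:

```python
# Toy sketch: project spectra onto leading principal components and assign
# an unknown swab spectrum to the nearest reference compound.
import numpy as np

def pca_fit(X, n_components=2):
    """Return the data mean and the top principal components (via SVD)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_project(X, mean, components):
    """Project (centered) spectra onto the principal components."""
    return (X - mean) @ components.T

# invented reference spectra for two compounds (rows = replicates)
ref = np.array([[1.0, 0.1, 0.0, 0.9],
                [0.9, 0.2, 0.1, 1.0],
                [0.0, 1.0, 0.9, 0.1],
                [0.1, 0.9, 1.0, 0.0]])
labels = ["drug_A", "drug_A", "excipient", "excipient"]

mean, comps = pca_fit(ref)
scores = pca_project(ref, mean, comps)

unknown = np.array([0.95, 0.15, 0.05, 0.92])  # spectrum from a swab extract
u = pca_project(unknown, mean, comps)
nearest = labels[int(np.argmin(np.linalg.norm(scores - u, axis=1)))]
print(nearest)  # -> drug_A
```

A nearest-neighbour call in PCA score space is one simple way PCA supports the identification of recovered drug compounds; the paper's actual chemometric treatment may differ.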

  11. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Russian Academy of Sciences, Institute of Solid State Physics (Russian Federation)

    2016-11-15

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.
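The notion of counting key-verification steps can be made concrete with a toy exhaustive search. The bit strings below are invented; the point is only that the number of verification steps for an n-bit key is bounded between 1 and 2^n, matching the minimum/maximum bounds the abstract refers to:

```python
# Toy sketch of "complete verification of keys" for a one-time pad:
# exhaustively test candidate keys against a known plaintext/ciphertext
# pair, counting steps until the actual key is identified.
from itertools import product

def verify_keys(plaintext, ciphertext):
    """Return the matching key and the number of verification steps taken."""
    steps = 0
    for candidate in product([0, 1], repeat=len(plaintext)):
        steps += 1
        if all(p ^ k == c for p, k, c in zip(plaintext, candidate, ciphertext)):
            return candidate, steps
    return None, steps

plaintext  = (1, 0, 1)
key        = (0, 1, 1)
ciphertext = tuple(p ^ k for p, k in zip(plaintext, key))

found, steps = verify_keys(plaintext, ciphertext)
print(found, steps)  # best case 1 step, worst case 2**n steps for n bits
```

For a one-time pad the matching key is unique given a plaintext/ciphertext pair, so the step count depends only on where the key falls in the enumeration order.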

  12. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    International Nuclear Information System (INIS)

    Molotkov, S. N.

    2016-01-01

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.

  13. Inspection and verification of waste packages for near surface disposal

    International Nuclear Information System (INIS)

    2000-01-01

    Extensive experience has been gained with various disposal options for low and intermediate level waste at or near surface disposal facilities. Near surface disposal is based on proven and well demonstrated technologies. To ensure the safety of near surface disposal facilities when available technologies are applied, it is necessary to control and assure the quality of the repository system's performance, which includes waste packages, engineered features and natural barriers, as well as siting, design, construction, operation, closure and institutional controls. Recognizing the importance of repository performance, the IAEA is producing a set of technical publications on quality assurance and quality control (QA/QC) for waste disposal to provide Member States with technical guidance and current information. These publications cover issues on the application of QA/QC programmes to waste disposal, long term record management, and specific QA/QC aspects of waste packaging, repository design and R and D. Waste package QA/QC is especially important because the package is the primary barrier to radionuclide release from a disposal facility. Waste packaging also involves interface issues between the waste generator and the disposal facility operator. Waste should be packaged by generators to meet waste acceptance requirements set for a repository or disposal system. However, it is essential that the disposal facility operator ensure that waste packages conform with disposal facility acceptance requirements. Demonstration of conformance with disposal facility acceptance requirements can be achieved through the systematic inspection and verification of waste packages at both the waste generator's site and at the disposal facility, based on a waste package QA/QC programme established by the waste generator and approved by the disposal operator. However, strategies, approaches and the scope of inspection and verification will be somewhat different from country to country

  14. Automation bias and verification complexity: a systematic review.

    Science.gov (United States)

    Lyell, David; Coiera, Enrico

    2017-03-01

While potentially reducing decision errors, decision support systems can introduce new types of errors. Automation bias (AB) happens when users become overreliant on decision support, which reduces vigilance in information seeking and processing. Most research originates from the human factors literature, where the prevailing view is that AB occurs only in multitasking environments. This review seeks to compare the human factors and health care literature, focusing on the apparent association of AB with multitasking and task complexity. EMBASE, Medline, Compendex, Inspec, IEEE Xplore, Scopus, Web of Science, PsycINFO, and Business Source Premier from 1983 to 2015. Evaluation studies where task execution was assisted by automation and resulted in errors were included. Participants needed to be able to verify automation correctness and perform the task manually. Tasks were identified and grouped. Task and automation type and presence of multitasking were noted. Each task was rated for its verification complexity. Of 890 papers identified, 40 met the inclusion criteria; 6 were in health care. Contrary to the prevailing human factors view, AB was found in single tasks, typically involving diagnosis rather than monitoring, and with high verification complexity. The literature is fragmented, with large discrepancies in how AB is reported. Few studies reported the statistical significance of AB compared to a control condition. AB appears to be associated with the degree of cognitive load experienced in decision tasks, and appears to not be uniquely associated with multitasking. Strategies to minimize AB might focus on cognitive load reduction. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  16. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) to evaluate waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  17. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  18. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure of whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis ... then been investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guaranties on power curves in complex terrain; investors and bankers experience with verification of power curves; power performance in relation to regional correction curves for Denmark...

  19. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty...

  20. The virtual product-process design laboratory to manage the complexity in the verification of formulated products

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Malik, Tahir I.

    2011-01-01

... mixtures need to be predicted. This complexity has to be managed through decomposition of the problem into sub-problems. Each sub-problem is solved and analyzed and, from the knowledge gained, an overall evaluation of the complex chemical system representing the product is made. The virtual Product-Process Design laboratory (virtual PPD-lab) software is based on this decomposition strategy for the design of formulated liquid products. When the needed models are available in the software, the solution of formulation design/verification problems is straightforward, while when models are not available in the software library, they need to be developed and/or implemented. The potential of the virtual PPD-lab in managing the complexity in the verification of formulated products, after the needed models have been developed and implemented, is highlighted in this paper through a case study from industry dealing...

  1. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  2. One-level modeling for diagnosing surface winds over complex terrain. II - Applicability to short-range forecasting

    Science.gov (United States)

    Alpert, P.; Getenio, B.; Zak-Rosenthal, R.

    1988-01-01

    The Alpert and Getenio (1988) modification of the Mass and Dempsey (1985) one-level sigma-surface model was used to study four synoptic events that included two winter cases (a Cyprus low and a Siberian high) and two summer cases. Results of statistical verification showed that the model is not only capable of diagnosing many details of surface mesoscale flow, but might also be useful for various applications which require operative short-range prediction of the diurnal changes of high-resolution surface flow over complex terrain, for example, in locating wildland fires, determining the dispersion of air pollutants, and predicting changes in wind energy or of surface wind for low-level air flights.

  3. Advanced verification topics

    CERN Document Server

Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  4. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  5. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  6. Patient set-up verification by infrared optical localization and body surface sensing in breast radiation therapy

    International Nuclear Information System (INIS)

    Spadea, Maria Francesca; Baroni, Guido; Riboldi, Marco; Orecchia, Roberto; Pedotti, Antonio; Tagaste, Barbara; Garibaldi, Cristina

    2006-01-01

    Background and purpose: The aim of the study was to investigate the clinical application of a technique for patient set-up verification in breast cancer radiotherapy, based on the 3D localization of a hybrid configuration of surface control points. Materials and methods: An infrared optical tracker provided the 3D position of two passive markers and 10 laser spots placed around and within the irradiation field on nine patients. A fast iterative constrained minimization procedure was applied to detect and compensate patient set-up errors, through registration of the control points with reference data coming from the treatment plan (marker reference positions, CT-based surface model). Results: The application of the corrective spatial transformation estimated by the registration procedure led to significant improvement of patient set-up. The median value of 3D errors affecting three additional verification markers within the irradiation field decreased from 5.7 to 3.5 mm. Error variability (25-75%) decreased from 3.2 to 2.1 mm. Laser spot registration on the reference surface model was documented to contribute substantially to set-up error compensation. Conclusions: Patient set-up verification through a hybrid set of control points and a constrained surface minimization algorithm was confirmed to be feasible in clinical practice and to provide valuable information for improving the quality of patient set-up, with minimal requirement of operator-dependent procedures. The technique conveniently combines the advantages of passive-marker-based methods and surface registration techniques, by featuring immediate and robust estimation of the set-up accuracy from a redundant dataset.
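The abstract does not detail the constrained minimization used for control-point registration; as a generic illustration of the underlying alignment step, the following sketch applies the standard Kabsch algorithm to register a measured point cloud onto its reference positions. All coordinates and the simulated set-up error are hypothetical.

```python
import numpy as np

def rigid_register(P, Q):
    """Kabsch algorithm: find rotation R and translation t that best map
    measured control points P onto reference points Q (both n x 3),
    minimizing the sum of squared distances."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # centre both clouds
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)               # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Example: recover a known set-up error (5 degree roll plus a shift).
ref = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10], [10, 10, 5.0]])
a = np.deg2rad(5.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1.0]])
measured = ref @ Rz.T + np.array([3.0, -2.0, 1.0])    # simulated mis-set-up
R, t = rigid_register(measured, ref)
residual = np.linalg.norm(measured @ R.T + t - ref)   # ~0 for noise-free points
```

The recovered transformation maps the measured marker positions back onto the planning reference; in the clinical setting described above, the corresponding couch correction would be derived from R and t.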

  7. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    Science.gov (United States)

    To Duc, Khanh

    2017-11-18

    Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to distorted conclusions about diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A Shiny web application for the verification bias correction in ROC surface estimation has also been developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/.
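The package itself is in R and its bias-correction machinery is not reproduced here; as a language-neutral illustration of the quantity being estimated, the sketch below computes the empirical volume under the ROC surface (VUS) for three ordered classes, assuming complete verification of disease status. Marker values are hypothetical.

```python
def empirical_vus(x1, x2, x3):
    """Empirical volume under the ROC surface for a three-class ordered
    diagnostic marker: the fraction of triples (one value per class)
    that are correctly ordered x1 < x2 < x3. Ties are counted as
    misorderings here, for simplicity."""
    count = sum(a < b < c for a in x1 for b in x2 for c in x3)
    return count / (len(x1) * len(x2) * len(x3))

# Perfectly separated classes give VUS = 1; chance level is 1/6.
non_diseased = [0.1, 0.4, 0.9]
intermediate = [1.2, 1.5]
diseased     = [2.0, 2.8]
print(empirical_vus(non_diseased, intermediate, diseased))  # 1.0
```

When some subjects lack a verified disease status, this naive estimator is exactly what becomes biased, which is the problem the package's corrected estimators address.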

  8. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  9. Verification of aspheric contact lens back surfaces.

    Science.gov (United States)

    Dietze, Holger H; Cox, Michael J; Douthwaite, William A

    2003-08-01

    To suggest a tolerance level for the degree of asphericity of aspheric rigid gas-permeable contact lenses and to find a simple method for its verification. Using existing tolerances for the vertex radius, tolerance limits for eccentricity and p value were calculated. A keratometer-based method and a method based on sag measurements were used to measure the vertex radius and eccentricity of eight concave progressively aspheric surfaces and six concave ellipsoidal surfaces. The results were compared with a gold standard measurement made using a high-precision mechanical instrument (Form Talysurf). The suggested tolerance for eccentricity and p value is +/-0.05. The keratometer method was very accurate and precise at measuring the vertex radius (mean deviation +/- SD from Talysurf results, -0.002 +/- 0.008 mm). The keratometer was more precise than, and similar in accuracy to, the sag method for measurement of asphericity (mean deviation of keratometer method results from Talysurf results, 0.017 +/- 0.018; mean deviation of sag method results from Talysurf results using five semichords, -0.016 +/- 0.032). Neither method was precise enough to verify the asphericity within the suggested tolerance. The keratometer can be efficiently used to verify the back vertex radius within its International Organization for Standardization tolerance and the back surface asphericity within an eccentricity/p value tolerance of +/-0.1. The method is poor for progressive aspheres with large edge blending zones. Deriving the eccentricity from sag measurements is a potential alternative if the mathematical description of the surface is known. The limiting factor of this method is the accuracy and precision of individual sag measurements.
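The sag method mentioned above relies on the conic sag relation y^2 = 2*r0*z - p*z^2 (sag z at semichord y, vertex radius r0, p value p = 1 - e^2), which is linear in r0 and p and can therefore be fit by least squares. The sketch below illustrates this under stated assumptions; the surface parameters and semichords are hypothetical, not the paper's data.

```python
import numpy as np

def fit_conic(y, z):
    """Least-squares estimate of (r0, p) from sag measurements,
    using the conic relation y^2 = 2*r0*z - p*z^2."""
    y, z = np.asarray(y, float), np.asarray(z, float)
    A = np.column_stack([2.0 * z, -z**2])       # y^2 = A @ [r0, p]
    (r0, p), *_ = np.linalg.lstsq(A, y**2, rcond=None)
    return r0, p

# Synthetic check with a typical RGP back surface (hypothetical values):
r0_true, p_true = 7.80, 0.80                    # mm, dimensionless
y = np.array([1.0, 2.0, 3.0, 4.0, 4.5])         # semichords, mm
z = y**2 / (r0_true + np.sqrt(r0_true**2 - p_true * y**2))  # exact conic sags
r0, p = fit_conic(y, z)                          # recovers 7.80, 0.80
```

With real sag data the fit would inherit the measurement noise, which is exactly the limiting factor the abstract identifies.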

  10. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis for this standardisation work. The work addressed experience from several national and international research projects and contractual and field experience gained within the wind energy community on this matter. The work was wide ranging and addressed 'grey' areas of knowledge regarding existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; and anemometry and the influence of inclined flow. (au)
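Power performance measurement under IEC 61400-12 reduces measured (wind speed, power) samples to a power curve via the "method of bins": averaging within fixed wind-speed bins. The following is a minimal sketch of that reduction; the bin width and sample values are hypothetical, and normalizations required by the standard (e.g. air density correction) are omitted.

```python
import numpy as np

def method_of_bins(wind_speed, power, width=0.5):
    """Reduce (wind speed, power) samples to a power curve by averaging
    within fixed wind-speed bins, as in the IEC method of bins.
    Returns {bin centre: (bin-mean wind speed, bin-mean power)}."""
    ws, pw = np.asarray(wind_speed, float), np.asarray(power, float)
    idx = np.floor(ws / width).astype(int)      # bin index per sample
    curve = {}
    for k in np.unique(idx):
        m = idx == k
        curve[(k + 0.5) * width] = (ws[m].mean(), pw[m].mean())
    return curve

# Toy example with 1 m/s bins (hypothetical 10-minute averages):
curve = method_of_bins([4.1, 4.3, 5.2, 5.4], [100, 120, 200, 220], width=1.0)
# curve[4.5] -> (~4.2 m/s, 110 kW); curve[5.5] -> (~5.3 m/s, 210 kW)
```

Verification of a power curve then amounts to comparing such binned measurements against the guaranteed curve, which is where the complex-terrain issues discussed in the report arise.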

  11. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  12. An index of floodplain surface complexity

    Science.gov (United States)

    Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.

    2016-01-01

    Floodplain surface topography is an important component of floodplain ecosystems. It is the primary physical template upon which ecosystem processes are acted out, and complexity in this template can contribute to the high biodiversity and productivity of floodplain ecosystems. There has been a limited appreciation of floodplain surface complexity because of the traditional focus on temporal variability in floodplains as well as limitations to quantifying spatial complexity. An index of floodplain surface complexity (FSC) is developed in this paper and applied to eight floodplains from different geographic settings. The index is based on two key indicators of complexity, variability in surface geometry (VSG) and the spatial organisation of surface conditions (SPO), and was determined at three sampling scales. FSC, VSG, and SPO varied between the eight floodplains and these differences depended upon sampling scale. Relationships between these measures of spatial complexity and seven geomorphological and hydrological drivers were investigated. There was a significant decline in all complexity measures with increasing floodplain width, which was explained by either a power, logarithmic, or exponential function. There was an initial rapid decline in surface complexity as floodplain width increased from 1.5 to 5 km, followed by little change in floodplains wider than 10 km. VSG also increased significantly with increasing sediment yield. No significant relationships were determined between any of the four hydrological variables and floodplain surface complexity.
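The abstract names the two ingredients of the FSC index, variability in surface geometry (VSG) and the spatial organisation of surface conditions (SPO), without giving their formulations. As an illustration only, the sketch below computes simple stand-in proxies on a gridded elevation model: the elevation standard deviation for VSG and Moran's I spatial autocorrelation for SPO. These proxies and all grid values are assumptions, not the paper's actual index.

```python
import numpy as np

def elevation_variability(dem):
    """Stand-in proxy for VSG: standard deviation of grid elevations."""
    return float(np.std(dem))

def morans_i(dem):
    """Stand-in proxy for SPO: Moran's I spatial autocorrelation with
    rook (4-neighbour) contiguity. Values near +1 indicate a smoothly
    organised surface, near 0 a random one, negative an alternating
    (checkerboard-like) relief."""
    g = np.asarray(dem, float)
    x = g - g.mean()
    num, W = 0.0, 0
    rows, cols = g.shape
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < rows and 0 <= b < cols:
                    num += x[i, j] * x[a, b]
                    W += 1
    return (g.size / W) * num / (x**2).sum()

# A smooth ramp is highly organised; a checkerboard is maximally alternating.
ramp = np.tile(np.arange(6.0), (6, 1))
checker = np.indices((6, 6)).sum(axis=0) % 2
print(morans_i(ramp) > 0, morans_i(checker) < 0)  # True True
```

Any real complexity index would also need the multi-scale sampling the paper emphasizes; these proxies only show the kind of quantities that VSG- and SPO-style measures capture.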

  13. Machining of Complex Sculptured Surfaces

    CERN Document Server

    2012-01-01

    The machining of complex sculptured surfaces is a global technological topic in modern manufacturing, with relevance in both industrialized and emerging countries, particularly within the moulds and dies sector, whose applications include highly technological industries such as the automotive and aircraft industry. Machining of Complex Sculptured Surfaces considers new approaches to the manufacture of moulds and dies within these industries. The traditional technology employed in the manufacture of moulds and dies combined conventional milling and electro-discharge machining (EDM), but this has been replaced with high-speed milling (HSM), which has been applied in roughing, semi-finishing and finishing of moulds and dies with great success. Machining of Complex Sculptured Surfaces provides recent information on machining of complex sculptured surfaces including modern CAM systems and process planning for three and five axis machining as well as explanations of the advantages of HSM over traditional methods ra...

  14. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  15. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  16. Removal of arsenate by ferrihydrite via surface complexation and surface precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Xiuli [Department of Environment Engineering, College of the Environment and Ecology, and The Key Laboratory of the Ministry of Education for Coastal and Wetland Ecosystem, Xiamen University, Xiamen (China); Department of Chemical and Biochemical Engineering, College of Chemistry and Chemical Engineering, and The Key Laboratory for Synthetic Biotechnology of Xiamen City, Xiamen University, Xiamen (China); Peng, Changjun; Fu, Dun; Chen, Zheng [Department of Environment Engineering, College of the Environment and Ecology, and The Key Laboratory of the Ministry of Education for Coastal and Wetland Ecosystem, Xiamen University, Xiamen (China); Shen, Liang [Department of Chemical and Biochemical Engineering, College of Chemistry and Chemical Engineering, and The Key Laboratory for Synthetic Biotechnology of Xiamen City, Xiamen University, Xiamen (China); Li, Qingbiao [Department of Environment Engineering, College of the Environment and Ecology, and The Key Laboratory of the Ministry of Education for Coastal and Wetland Ecosystem, Xiamen University, Xiamen (China); Department of Chemical and Biochemical Engineering, College of Chemistry and Chemical Engineering, and The Key Laboratory for Synthetic Biotechnology of Xiamen City, Xiamen University, Xiamen (China); Ouyang, Tong, E-mail: yz3t@xmu.edu.cn [Department of Environment Engineering, College of the Environment and Ecology, and The Key Laboratory of the Ministry of Education for Coastal and Wetland Ecosystem, Xiamen University, Xiamen (China); Wang, Yuanpeng, E-mail: wypp@xmu.edu.cn [Department of Chemical and Biochemical Engineering, College of Chemistry and Chemical Engineering, and The Key Laboratory for Synthetic Biotechnology of Xiamen City, Xiamen University, Xiamen (China)

    2015-10-30

    Highlights: • Surface complexation and surface precipitation of As on ferrihydrite happen at pH 3-6. • The formation of surface precipitates enhanced As(V) adsorption. • The dissolved Fe³⁺ had a good linear relationship with the amount of arsenate re-adsorption. Abstract: In this study, macroscopic and spectroscopic experimental methods accurately modeled the sorption process of arsenate on ferrihydrite. EXAFS, X-ray diffraction and infrared (IR) spectroscopy indicated that the adsorption of As(V) onto ferrihydrite took place mainly via surface complexation and surface precipitation at acidic pH (3.0-6.0), while surface precipitation dominated at longer time intervals and higher Fe³⁺ concentrations. A macroscopic competitive adsorption experiment between arsenate and phosphate indicated two types of adsorption sites on the surface of ferrihydrite: non-exchangeable sites, which are responsible for rapid surface complex formation, and exchangeable sites, responsible for a slow build-up of surface precipitates. In the slowly built-up precipitates, the As(V) surface coverage (mmol/g) exhibited a good linear relationship (R² = 0.952) with the amount of dissolved Fe³⁺. Three steps are involved in the process of surface precipitation: (1) an initial uptake of As(V) via surface complexation; (2) re-adsorption of Fe³⁺ leached from ferrihydrite onto the surface complex; and (3) As(V) adsorption via surface complexation again, finally forming the surface precipitate.

  17. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  18. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: converting nuclear-weapons production complexes, eliminating and monitoring nuclear-weapons delivery systems, disabling and destroying nuclear warheads, demilitarizing or non-military utilization of special nuclear materials, and inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  19. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  20. Complexation of carboxylate on smectite surfaces.

    Science.gov (United States)

    Liu, Xiandong; Lu, Xiancai; Zhang, Yingchun; Zhang, Chi; Wang, Rucheng

    2017-07-19

    We report a first principles molecular dynamics (FPMD) study of carboxylate complexation on clay surfaces. By taking acetate as a model carboxylate, we investigate its inner-sphere complexes adsorbed on clay edges (including (010) and (110) surfaces) and in interlayer space. Simulations show that acetate forms stable monodentate complexes on edge surfaces and a bidentate complex with Ca²⁺ in the interlayer region. The free energy calculations indicate that the complexation on edge surfaces is slightly more stable than in interlayer space. By integrating pKa values and desorption free energies of Al-coordinated water calculated previously (X. Liu, X. Lu, E. J. Meijer, R. Wang and H. Zhou, Geochim. Cosmochim. Acta, 2012, 81, 56-68; X. Liu, J. Cheng, M. Sprik, X. Lu and R. Wang, Geochim. Cosmochim. Acta, 2014, 140, 410-417), the pH dependence of acetate complexation has been revealed. It shows that acetate forms inner-sphere complexes on (110) in a very limited mildly acidic pH range while it can complex on (010) in the whole common pH range. The results presented in this study form a physical basis for understanding the geochemical processes involving clay-organics interactions.

  1. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the potential for human error in a complex, high-stress situation.

  2. WE-F-16A-06: Using 3D Printers to Create Complex Phantoms for Dose Verification, Quality Assurance, and Treatment Planning System Commissioning in Radiotherapy

    International Nuclear Information System (INIS)

    Kassaee, A; Ding, X; McDonough, J; Reiche, M; Witztum, A; Teo, B

    2014-01-01

    Purpose: To use 3D printers to design and construct complex geometrical phantoms for commissioning treatment planning systems, dose calculation algorithms, quality assurance (QA), dose delivery, and patient dose verification. Methods: In radiotherapy, complex geometrical phantoms are often required for dose verification, dose delivery and calculation algorithm validation. Presently, fabrication of customized phantoms is limited due to time, expense and challenges in machining of complex shapes. In this work, we designed and utilized 3D printers to fabricate two phantoms for QA purposes. One phantom includes hills and valleys (HV) for verification of intensity modulated radiotherapy for photons and protons (IMRT and IMPT). The other phantom includes cylindrical cavities (CC) of various sizes for dose verification of inhomogeneities. We evaluated the HV phantom for an IMPT beam, and the CC phantom to study various inhomogeneity configurations using photon, electron, and proton beams. Gafchromic™ films were used to quantify the dose distributions delivered to the phantoms. Results: The HV phantom has dimensions of 12 cm × 12 cm and consists of one row and one column of five peaks with heights ranging from 2 to 5 cm. The CC phantom has dimensions of 10 cm × 14 cm and includes 6 cylindrical cavities with length of 7.2 cm and diameters ranging from 0.6 to 1.2 cm. The IMPT evaluation using the HV phantom shows good agreement with the dose distribution calculated by the treatment planning system. The CC phantom also shows reasonable agreement among the different algorithms for each beam modality. Conclusion: 3D printers with submillimeter resolution are capable of printing complex phantoms for dose verification and QA in radiotherapy. As printing costs decrease and the technology becomes widely available, phantom design and construction will be readily available to any clinic for testing geometries that were not previously feasible.

  3. Failure of Cleaning Verification in Pharmaceutical Industry Due to Uncleanliness of Stainless Steel Surface.

    Science.gov (United States)

    Haidar Ahmad, Imad A; Blasko, Andrei

    2017-08-11

    The aim of this work is to identify the parameters that affect the recovery of pharmaceutical residues from the surface of stainless steel coupons. A series of factors were assessed, including drug product spike levels, spiking procedure, drug-excipient ratios, analyst-to-analyst variability, intraday variability, and cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of cleaning the surface of the coupons with clean-in-place solutions (CIP) gave high recovery (>90%) and reproducible results (Srel≤4%) regardless of the conditions that were assessed previously. The approach was successfully applied for cleaning verification of small molecules (MW <1,000 Da) as well as large biomolecules (MW up to 50,000 Da).

  4. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until encountered by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  5. Surface coatings as xenon diffusion barriers on plastic scintillators : Improving Nuclear-Test-Ban Treaty verification

    OpenAIRE

    Bläckberg, Lisa

    2011-01-01

    This thesis investigates the ability of transparent surface coatings to reduce xenon diffusion into plastic scintillators. The motivation for the work is improved radioxenon monitoring equipment, used within the framework of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty. A large part of the equipment used in this context incorporates plastic scintillators which are in direct contact with the radioactive gas to be detected. One problem with such a setup is that radioxenon...

  6. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K[D] approach has been the method of choice due to ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The model currently supports the K[D] and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given.
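GEOQUIMICO itself is a Java tool and its internals are not reproduced here; to illustrate why the two conceptual models diverge, the sketch below compares a linear K[D] isotherm against a minimal one-site surface complexation isotherm in Langmuir form. At trace concentrations the two agree (with Kd = Smax*K), but the finite-site model saturates as sites fill, which is where a constant-Kd description breaks down. All constants are hypothetical.

```python
import numpy as np

# Minimal comparison of sorption conceptual models (illustrative numbers):
#   linear K_D isotherm:                 s = Kd * c
#   one-site SCM (Langmuir form):        s = Smax * K * c / (1 + K * c)
K, Smax = 50.0, 1.0e-4        # site-binding constant (L/mol), site density (mol/kg)
Kd = Smax * K                 # trace-level equivalence: Kd = Smax * K

c = np.logspace(-8, -1, 8)    # aqueous concentration, mol/L
s_kd = Kd * c                                   # sorbed, linear model
s_scm = Smax * K * c / (1.0 + K * c)            # sorbed, finite-site model

# The ratio tends to 1 at low c and falls toward 0 as sites saturate,
# marking the regime where the K_D approach stops being defensible.
ratio = s_scm / s_kd
```

In a transport code the difference propagates into the retardation of the contaminant plume, which is the kind of regulatory-relevant discrepancy the comparison tool is meant to expose.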

  7. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    Science.gov (United States)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  8. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an early stage, when the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  9. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare, for reasons that go beyond the sheer scale of the verification effort required by their size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  10. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
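The flavour of such a monitor can be conveyed in a few lines. The sketch below is an assumed simplification, not the paper's FLTL construction: it maintains a single set of pending obligations for the property G(request -> F grant) and updates it by forward-chaining two rules on each event of the expanding trace.

```python
# Assumed simplification of a rule-based runtime monitor for
# G(request -> F grant): one state (a set of obligations), two rules.

def make_monitor():
    pending = set()  # outstanding "eventually grant" obligations
    def step(event):
        if event == "request":   # rule 1: a request creates an obligation
            pending.add("grant")
        if event == "grant":     # rule 2: a grant discharges the obligation
            pending.discard("grant")
        # verdict for the finite trace seen so far
        return "pending" if pending else "ok"
    return step

monitor = make_monitor()
print([monitor(e) for e in ["idle", "request", "idle", "grant"]])
# → ['ok', 'pending', 'pending', 'ok']
```

Because the monitor keeps only the obligation set, its size stays fixed regardless of trace length, mirroring the single-state, fixed-rule-count scalability claimed in the abstract, in contrast to the branching growth of tableaux-based monitors.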

  11. The effect of two complexity factors on the performance of emergency tasks-An experimental verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Jung, Kwangtae

    2008-01-01

    It is well known that the use of procedures is very important in securing the safety of process systems, since good procedures effectively guide human operators by providing 'what should be done' and 'how to do it', especially under stressful conditions. At the same time, it has been emphasized that the use of complicated procedures could drastically impair operators' performance. This means that a systematic approach that can properly evaluate the complexity of procedures is indispensable for minimizing the side effects of complicated procedures. For this reason, Park et al. have developed a task complexity measure called TACOM that can be used to quantify the complexity of tasks stipulated in emergency operating procedures (EOPs) of nuclear power plants (NPPs). The TACOM measure consists of five sub-measures that cover five important factors making the performance of emergency tasks complicated. However, the verification activity for two of these complexity factors - the level of abstraction hierarchy (AH) and engineering decision (ED) - seems to be insufficient. In this study, therefore, an experiment is conducted using a low-fidelity simulator in order to clarify the appropriateness of these complexity factors. As a result, it appears that subjects' performance data are affected by the level of AH as well as by ED. It is therefore anticipated that both the level of AH and ED will play an important role in evaluating the complexity of EOPs.

  12. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end, design verification, although limited to the safeguards aspects of the plant, must be a systematic activity which starts during the design phase, continues during the construction phase, and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards-related activity and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete ''design verification'' strategy comprises: informing the Agency of any changes in the plant system which are defined as ''safeguards relevant''; and reverification by the Agency, upon receiving notice from the operator of any change, of the ''design information''. 13 refs

  13. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  14. Holographic subregion complexity for singular surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Bakhshaei, Elaheh [Isfahan University of Technology, Department of Physics, Isfahan (Iran, Islamic Republic of); Mollabashi, Ali [Institute for Research in Fundamental Sciences (IPM), School of Physics, Tehran (Iran, Islamic Republic of); Shirzad, Ahmad [Isfahan University of Technology, Department of Physics, Isfahan (Iran, Islamic Republic of); Institute for Research in Fundamental Sciences (IPM), School of Particles and Accelerators, Tehran (Iran, Islamic Republic of)

    2017-10-15

    Recently holographic prescriptions were proposed to compute the quantum complexity of a given state in the boundary theory. A specific proposal known as 'holographic subregion complexity' is supposed to calculate the complexity of a reduced density matrix corresponding to a static subregion. We study different families of singular subregions in the dual field theory and find the divergence structure and universal terms of holographic subregion complexity for these singular surfaces. We find that there are new universal terms, logarithmic in the UV cut-off, due to the singularities of a family of surfaces including a kink in (2 + 1) dimensions and cones in even dimensional field theories. We also find examples of new divergent terms such as squared logarithm and negative powers times the logarithm of the UV cut-off parameter. (orig.)

  15. The interaction between surface color and color knowledge: behavioral and electrophysiological evidence.

    Science.gov (United States)

    Bramão, Inês; Faísca, Luís; Forkstam, Christian; Inácio, Filomena; Araújo, Susana; Petersson, Karl Magnus; Reis, Alexandra

    2012-02-01

    In this study, we used event-related potentials (ERPs) to evaluate the contribution of surface color and color knowledge information in object identification. We constructed two color-object verification tasks - a surface and a knowledge verification task - using high color diagnostic objects; both typical and atypical color versions of the same object were presented. Continuous electroencephalogram was recorded from 26 subjects. A cluster randomization procedure was used to explore the differences between typical and atypical color objects in each task. In the color knowledge task, we found two significant clusters that were consistent with the N350 and late positive complex (LPC) effects. Atypical color objects elicited more negative ERPs compared to typical color objects. The color effect found in the N350 time window suggests that surface color is an important cue that facilitates the selection of a stored object representation from long-term memory. Moreover, the observed LPC effect suggests that surface color activates associated semantic knowledge about the object, including color knowledge representations. We did not find any significant differences between typical and atypical color objects in the surface color verification task, which indicates that there is little contribution of color knowledge to resolve the surface color verification. Our main results suggest that surface color is an important visual cue that triggers color knowledge, thereby facilitating object identification. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Challenges in the Verification of Reinforcement Learning Algorithms

    Science.gov (United States)

    Van Wesel, Perry; Goodloe, Alwyn E.

    2017-01-01

    Machine learning (ML) is increasingly being applied to a wide array of domains from search engines to autonomous vehicles. These algorithms, however, are notoriously complex and hard to verify. This work looks at the assumptions underlying machine learning algorithms as well as some of the challenges in trying to verify ML algorithms. Furthermore, we focus on the specific challenges of verifying reinforcement learning algorithms. These are highlighted using a specific example. Ultimately, we do not offer a solution to the complex problem of ML verification, but point out possible approaches for verification and interesting research opportunities.

  17. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
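The redundancy mechanism whose insertion must be confirmed is itself simple. A generic bitwise 2-of-3 majority voter (a textbook sketch, not the proposed verification method) looks like:

```python
def tmr_vote(a, b, c):
    """Bitwise 2-of-3 majority voter: each output bit agrees with at least
    two of the three redundant inputs, masking any single-module fault."""
    return (a & b) | (b & c) | (a & c)

# A single corrupted copy is outvoted:
print(tmr_vote(0b1010, 0b1010, 0b0110))  # → 10 (0b1010)
```

Verifying TMR insertion means confirming not only that such voters are present, but also that downstream synthesis and optimization have not collapsed the triplicated logic back into a single copy, which is a common way improper insertion silently defeats the protection.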

  18. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  19. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    Full Text Available The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate a design in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty in this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioner decompose a large design into sub-components and feed them to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. Various simulation and verification tools comprising IDV are evaluated and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental result shows that our approach is a very promising design validation method.

  20. Near-surface monitoring strategies for geologic carbon dioxide storage verification

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M.; Lewicki, Jennifer L.; Hepple, Robert P.

    2003-10-31

    Geologic carbon sequestration is the capture of anthropogenic carbon dioxide (CO2) and its storage in deep geologic formations. Geologic CO2 storage verification will be needed to ensure that CO2 is not leaking from the intended storage formation and seeping out of the ground. Because the ultimate failure of geologic CO2 storage occurs when CO2 seeps out of the ground into the atmospheric surface layer, and because elevated concentrations of CO2 near the ground surface can cause health, safety, and environmental risks, monitoring will need to be carried out in the near-surface environment. The detection of a CO2 leakage or seepage signal (LOSS) in the near-surface environment is challenging because there are large natural variations in CO2 concentrations and fluxes arising from soil, plant, and subsurface processes. The term leakage refers to CO2 migration away from the intended storage site, while seepage is defined as CO2 passing from one medium to another, for example across the ground surface. The flow and transport of CO2 at high concentrations in the near-surface environment will be controlled by its high density, low viscosity, and high solubility in water relative to air. Numerical simulations of leakage and seepage show that CO2 concentrations can reach very high levels in the shallow subsurface even for relatively modest CO2 leakage fluxes. However, once CO2 seeps out of the ground into the atmospheric surface layer, surface winds are effective at dispersing CO2 seepage. In natural ecological systems with no CO2 LOSS, near-surface CO2 fluxes and concentrations are controlled by CO2 uptake by photosynthesis, and production by root respiration, organic carbon biodegradation in soil, deep outgassing of CO2, and by exchange of CO2 with the atmosphere. Existing technologies available for monitoring CO2 in the near-surface environment

  1. Experimental verification of active IR stealth technology by controlling the surface temperature using a thermoelectric element

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Geon; Han, Kuk Il; Choi, Jun Hyuk; Kim, Tae Kuk [Dept. of Mechanical Engineering, Chung Ang University, Seoul (Korea, Republic of)

    2016-10-15

    In this paper, we propose a technique for IR low-observability that uses an active IR signal tuning through the real time control of the object surface temperature according to the varying background environment. This is achieved by applying the proper object surface temperature obtained to result in the minimum radiance difference between the object and the background. Experimental verification by using the thermoelectric temperature control element shows that the IR radiance contrast between the object and the background can be reduced up to 99% during the night and up to 95% during the day time as compared to the un-tuned original radiance contrast values. The stealth technology demonstrated in this paper may be applied for many military systems needed for the IR stealth performance when a suitable temperature control unit is developed.

  2. Experimental verification of active IR stealth technology by controlling the surface temperature using a thermoelectric element

    International Nuclear Information System (INIS)

    Kim, Dong Geon; Han, Kuk Il; Choi, Jun Hyuk; Kim, Tae Kuk

    2016-01-01

    In this paper, we propose a technique for IR low-observability that uses an active IR signal tuning through the real time control of the object surface temperature according to the varying background environment. This is achieved by applying the proper object surface temperature obtained to result in the minimum radiance difference between the object and the background. Experimental verification by using the thermoelectric temperature control element shows that the IR radiance contrast between the object and the background can be reduced up to 99% during the night and up to 95% during the day time as compared to the un-tuned original radiance contrast values. The stealth technology demonstrated in this paper may be applied for many military systems needed for the IR stealth performance when a suitable temperature control unit is developed

  3. Alkali-crown ether complexes at metal surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Thontasen, Nicha; Deng, Zhitao; Rauschenbach, Stephan [Max Planck Institute for Solid State Research, Stuttgart (Germany); Levita, Giacomo [University of Trieste, Trieste (Italy); Malinowski, Nikola [Max Planck Institute for Solid State Research, Stuttgart (Germany); Bulgarian Academy of Sciences, Sofia (Bulgaria); Kern, Klaus [Max Planck Institute for Solid State Research, Stuttgart (Germany); EPFL, Lausanne (Switzerland)

    2010-07-01

    Crown ethers are polycyclic ethers which, in solution, selectively bind cations depending on the size of the ring cavity. The study of a single host-guest complex is highly desirable in order to reveal the characteristics of these specific interactions at the atomic scale. Such detailed investigation is possible at a surface, where high resolution imaging tools like scanning tunneling microscopy (STM) can be applied. Here, electrospray ion beam deposition (ES-IBD) is employed for the deposition of Dibenzo-24-crown-8 (DB24C8)-H+, -Na+ and -Cs+ complexes on a solid surface in ultrahigh vacuum (UHV). Where other deposition techniques have not been successful, this deposition technique combines the advantages of solution-based preparation of the complex ions with a highly clean and controlled deposition in UHV. Single molecular structures and the cation binding of DB24C8 at the surface are studied in situ by STM and MALDI-MS (matrix-assisted laser desorption ionization mass spectrometry). The internal structure of the complex, i.e. ring and cavity, is observable only when alkali cations are incorporated. The DB24C8-H+ complex, in contrast, appears as a compact feature. This result is in good agreement with theoretical models based on density functional theory calculations.

  4. Multi-centre audit of VMAT planning and pre-treatment verification.

    Science.gov (United States)

    Jurado-Bruggeman, Diego; Hernández, Victor; Sáez, Jordi; Navarro, David; Pino, Francisco; Martínez, Tatiana; Alayrach, Maria-Elena; Ailleres, Norbert; Melero, Alejandro; Jornet, Núria

    2017-08-01

    We performed a multi-centre intercomparison of VMAT dose planning and pre-treatment verification. The aims were to analyse the dose plans in terms of dosimetric quality and deliverability, and to validate whether in-house pre-treatment verification results agreed with those of an external audit. The nine participating centres encompassed different machines, equipment, and methodologies. Two mock cases (prostate and head and neck) were planned using one and two arcs. A plan quality index was defined to compare the plans, and different complexity indices were calculated to check their deliverability. We compared gamma index pass rates obtained with each centre's equipment and methodology to those of an external audit (global 3D gamma, absolute dose differences, 10% of maximum dose threshold). Log-file analysis was performed to look for delivery errors. All centres fulfilled the dosimetric goals, but plan quality and delivery complexity were heterogeneous and uncorrelated, depending on the manufacturer and the planner's methodology. Pre-treatment verification results were within tolerance in all cases for the gamma 3%-3mm evaluation. Nevertheless, differences between the external audit and in-house measurements arose due to different equipment or methodology, especially for the 2%-2mm criteria, with differences up to 20%. No correlation was found between complexity indices and verification results amongst centres. All plans fulfilled dosimetric constraints, but plan quality and complexity did not correlate and were strongly dependent on the planner and the vendor. In-house measurements cannot completely replace external audits for credentialing. Copyright © 2017 Elsevier B.V. All rights reserved.
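The gamma criterion behind these pass rates can be illustrated with a deliberately simplified 1-D, non-interpolating implementation (the dose profiles below are hypothetical; clinical QA systems evaluate interpolated 2-D/3-D dose grids):

```python
import math

# Simplified 1-D global gamma index (3% of max dose / 3 mm), no interpolation.
# Hypothetical profiles; real pre-treatment QA uses 2-D/3-D measured dose.

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.03, dta_mm=3.0):
    d_norm = dd * max(ref)  # global criterion: % of maximum reference dose
    passed = 0
    for i, d_ref in enumerate(ref):
        best = math.inf     # minimise gamma^2 over all measured points
        for j, d_meas in enumerate(meas):
            dist_mm = (i - j) * spacing_mm
            g2 = (dist_mm / dta_mm) ** 2 + ((d_meas - d_ref) / d_norm) ** 2
            best = min(best, g2)
        passed += best <= 1.0
    return 100.0 * passed / len(ref)

ref  = [0.0, 0.5, 1.0, 2.0, 1.0, 0.5, 0.0]
meas = [0.0, 0.5, 1.1, 2.0, 0.9, 0.5, 0.0]
print(f"pass rate: {gamma_pass_rate(ref, meas, spacing_mm=1.0):.0f}%")
```

Tightening the criteria to 2%-2mm shrinks both tolerances in the gamma denominator, which helps explain why the discrepancies between in-house and audit measurements surfaced mainly at the stricter setting.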

  5. Tree dimension in verification of constrained Horn clauses

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    2018-01-01

    In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero c...... algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results....
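The dimension in question is the Strahler number of the derivation tree: a leaf has dimension 0, and an internal node takes the largest child dimension, incremented when at least two children attain it. Assuming a simple (label, children) tuple encoding (the encoding, not the measure, is our assumption), it can be computed as:

```python
def tree_dimension(tree):
    """Strahler-style dimension of a tree encoded as (label, children)."""
    _label, children = tree
    if not children:
        return 0  # a leaf has dimension 0
    dims = sorted((tree_dimension(c) for c in children), reverse=True)
    # bump the dimension only when the maximum is attained at least twice
    return dims[0] + 1 if len(dims) > 1 and dims[0] == dims[1] else dims[0]

chain = ("p", [("q", [("r", [])])])             # linear derivation
full  = ("p", [("q", [("a", []), ("b", [])]),
               ("r", [("c", []), ("d", [])])])  # balanced binary derivation
print(tree_dimension(chain), tree_dimension(full))  # → 0 2
```

Linear derivations thus have dimension 0 while heavily branching derivations climb higher, which is the branching-complexity reading of dimension used in the abstract.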

  6. Surface-complexation models for sorption onto heterogeneous surfaces

    International Nuclear Information System (INIS)

    Harvey, K.B.

    1997-10-01

    This report provides a description of the discrete-logK spectrum model, together with a description of its derivation, and of its place in the larger context of surface-complexation modelling. The tools necessary to apply the discrete-logK spectrum model are discussed, and background information appropriate to this discussion is supplied as appendices. (author)

  7. CFD code verification and the method of manufactured solutions

    International Nuclear Information System (INIS)

    Pelletier, D.; Roache, P.J.

    2002-01-01

    This paper presents the Method of Manufactured Solutions (MMS) for CFD code verification. The MMS provides benchmark solutions for direct evaluation of the solution error. The best benchmarks are exact analytical solutions with sufficiently complex solution structure to ensure that all terms of the differential equations are exercised in the simulation. The MMS provides a straightforward and general procedure for generating such solutions. When used with systematic grid refinement studies, which are remarkably sensitive, the MMS provides strong code verification with a theorem-like quality. The MMS is first presented on simple 1-D examples. Manufactured solutions for more complex problems are then presented with sample results from grid convergence studies. (author)
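On the 1-D Poisson problem -u'' = f, the MMS recipe runs: pick a manufactured solution u(x) = sin(pi x), differentiate it analytically to obtain the source f(x) = pi^2 sin(pi x), feed f to the solver, and confirm by grid refinement that the observed convergence order matches the scheme's theoretical order of 2. A minimal sketch (not a production verification suite):

```python
import math

# MMS sketch: manufactured u(x) = sin(pi*x) solves -u'' = f on [0, 1] with
# u(0) = u(1) = 0 when f(x) = pi^2 * sin(pi*x). A second-order central
# difference solver must then converge to u at order 2 under grid refinement.

def max_error(n):
    h = 1.0 / n
    rhs = [h * h * math.pi ** 2 * math.sin(math.pi * i * h) for i in range(1, n)]
    # Thomas algorithm for the tridiagonal system (-1, 2, -1) u = h^2 f
    m = n - 1
    cp, dp = [0.0] * m, [0.0] * m
    cp[0], dp[0] = -0.5, rhs[0] / 2.0
    for i in range(1, m):
        denom = 2.0 + cp[i - 1]
        cp[i] = -1.0 / denom
        dp[i] = (rhs[i] + dp[i - 1]) / denom
    u = [0.0] * m
    u[-1] = dp[-1]
    for i in range(m - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(u[i] - math.sin(math.pi * (i + 1) * h)) for i in range(m))

e_coarse, e_fine = max_error(32), max_error(64)
print(f"observed order = {math.log(e_coarse / e_fine, 2):.2f}")  # close to 2
```

Because u is known exactly, the error is evaluated directly rather than against an external benchmark, and the grid-refinement order supplies the theorem-like evidence the paper describes: a bug that touches any discretized term degrades the observed order.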

  8. Compact complex surfaces with geometric structures related to split quaternions

    International Nuclear Information System (INIS)

    Davidov, Johann; Grantcharov, Gueo; Mushkarov, Oleg; Yotov, Miroslav

    2012-01-01

    We study the problem of existence of geometric structures on compact complex surfaces that are related to split quaternions. These structures, called para-hypercomplex, para-hyperhermitian and para-hyperkähler, are analogs of the hypercomplex, hyperhermitian and hyperkähler structures in the definite case. We show that a compact 4-manifold carries a para-hyperkähler structure iff it has a metric of split signature together with two parallel, null, orthogonal, pointwise linearly independent vector fields. Every compact complex surface admitting a para-hyperhermitian structure has vanishing first Chern class and we show that, unlike the definite case, many of these surfaces carry infinite-dimensional families of such structures. We provide also compact examples of complex surfaces with para-hyperhermitian structures which are not locally conformally para-hyperkähler. Finally, we discuss the problem of non-existence of para-hyperhermitian structures on Inoue surfaces of type S^0 and provide a list of compact complex surfaces which could carry para-hypercomplex structures.

  9. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice for ensuring the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modeling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
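The model-checking step at the end of such a methodology can be shown at toy scale. The sketch below is a generic explicit-state invariant checker (not the CERN intermediate model or NuSMV) applied to a hypothetical two-variable interlock program: breadth-first search either visits every reachable state without violating the invariant, or returns a counterexample trace.

```python
from collections import deque

# Generic explicit-state invariant checking (toy sketch, not NuSMV or the
# CERN IM toolchain). The interlock example and its states are hypothetical.

def check_invariant(initial, successors, invariant):
    """BFS over reachable states; returns a violating trace or None."""
    seen = {initial}
    queue = deque([(initial, (initial,))])
    while queue:
        state, trace = queue.popleft()
        if not invariant(state):
            return trace                      # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + (nxt,)))
    return None                               # invariant holds everywhere

# State = (motor_on, interlock_ok). Buggy program: when the interlock trips,
# an already-running motor is left running.
def successors(s):
    motor, inter = s
    return {(True, inter), (False, inter), (motor and inter, not inter)}

trace = check_invariant((False, True), successors,
                        lambda s: not (s[0] and not s[1]))
print("counterexample:", trace)  # ends in the unsafe state (True, False)
```

On real PLC programs the state space is far too large for such naive enumeration, which is why the methodology leans on an intermediate model feeding mature symbolic model checkers rather than explicit search.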

  10. New Route to Synthesize Surface Organometallic Complexes (SOMC): An Approach by Alkylating Halogenated Surface Organometallic Fragments

    KAUST Repository

    Hamieh, Ali Imad Ali

    2017-01-01

    The aim of this thesis is to explore new, simpler and more efficient routes for the preparation of surface organometallic complexes (SOMC) for the transformation of small organic molecules into valuable products. The key element in this new route relies on surface alkylation of various halogenated surface coordination complexes or organometallic fragments (SOMF).

  11. New Route to Synthesize Surface Organometallic Complexes (SOMC): An Approach by Alkylating Halogenated Surface Organometallic Fragments

    KAUST Repository

    Hamieh, Ali Imad

    2017-02-01

    The aim of this thesis is to explore new, simpler and more efficient routes for the preparation of surface organometallic complexes (SOMC) for the transformation of small organic molecules into valuable products. The key element in this new route relies on surface alkylation of various halogenated surface coordination complexes or organometallic fragments (SOMF).

  12. Hazardous Materials Verification and Limited Characterization Report on Sodium and Caustic Residuals in Materials and Fuel Complex Facilities MFC-799/799A

    Energy Technology Data Exchange (ETDEWEB)

    Gary Mecham

    2010-08-01

    This report is a companion to the Facilities Condition and Hazard Assessment for Materials and Fuel Complex Sodium Processing Facilities MFC-799/799A and Nuclear Calibration Laboratory MFC-770C (referred to as the Facilities Condition and Hazards Assessment). This report specifically responds to the requirement of Section 9.2, Item 6, of the Facilities Condition and Hazards Assessment to provide an updated assessment and verification of the residual hazardous materials remaining in the Sodium Processing Facilities processing system. The hazardous materials of concern are sodium and sodium hydroxide (caustic). The information supplied in this report supports the end-point objectives identified in the Transition Plan for Multiple Facilities at the Materials and Fuels Complex, Advanced Test Reactor, Central Facilities Area, and Power Burst Facility, as well as the deactivation and decommissioning critical decision milestone 1, as specified in U.S. Department of Energy Guide 413.3-8, “Environmental Management Cleanup Projects.” Using a tailored approach and based on information obtained through a combination of process knowledge, emergency management hazardous assessment documentation, and visual inspection, this report provides sufficient detail regarding the quantity of hazardous materials for the purposes of facility transfer; it also establishes that further characterization/verification of these materials is unnecessary.

  13. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  14. Uranium(VI) sorption onto magnetite. Increasing confidence in surface complexation models using chemically evident surface chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Bok, Frank [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Surface Processes

    2017-06-01

Surface complexation modeling has gone a long way toward describing the sorption of various radionuclides on naturally occurring mineral phases. Unfortunately, many of the published sorption parameter sets are built upon unrealistic or even incorrect surface chemistry. This work describes the benefit of combining spectroscopic and batch sorption experimental data to create a reliable and consistent surface complexation parameter set.

  15. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  16. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    International Nuclear Information System (INIS)

    Weaver, P.C.

    2009-01-01

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified 'hot spot' cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: (1) performing radiological walkover surveys, and (2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  17. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes on an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
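Reachability analysis of the kind the abstract mentions can be illustrated with a toy example: a rule base encoded as a Petri net whose transitions consume tokens for rule premises and produce tokens for conclusions. The sketch below is a generic breadth-first marking enumeration, not the enhanced colored Petri net of the paper; the rule and place names are invented for illustration.

```python
from collections import deque

def reachable_markings(places, transitions, initial):
    """Enumerate all markings reachable from `initial` by breadth-first search.

    `transitions` is a list of (consume, produce) pairs, each a dict mapping
    place name -> token count.  A transition may fire when every consumed
    place holds enough tokens.
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        marking = dict(zip(places, queue.popleft()))
        for consume, produce in transitions:
            if all(marking.get(p, 0) >= n for p, n in consume.items()):
                nxt = dict(marking)
                for p, n in consume.items():
                    nxt[p] -= n
                for p, n in produce.items():
                    nxt[p] = nxt.get(p, 0) + n
                state = tuple(nxt[p] for p in places)
                if state not in seen:
                    seen.add(state)
                    queue.append(state)
    return seen

# Toy rule base: "alarm_A and alarm_B together imply fault_X".
places = ("alarm_A", "alarm_B", "fault_X")
transitions = [({"alarm_A": 1, "alarm_B": 1}, {"fault_X": 1})]
markings = reachable_markings(places, transitions, (1, 1, 0))
# The conclusion fault_X is reachable from the two alarms.
assert (0, 0, 1) in markings
```

Enumerating reachable markings like this is what makes it possible to check whether a conclusion can ever be derived, and it is also why plain Petri nets blow up on large knowledge bases, motivating the colored-net enhancement.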

  18. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes on an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  19. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform the feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for the feature difference between an image pair...... than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, while there is in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public...... databases show that the proposed method combined with an SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods. © Springer International Publishing AG, Part of Springer Science+Business Media...

  20. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  1. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

The analysis becomes more complicated when the shape and phase of the ground below the seawater are considered; therefore, different approaches are required to precisely analyze the behavior of a tsunami. This paper introduces ongoing code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH), together with verification work on some practice simulations. The newly developed Lagrangian mesh-free SPH code so far covers the equations of motion and the heat conduction equation, and verification of each model is complete. In addition, parallel computation using GPUs is now possible, and a GUI has been prepared. Users can change the input geometry or input values to simulate various conditions and geometries. The SPH method has large advantages and potential in modeling free surfaces, highly deformable geometries, and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be extended much further, including to molten fuel behavior in severe accidents.
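As a flavor of the mesh-free approach, the sketch below evaluates the standard 1-D cubic-spline smoothing kernel and the SPH density summation. This is a generic textbook formulation, not code from the SNU development; the particle spacing and smoothing length are arbitrary illustrative values.

```python
def cubic_spline_kernel(r, h):
    """Standard 1-D cubic-spline smoothing kernel W(r, h) used in many SPH codes."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)          # 1-D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def density(x_i, positions, masses, h):
    """SPH density estimate: rho_i = sum_j m_j * W(x_i - x_j, h)."""
    return sum(m * cubic_spline_kernel(x_i - x_j, h)
               for x_j, m in zip(positions, masses))

# Uniform line of unit-mass particles: the estimate approaches m / dx = 10.
dx, h = 0.1, 0.13
xs = [i * dx for i in range(-20, 21)]
rho = density(0.0, xs, [1.0] * len(xs), h)
```

Because the kernel integrates to one, the density sum reproduces the expected value on a uniform particle distribution, which is a common first verification test for an SPH implementation.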

  2. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
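The TMR principle itself is simple to demonstrate: triplicate a function and majority-vote the outputs so that a fault in any single replica is masked. The sketch below is a behavioral illustration of that idea; the voter expression and injected-fault model are generic, not the paper's search-detect-and-verify tool.

```python
def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority voter: (a & b) | (a & c) | (b & c)."""
    return (a & b) | (a & c) | (b & c)

def tmr(func, x, fault_mask=0, faulty_replica=None):
    """Run three replicas of `func` and vote; optionally flip bits in one
    replica's output to model a single-event upset."""
    outs = [func(x) for _ in range(3)]
    if faulty_replica is not None:
        outs[faulty_replica] ^= fault_mask
    return majority_vote(*outs)

f = lambda x: (x << 1) & 0xFF   # arbitrary 8-bit combinational function
# A bit flip in any single replica is masked by the voter.
assert all(tmr(f, 0x2A, fault_mask=1 << b, faulty_replica=r) == f(0x2A)
           for b in range(8) for r in range(3))
```

Verifying a netlist-level TMR insertion is what the paper targets: confirming that every protected path really has this triplicate-and-vote structure, without equivalence between replicas having been optimized away.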

  3. Solving complex and disordered surface structures with electron diffraction

    International Nuclear Information System (INIS)

    Van Hove, M.A.

    1987-10-01

    The past of surface structure determination with low-energy electron diffraction (LEED) will be briefly reviewed, setting the stage for a discussion of recent and future developments. The aim of these developments is to solve complex and disordered surface structures. Some efficient solutions to the theoretical and experimental problems will be presented. Since the theoretical problems dominate, the emphasis will be on theoretical approaches to the calculation of the multiple scattering of electrons through complex and disordered surfaces. 49 refs., 13 figs., 1 tab

  4. Characterization of surface complexes in enhanced Raman scattering

    International Nuclear Information System (INIS)

    Roy, D.; Furtak, T.E.

    1984-01-01

An indicator molecule, para-nitrosodimethylaniline (p-NDMA), has been used to study the chemical nature of surface complexes involving the active site for SERS in the electrochemical environment. We present evidence for positively charged Ag atoms stabilized by coadsorbed Cl⁻ ions as the primary sites produced during the oxidation-reduction cycle treatment of an Ag electrode. Depending on the relative number of Cl⁻ ions that influence the Ag site, the active site demonstrates a greater or lesser electron-accepting character toward p-NDMA. This character is influenced by the applied voltage and by the presence of Tl⁺ ions in the bulk of the solution near the surface. As in previously studied systems, p-NDMA/Cl⁻/Ag complexes demonstrate charge-transfer excitation, which in this case is from the p-NDMA to the Ag site. These results further solidify the importance of complex formation in electrochemical SERS and suggest that caution should be applied when using SERS as a quantitative measure of surface coverage.

  5. DIMENSIONAL VERIFICATION AND QUALITY CONTROL OF IMPLANTS PRODUCED BY ADDITIVE MANUFACTURING

    Directory of Open Access Journals (Sweden)

    Teodor Toth

    2015-07-01

Full Text Available Purpose: Development of computer technology and alternative manufacturing methods in the form of additive manufacturing leads to the manufacture of products with complex shapes. In the field of medicine they include, inter alia, custom-made implants manufactured for a particular patient, such as cranial implants, maxillofacial implants, etc. With regard to the fact that such implants are inserted into a patient’s body, it is necessary to perform verification, including shape and dimensional verification. The article deals with the application of industrial computed tomography within the process of inspection and verification of selected custom-made implant types. Methodology/Approach: The Department of Biomedical Engineering and Measurement performs the verification of medicinal products manufactured by additive manufacturing technologies from the Ti-6Al-4V (Grade 5 titanium alloy, using the coordinate measuring machine Carl Zeiss Contura G2 and the industrial computed tomography machine Carl Zeiss Metrotom 1500. This equipment fulfils the requirements for the identification and evaluation of dimensions of both the external and the internal structures. Findings: The article presents the possibilities of computed tomography utilisation in the inspection of individual implant manufacture using additive manufacturing technologies. The results indicate that with the adjustment of appropriate input parameters (alignment, this technology is appropriate for the analysis of shape deviations, when compared with the CAD model. Research Limitation/implication: With increasing distance of the measured object from the X-ray source, the machine’s resolution function decreases. Decreasing resolution has a minor impact on the measured dimensions (relatively high tolerances, but has a significant impact on the evaluation of porosity and inclusions. Originality/Value of paper: Currently, the verification of a manufactured implant can be

  6. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  7. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academic (26%), small & large industry (47%) and government agency (27%).

  8. Inventory of present verification techniques. Viewpoint of EURATOM

    International Nuclear Information System (INIS)

    Kloeckner, W.; Eecken, D. Van der; Gmelin, W.

    1998-01-01

Starting from the role of Euratom as an established regional safeguards system, an overview is given of verification techniques currently practised by Euratom. Against the backdrop of a rapidly changing and complex international safeguards scene, Euratom considers that it has an important role to play. Bearing in mind the possibilities created by rapidly advancing technology, recommendations are given for an enhanced use of technological means in safeguards. The viewpoint of Euratom is that the majority of methodologies and techniques in place may very well be copied to or used for a cut-off verification system currently under discussion.

  9. Surface complexation modeling of zinc sorption onto ferrihydrite.

    Science.gov (United States)

    Dyer, James A; Trivedi, Paras; Scrivner, Noel C; Sparks, Donald L

    2004-02-01

A previous study involving lead(II) [Pb(II)] sorption onto ferrihydrite over a wide range of conditions highlighted the advantages of combining molecular- and macroscopic-scale investigations with surface complexation modeling to predict Pb(II) speciation and partitioning in aqueous systems. In this work, an extensive collection of new macroscopic and spectroscopic data was used to assess the ability of the modified triple-layer model (TLM) to predict single-solute zinc(II) [Zn(II)] sorption onto 2-line ferrihydrite in NaNO₃ solutions as a function of pH, ionic strength, and concentration. Regression of constant-pH isotherm data, together with potentiometric titration and pH edge data, was a much more rigorous test of the modified TLM than fitting pH edge data alone. When coupled with valuable input from spectroscopic analyses, good fits of the isotherm data were obtained with a one-species, one-Zn-sorption-site model using the bidentate-mononuclear surface complex (≡FeO)₂Zn; however, surprisingly, both the density of Zn(II) sorption sites and the value of the best-fit equilibrium "constant" for the bidentate-mononuclear complex had to be adjusted with pH to adequately fit the isotherm data. Although spectroscopy provided some evidence for multinuclear surface complex formation at surface loadings approaching site saturation at pH ≥ 6.5, the assumption of a bidentate-mononuclear surface complex provided acceptable fits of the sorption data over the entire range of conditions studied. Regressing edge data in the absence of isotherm and spectroscopic data resulted in a fair number of surface-species/site-type combinations that provided acceptable fits of the edge data, but unacceptable fits of the isotherm data. A linear relationship between log K for the (≡FeO)₂Zn complex and pH was found, given by log K((≡FeO)₂Zn, at 1 g/L) = 2.058(pH) − 6.131. In addition, a surface activity coefficient term was introduced to the model to reduce the ionic strength
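The reported linear regression can be evaluated directly; the sketch below simply tabulates the best-fit log K for the bidentate complex at a few pH values, using the coefficients quoted in the abstract (1 g/L ferrihydrite).

```python
def log_k_bidentate(pH):
    """Best-fit equilibrium constant for the (≡FeO)₂Zn surface complex at
    1 g/L ferrihydrite, from the abstract's regression:
    log K = 2.058*pH - 6.131."""
    return 2.058 * pH - 6.131

for pH in (5.0, 6.5, 8.0):
    print(f"pH {pH}: log K = {log_k_bidentate(pH):.2f}")
```

The positive slope quantifies the point made in the abstract: the apparent equilibrium "constant" is not constant at all but must grow steeply with pH to fit the isotherm data.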

  10. Photoelectrochemical etching of gallium nitride surface by complexation dissolution mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Miao-Rong [Suzhou Institute of Nano-Tech and Nano-Bionics, Chinese Academy of Sciences, 215123 Suzhou (China); University of Chinese Academy of Sciences, 100049 Beijing (China); Hou, Fei; Wang, Zu-Gang; Zhang, Shao-Hui [Suzhou Institute of Nano-Tech and Nano-Bionics, Chinese Academy of Sciences, 215123 Suzhou (China); Changchun University of Science and Technology, 130022 Changchun (China); Pan, Ge-Bo, E-mail: gbpan2008@sinano.ac.cn [Suzhou Institute of Nano-Tech and Nano-Bionics, Chinese Academy of Sciences, 215123 Suzhou (China)

    2017-07-15

Graphical abstract: GaN surface was etched by 0.3 M EDTA-2Na. The proposed complexation dissolution mechanism can be applicable to almost all neutral etchants under the prerequisite of strong light and electric field. - Highlights: • GaN surface was etched by EDTA-2Na. • GaN may be dissolved into EDTA-2Na by forming a Ga–EDTA complex. • We propose the complexation dissolution mechanism for the first time. - Abstract: Gallium nitride (GaN) surface was etched by 0.3 M ethylenediamine tetraacetic acid disodium (EDTA-2Na) via the photoelectrochemical etching technique. SEM images reveal that the etched GaN surface becomes rough and irregular. The pore density is up to 1.9 × 10⁹ per square centimeter after a simple acid post-treatment. The difference in the XPS spectra of Ga 3d, N 1s and O 1s between the non-etched and freshly etched GaN surfaces can be attributed to the formation of a Ga–EDTA complex at the etching interface between GaN and EDTA-2Na. The proposed complexation dissolution mechanism can be broadly applicable to almost all neutral etchants under the prerequisite of strong light and electric field. From the point of view of environment, safety and energy, EDTA-2Na has obvious advantages over conventionally corrosive etchants. Moreover, with further and deeper study of such nearly neutral etchants, GaN etching technology has a better application prospect in photoelectric micro-device fabrication.

  11. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators, who are not aware of system functionality details. It is therefore necessary to help the operator to control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. A test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about component hierarchy and dependencies, and allowing the operator to verify the fun...

  12. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  13. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, brings about low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
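The pixel-equivalent idea is that a known reference length imaged by the camera fixes the scale factor (mm per pixel), which then converts measured pixel distances on the rule into millimeters. The sketch below is a minimal illustration with invented numbers; the paper's actual calibration method is more elaborate.

```python
def pixel_equivalent(reference_length_mm, measured_pixels):
    """Pixel equivalent k (mm/pixel): ratio of a known reference length to
    the number of pixels it spans in the image."""
    return reference_length_mm / measured_pixels

def graduation_distance_mm(px_a, px_b, k):
    """Distance between two detected graduation-line centers, in mm."""
    return abs(px_b - px_a) * k

# Calibrate on a (hypothetical) 10 mm reference spanning 2000 pixels ...
k = pixel_equivalent(10.0, 2000)          # 0.005 mm/pixel
# ... then measure a nominal 1 mm graduation interval on the steel rule.
d = graduation_distance_mm(150.0, 350.2, k)
```

Comparing each measured interval d against its nominal value, over the full length of the rule, is what yields the deviations that the verification regulation bounds.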

  14. SiSn diodes: Theoretical analysis and experimental verification

    KAUST Repository

    Hussain, Aftab M.; Wehbe, Nimer; Hussain, Muhammad Mustafa

    2015-01-01

    We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn

  15. Verification and Validation of The Tritium Transport Code TMAP7

    International Nuclear Information System (INIS)

    Glen R. Longhurst; James Ambrosek

    2004-01-01

Previously existing features for heat transfer, flows between enclosures, and chemical reactions within the enclosures have been retained, but the allowed problem size and complexity have been increased to take advantage of the greater memory and speed available on modern computers. One additional feature unique to TMAP7 is radioactive decay for both trapped and mobile species. Recently, TMAP7 has undergone verification and validation processes to ensure its performance in a wide variety of problems. This paper describes the use and new capabilities of TMAP7 and presents the results of verification and validation testing.

  16. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity......The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  17. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for building information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  18. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, an IMA system makes it difficult to isolate failures. Therefore, IMA system verification faces a critical problem: how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a complex system it is hard to completely test a huge, integrated avionics system. This paper therefore proposes applying compositional verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently lowering the cost of IMA system integration.

  19. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

Full Text Available With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application that overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle the heart-rate changes caused by sport. To our knowledge, this is the first work to address the issue of sport in ECG verification.
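A minimal sketch of the mean-interval idea, reduced here to comparing mean R-R intervals against an enrolled value with a fixed tolerance; the tolerance and peak timestamps are invented for illustration, and the real algorithm operates on richer ECG features.

```python
def mean_rr_interval(r_peaks):
    """Mean R-R interval (seconds) from a list of R-peak timestamps."""
    intervals = [b - a for a, b in zip(r_peaks, r_peaks[1:])]
    return sum(intervals) / len(intervals)

def verify(enrolled_mean, probe_peaks, tolerance=0.08):
    """Accept the probe if its mean R-R interval is within `tolerance`
    seconds of the enrolled value.  This is a single-state check; a
    multiple-state scheme would keep separate enrolled templates for
    rest and exercise heart rates."""
    return abs(mean_rr_interval(probe_peaks) - enrolled_mean) <= tolerance

enrolled = mean_rr_interval([0.00, 0.82, 1.63, 2.45, 3.27])
assert verify(enrolled, [0.00, 0.80, 1.61, 2.44])
assert not verify(enrolled, [0.00, 0.55, 1.12, 1.68])   # elevated heart rate
```

The second assertion illustrates exactly the sport problem the abstract raises: a genuine user with an exercise-elevated heart rate fails a single-state check, which is why multiple enrolled states are needed.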

  20. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

In this article the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the labor intensity of this process by developing a hardware and software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the specified algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without one). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering unit secondary equipment. Automatic verification with the hardware and software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
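The error-calculation step such software performs can be sketched as comparing each channel reading against the calibrator setpoint and checking the relative error against a permissible limit. The setpoints and the 0.1% limit below are illustrative assumptions, not values from the article.

```python
def channel_errors(setpoints, readings):
    """Per-channel error against the calibrator setpoint: returns
    (setpoint, reading, absolute error, error as percent of setpoint)."""
    return [(s, r, r - s, 100.0 * (r - s) / s)
            for s, r in zip(setpoints, readings)]

def passes(setpoints, readings, max_percent=0.1):
    """Verification verdict: every channel within the permissible error."""
    return all(abs(pct) <= max_percent
               for _, _, _, pct in channel_errors(setpoints, readings))

# Calibrator applies 4, 8, 12 mA; the controller reports its measured values.
ok = passes([4.0, 8.0, 12.0], [4.002, 7.996, 12.004])
```

Automating this loop over every measuring channel, and logging the tuples from channel_errors into a protocol, is what removes the manual reading step and its human-factor errors.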

  1. Surface reaction of SnII on goethite (α-FeOOH): surface complexation, redox reaction, reductive dissolution, and phase transformation.

    Science.gov (United States)

    Dulnee, Siriwan; Scheinost, Andreas C

    2014-08-19

To elucidate the potential risk of (126)Sn migration from nuclear waste repositories, we investigated the surface reactions of Sn(II) on goethite as a function of pH and Sn(II) loading under anoxic conditions (low O2 level). The Sn redox state and surface structure were investigated by Sn K-edge X-ray absorption spectroscopy (XAS), and goethite phase transformations were investigated by high-resolution transmission electron microscopy and selected-area electron diffraction. The results demonstrate the rapid and complete oxidation of Sn(II) by goethite and the formation of Sn(IV) (1)E and (2)C surface complexes. The contribution of (2)C complexes increases with Sn loading. The Sn(II) oxidation leads to a quantitative release of Fe(II) from goethite at low pH, and to the precipitation of magnetite at higher pH. To predict Sn sorption, we applied surface complexation modeling using the charge distribution multisite complexation approach and the XAS-derived surface complexes. Log K values of 15.5 ± 1.4 for the (1)E complex and 19.2 ± 0.6 for the (2)C complex consistently predict Sn sorption across pH 2-12 and for two different Sn loadings, and confirm the strong retention of Sn(II) even under anoxic conditions.

  2. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and verification methodologies for products generated throughout the software life cycle are then discussed

  3. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regard to how effectively and by whom these four stages are carried out

  4. Study of applicable methods on safety verification of disposal facilities and waste packages

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    Three subjects concerning safety verification of low level radioactive waste disposal were investigated in FY 2012. For radioactive waste disposal facilities, specs and construction techniques of soil covering intended to prevent possible destruction caused by natural events (e.g. earthquakes) were studied to consider verification methods for those specs. For waste packages subject to near surface pit disposal, settings of the scaling factor and average radioactivity concentration (hereafter referred to as "SF") for container-filled and solidified waste packages generated from Kashiwazaki-Kariwa Nuclear Power Station Units 1-5, and the setting of the cesium residual ratio of molten solidified waste generated from the Tokai and Tokai No.2 Power Stations, were studied. Those results were finalized in consideration of the opinions of the advisory panel and published as JNES-EV reports. In FY 2012, five JNES reports were published, and these have been used as standards for safety verification of waste packages. Verification methods for radioactive wastes subject to near-surface trench disposal and intermediate depth disposal were also studied. For radioactive wastes to be returned from overseas, determination methods for the radioactive concentration, heat rate and hydrogen generation rate of CSD-C were established. Determination methods for the radioactive concentration and heat rate of CSD-B were also established. These results will be incorporated into verification manuals. (author)
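    The scaling factor method mentioned above infers the concentration of a difficult-to-measure nuclide from a measurable key nuclide. A minimal sketch, with a purely hypothetical SF value and nuclide pairing:

```python
def dtm_concentration(key_nuclide_bq_per_g, scaling_factor):
    """Concentration of a difficult-to-measure (DTM) nuclide inferred
    from a measurable key nuclide: C_DTM = SF x C_key."""
    return scaling_factor * key_nuclide_bq_per_g

# Hypothetical SF of 0.02 relating a DTM nuclide to a key nuclide such as Cs-137:
c_dtm = dtm_concentration(5.0e4, 0.02)  # -> 1000.0 Bq/g
```

    In practice the SF values themselves are the regulated quantities, derived statistically from sampling campaigns on the waste stream in question.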

  5. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness verification of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process of workpiece measurement is simulated with the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking an involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, including the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new and ideal platform for testing complex workpiece-measuring software without calibrated artifacts.
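    The paper's universal collision detection model against the triangular patch workpiece is not detailed in the abstract; a standard primitive for such a test is ray/triangle intersection, sketched here with the Moeller-Trumbore algorithm (an assumed illustrative building block, not the authors' actual model):

```python
def ray_triangle_intersect(orig, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle intersection.
    Returns the hit distance t along `direction`, or None for a miss."""
    def sub(a, b):  return tuple(a[i] - b[i] for i in range(3))
    def dot(a, b):  return sum(a[i] * b[i] for i in range(3))
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:          # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    return t if t > eps else None

# Virtual probe ray pointing down at a triangle lying in the z = 0 plane:
t = ray_triangle_intersect((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                           (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

    A virtual probing simulation would run such a test against every candidate patch of the workpiece model to locate the contact point.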

  6. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants, updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also being introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems, such as accountancy scales and process load cells, is also being investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)
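    Gamma-based enrichment monitors such as the CHEM rely on the classic enrichment meter principle: for an effectively infinitely thick uranium deposit, the 186 keV gamma count rate from U-235 is proportional to enrichment. A minimal sketch with a hypothetical calibration constant (the real instrument's calibration and corrections are not given in the source):

```python
def enrichment_from_186kev(count_rate_cps, calibration_cps_per_pct):
    """Enrichment meter principle: for an effectively infinitely thick
    uranium deposit, the 186 keV gamma rate from U-235 is proportional
    to enrichment, E(%) = R / k, with k a detector calibration constant."""
    return count_rate_cps / calibration_cps_per_pct

# Hypothetical calibration k = 12.0 cps per % U-235:
e = enrichment_from_186kev(54.0, 12.0)  # -> 4.5 % U-235
```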

  7. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants, updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also being introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems, such as accountancy scales and process load cells, is also being investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  8. Symplectic geometry on moduli spaces of holomorphic bundles over complex surfaces

    OpenAIRE

    Khesin, Boris; Rosly, Alexei

    2000-01-01

    We give a comparative description of the Poisson structures on the moduli spaces of flat connections on real surfaces and holomorphic Poisson structures on the moduli spaces of holomorphic bundles on complex surfaces. The symplectic leaves of the latter are classified by restrictions of the bundles to certain divisors. This can be regarded as fixing a "complex analogue of the holonomy" of a connection along a "complex analogue of the boundary" in analogy with the real case.

  9. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
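    The convergence analysis underlying code verification typically computes an observed order of accuracy from discretization errors on successively refined grids and compares it with the scheme's formal order. A minimal sketch of the standard Richardson-style estimate (illustrative numbers, not the ASC project's specific tooling):

```python
import math

def observed_order(error_coarse, error_fine, refinement_ratio):
    """Observed order of accuracy from errors on two grids related by a
    refinement ratio r: p = ln(e_coarse / e_fine) / ln(r)."""
    return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

# A nominally 2nd-order scheme whose error drops 4x when the mesh is halved:
p = observed_order(4.0e-3, 1.0e-3, 2.0)  # -> 2.0
```

    Agreement between the observed and formal order across a grid sequence is the usual acceptance criterion in code verification studies.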

  10. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

; qualitative and quantitative measurements of nuclear material; familiarity and access to sensitive technologies related to detection, unattended verification systems, containment/surveillance and sensors; examination and verification of design information of large and complex facilities; theoretical and practical aspects of technologies relevant to verification objectives; analysis of inspection findings and evaluation of their mutual consistency; negotiations on technical issues with facility operators and State authorities. This experience is reflected in the IAEA Safeguards Manual which sets out the policies and procedures to be followed in the inspection process as well as in the Safeguards Criteria which provide guidance for verification, evaluation and analysis of the inspection findings. The IAEA infrastructure and its experience with verification permitted the organization to respond immediately and successfully in 1991 to the tasks required by Security Council Resolution 687 (1991) for Iraq, as well as to the tasks related to the verification of the completeness and correctness of the initial declarations in the cases of the DPRK and of S. Africa. In the case of Iraq the discovery of its undeclared programs was made possible through the existing verification system enhanced by additional access rights, information and the application of modern detection technology. Such discoveries made it evident that there was a need for an intensive development effort to strengthen the safeguards system and to develop a capability to detect undeclared activities. For this purpose it was recognized that there was a need for additional and extended (a) access to information and (b) access to locations. It was also obvious that access to the Security Council, to bring the IAEA closer to the body responsible for the maintenance of international peace and security, would be a requirement for reporting periodically on non-proliferation and the results of the IAEA's verification activities.
While the case

  11. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
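    Unreachable-code detection of the kind these analysers perform can be modelled as a reachability query on the program's control flow graph; a minimal sketch (hypothetical block names, not the tool's actual implementation):

```python
from collections import deque

def unreachable_blocks(cfg, entry):
    """cfg maps each basic block to its successor blocks. Returns the
    blocks never reached from `entry`, i.e. unreachable-code candidates."""
    seen, queue = {entry}, deque([entry])
    while queue:
        block = queue.popleft()
        for successor in cfg.get(block, []):
            if successor not in seen:
                seen.add(successor)
                queue.append(successor)
    return sorted(set(cfg) - seen)

# Hypothetical module: "orphan" is never the target of any jump or call
cfg = {"start": ["loop"], "loop": ["loop", "exit"], "exit": [], "orphan": ["exit"]}
dead = unreachable_blocks(cfg, "start")  # -> ["orphan"]
```

    For assembler code the graph edges would come from decoded jump, branch and call targets; indirect jumps make the analysis conservative.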

  12. Muscle fatigue and contraction intensity modulates the complexity of surface electromyography.

    Science.gov (United States)

    Cashaback, Joshua G A; Cluff, Tyler; Potvin, Jim R

    2013-02-01

    Nonlinear dynamical techniques offer a powerful approach for the investigation of physiological time series. Multiscale entropy analyses have shown that pathological and aging systems are less complex than healthy systems and this finding has been attributed to degraded physiological control processes. A similar phenomenon may arise during fatiguing muscle contractions where surface electromyography signals undergo temporal and spectral changes that arise from the impaired regulation of muscle force production. Here we examine the effect of fatigue and contraction intensity on the short- and long-term complexity of biceps brachii surface electromyography. To investigate, we used an isometric muscle fatigue protocol (parsed into three windows) and three contraction intensities (% of maximal elbow joint moment: 40%, 70% and 100%). We found that fatigue reduced the short-term complexity of biceps brachii activity during the last third of the fatiguing contraction. We also found that the complexity of surface electromyography is dependent on contraction intensity. Our results show that multiscale entropy is sensitive to muscle fatigue and contraction intensity and we argue it is imperative that both factors be considered when evaluating the complexity of surface electromyography signals. Our data contribute to a converging body of evidence showing that multiscale entropy can quantify subtle information content in physiological time series. Copyright © 2012 Elsevier Ltd. All rights reserved.
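    Multiscale entropy combines coarse-graining of the signal with a sample entropy estimate at each scale. A naive O(n^2) sketch of the underlying computation (illustrative parameters; the authors' pipeline normalises the tolerance r to the signal's standard deviation):

```python
import math

def coarse_grain(x, scale):
    """Non-overlapping averaging used by multiscale entropy."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r=0.2):
    """Naive sample entropy: -ln(A/B), where B counts pairs of length-m
    templates within Chebyshev tolerance r and A does the same for
    length m+1. Lower values indicate a more regular signal."""
    def matches(length):
        n, count = len(x) - length + 1, 0
        for i in range(n - 1):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    count += 1
        return count
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly regular signal scores low at every scale:
flat = [1.0] * 10
se = sample_entropy(flat)   # small, slightly above 0 due to edge effects
mse = [sample_entropy(coarse_grain(flat, s)) for s in (1, 2)]
```

    The multiscale entropy curve is simply this sample entropy evaluated over a range of coarse-graining scales.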

  13. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of FMCT verification provisions. This paper explores the general concerns about FMCT verification and demonstrates what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper focuses on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Moreover, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. 
Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  14. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  15. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed

  16. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    Science.gov (United States)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry-standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  17. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Full Text Available Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. The asynchronous communications and the large, dynamically concurrent processes involved in multimedia conferencing pose significant challenges to achieving sufficient correctness guarantees, and supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult and challenging problem. In this paper, we first present Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on a service net based correctness verification approach for multimedia conferencing service orchestration, which can automatically translate a BPEL based service orchestration into a corresponding Petri net model in the Petri Net Markup Language (PNML). We also present BPEL service net reduction rules and correctness verification algorithms for multimedia conferencing service orchestration. We perform correctness analysis and verification using service net properties such as safeness, reachability and deadlock freedom, and provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give a comparison and evaluation.
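    The safeness, reachability and deadlock checks described above can be illustrated by exhaustive state-space exploration of a small Petri net. A minimal sketch for bounded nets, using a toy two-place net rather than the paper's BPEL-derived models:

```python
from collections import deque

def explore(transitions, initial):
    """Exhaustive state-space exploration of a bounded Petri net.
    `transitions` is a list of (consume, produce) dicts keyed by place
    name; `initial` maps every place to its initial token count.
    Returns (reachable markings, deadlocked markings)."""
    def fire(marking, consume, produce):
        if any(marking.get(p, 0) < n for p, n in consume.items()):
            return None  # transition not enabled in this marking
        m = dict(marking)
        for p, n in consume.items():
            m[p] -= n
        for p, n in produce.items():
            m[p] = m.get(p, 0) + n
        return m
    start = tuple(sorted(initial.items()))
    seen, queue, deadlocks = {start}, deque([start]), []
    while queue:
        marking = dict(queue.popleft())
        successors = [s for c, p in transitions
                      if (s := fire(marking, c, p)) is not None]
        if not successors:
            deadlocks.append(marking)  # no transition enabled: deadlock
        for s in successors:
            key = tuple(sorted(s.items()))
            if key not in seen:
                seen.add(key)
                queue.append(key)
    return seen, deadlocks

# Toy net: one transition moves the token from "idle" to "done";
# the marking with the token in "done" enables nothing, so it is a deadlock.
reached, dead = explore([({"idle": 1}, {"done": 1})], {"idle": 1, "done": 0})
```

    Reduction rules such as those in the paper shrink the net before this kind of exploration, which otherwise grows exponentially with concurrency.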

  18. Verification of micro-scale photogrammetry for smooth three-dimensional object measurement

    Science.gov (United States)

    Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard

    2017-05-01

    By using sub-millimetre laser speckle pattern projection we show that photogrammetry systems are able to measure smooth three-dimensional objects with surface height deviations of less than 1 μm. The projection of laser speckle patterns allows correspondences on the surface of smooth spheres to be found, and as a result, verification artefacts with low surface height deviations were measured. A combination of VDI/VDE and ISO standards was also utilised to provide a complete verification method and determine the quality parameters for the system under test. Using the proposed method applied to a photogrammetry system, a 5 mm radius sphere was measured with an expanded uncertainty of 8.5 μm for sizing errors and 16.6 μm for form errors at a 95% confidence interval. Sphere spacing lengths between 6 mm and 10 mm were also measured by the photogrammetry system and were found to have expanded uncertainties of around 20 μm at a 95% confidence interval.
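    Sphere-based verification artefacts of this kind are usually evaluated by least-squares sphere fitting of the measured point cloud, from which sizing and form errors follow. A minimal sketch of the standard algebraic fit (an assumed illustrative method; the VDI/VDE and ISO procedures add their own sampling and evaluation rules):

```python
import math

def fit_sphere(points):
    """Algebraic least-squares sphere fit. Solves, in the least-squares
    sense, x^2 + y^2 + z^2 = 2a x + 2b y + 2c z + d for the center
    (a, b, c) and radius r = sqrt(d + a^2 + b^2 + c^2)."""
    ata = [[0.0] * 4 for _ in range(4)]   # normal equations A^T A m = A^T y
    aty = [0.0] * 4
    for x, y, z in points:
        row = (2 * x, 2 * y, 2 * z, 1.0)
        rhs = x * x + y * y + z * z
        for i in range(4):
            aty[i] += row[i] * rhs
            for j in range(4):
                ata[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting on the 4x4 system
    M = [ata[i] + [aty[i]] for i in range(4)]
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 4):
            f = M[r][col] / M[col][col]
            for c in range(col, 5):
                M[r][c] -= f * M[col][c]
    m = [0.0] * 4
    for r in range(3, -1, -1):
        m[r] = (M[r][4] - sum(M[r][c] * m[c] for c in range(r + 1, 4))) / M[r][r]
    a, b, c, d = m
    return (a, b, c), math.sqrt(d + a * a + b * b + c * c)

# Six exact points on a sphere of radius 5 centred at (1, 2, 3):
pts = [(6, 2, 3), (-4, 2, 3), (1, 7, 3), (1, -3, 3), (1, 2, 8), (1, 2, -2)]
center, radius = fit_sphere(pts)
```

    The residuals of such a fit give the form error, while the fitted radius against the calibrated value gives the sizing error.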

  19. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating point computations integrated within the core, designed to reduce cost and complexity. The designed 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single precision floating point multiplier, a floating point adder/subtractor for floating point operations, and a 32 x 32 Booth's multiplier added to the integer core of the ARM7. The binary representati...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Science.gov (United States)

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  1. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Full Text Available Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed on the basis of tests of a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of a thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error in determining local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of a thermal-hydraulic model of a high-pressure turbine blade with cooling allowing asymmetrical heat removal from the pressure and suction sides was carried out using the developed method. An analysis of the heat transfer coefficients confirmed the high level of heat transfer at the leading edge, comparable with jet impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge towards the pressure side.
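    With the blade surface held at the zinc melting point, the local heat transfer coefficient follows directly from the measured local heat flux as h = q / (T_gas - T_wall). A minimal sketch with hypothetical values (the paper's actual gas temperatures and flux levels are not given in the abstract):

```python
ZN_MELT_C = 419.5  # zinc melting point: the blade surface temperature during the test

def heat_transfer_coefficient(q_local_w_m2, t_gas_c, t_wall_c=ZN_MELT_C):
    """Local heat transfer coefficient from the measured local heat
    flux: h = q / (T_gas - T_wall), in W/(m^2 K)."""
    return q_local_w_m2 / (t_gas_c - t_wall_c)

# Hypothetical values: q = 5.0e5 W/m^2 at a gas temperature of 1419.5 C
h = heat_transfer_coefficient(5.0e5, 1419.5)  # -> 500.0 W/(m^2 K)
```

    Holding the wall temperature fixed is what makes the heat load constant during the experiment and lets a single test map h over the whole surface.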

  2. COMPLEX SURFACE HARDENING OF STEEL ARTICLES

    Directory of Open Access Journals (Sweden)

    A. V. Kovalchuk

    2014-01-01

    Full Text Available A method of complex surface hardening of steel parts was designed. The method combines two hardening processes: chemical heat treatment and physical vapor deposition (PVD) of a coating. The hardening effect achieved in this study is cumulative and much higher than in other work on this topic. The designed method can be used in mechanical engineering, medicine and power engineering, and is promising for military and space technologies.

  3. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  4. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector), was presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  5. Surface complexation models for uranium adsorption in the sub-surface environment

    International Nuclear Information System (INIS)

    Payne, T.E.

    2007-01-01

    Adsorption experiments with soil component minerals under a range of conditions are being used to develop models of uranium(VI) uptake in the sub-surface environment. The results show that adsorption of U on iron oxides and clay minerals is influenced by chemical factors including the pH, partial pressure of CO2, and the presence of ligands such as phosphate. Surface complexation models (SCMs) can be used to simulate U adsorption on these minerals. The SCMs are based on plausible mechanistic assumptions and describe the experimental data more adequately than Kd values or sorption isotherms. It is conceptually possible to simulate U sorption data on complex natural samples by combining SCMs for individual component minerals. This approach was used to develop a SCM for U adsorption to mineral assemblages from Koongarra (Australia), and produced a reasonable description of U uptake. In order to assess the applicability of experimental data to the field situation, in-situ measurements of U distributions between solid and liquid phases were undertaken at the Koongarra U deposit. This field partitioning data showed a satisfactory agreement with laboratory sorption data obtained under comparable conditions. (author)
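    The difference between a constant-Kd description and a surface complexation model can be illustrated with a single-site mass-action sketch, in which competing aqueous ligands (e.g. carbonate) suppress the sorbed fraction. The constants here are purely illustrative, not the paper's fitted SCM parameters:

```python
def sorbed_fraction(k_eq, site_conc, ligand_competition=1.0):
    """Single-site mass-action sorption sketch: the fraction of uranium
    bound for an equilibrium constant k_eq and free surface-site
    concentration (mol/L). `ligand_competition` crudely scales for
    aqueous complexation (e.g. by carbonate), which suppresses sorption."""
    ratio = k_eq * site_conc / ligand_competition
    return ratio / (1.0 + ratio)

# Strong sorption without competition, suppressed when carbonate competes:
f_no_ligand = sorbed_fraction(1.0e6, 1.0e-4)          # ~0.99
f_carbonate = sorbed_fraction(1.0e6, 1.0e-4, 100.0)   # ~0.5
```

    A constant Kd cannot reproduce this kind of condition-dependent partitioning, which is why the SCM approach describes the data more adequately.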

  6. Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems

    Science.gov (United States)

    Nguyen, Nhan T.

    2018-01-01

    Autonomy of aerospace systems requires the ability of flight control systems to adapt to complex, uncertain, dynamic environments. In spite of five decades of research in adaptive control, the fact remains that no adaptive control system has ever been deployed on any safety-critical or human-rated production system, such as passenger transport aircraft. The problem lies in the difficulty of certifying adaptive control systems, since existing certification methods cannot readily be used for nonlinear adaptive control systems. Research to address the notion of metrics for adaptive control began to appear in recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and related V&V challenges.

  7. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  8. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region

  9. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

    Full Text Available The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability of the acoustic pressure produced by the rocket; and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on the bridging between a new semiempirical jet noise model and a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.
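
The SEA half of such a hybrid scheme reduces to a small linear power-balance system. A minimal two-subsystem sketch, with invented loss factors and input power (not Vega-C data):

```python
import numpy as np

# Classical two-subsystem statistical energy analysis (SEA) power balance.
# Steady state:  P_i = omega * (eta_i*E_i + eta_ij*E_i - eta_ji*E_j)
omega = 2 * np.pi * 500.0           # band centre frequency [rad/s]
eta1, eta2 = 0.01, 0.02             # internal (damping) loss factors
eta12, eta21 = 0.005, 0.003         # coupling loss factors
A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12,        eta2 + eta21]])
P = np.array([1.0, 0.0])            # 1 W injected into subsystem 1 only
E = np.linalg.solve(A, P)           # subsystem vibrational energies [J]
dissipated = omega * (eta1 * E[0] + eta2 * E[1])   # must equal input power
```

The solve costs next to nothing even for many subsystems, which is why SEA scales where deterministic FEM alone does not.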

  10. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

    The results of the verification analysis of the WATERFOW software, developed for the purposes of reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were used to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been demonstrated

  11. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. 
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  12. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  13. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
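
The likelihood-ratio test described in both records can be sketched with hypothetical isotropic Gaussian user and background models (illustrative parameters, not the paper's models):

```python
import numpy as np

# Likelihood-ratio verification with fixed-length feature vectors:
# accept when log p(x|user) - log p(x|background) exceeds a threshold
# (0 here, i.e. equal priors).  All model parameters are invented.
rng = np.random.default_rng(0)
d = 8
mu_user = rng.normal(size=d)        # enrolled user template (assumed)
sigma_user, sigma_bg = 0.5, 1.5     # within-user vs. background spread

def log_lr(x):
    """Log likelihood ratio under the two isotropic Gaussian models."""
    ll_user = (-0.5 * np.sum((x - mu_user) ** 2) / sigma_user**2
               - d * np.log(sigma_user))
    ll_bg = -0.5 * np.sum(x ** 2) / sigma_bg**2 - d * np.log(sigma_bg)
    return ll_user - ll_bg

genuine = mu_user + sigma_user * rng.normal(size=(200, d))
impostor = sigma_bg * rng.normal(size=(200, d))
frr = np.mean([log_lr(x) <= 0 for x in genuine])    # false reject rate
far = np.mean([log_lr(x) > 0 for x in impostor])    # false accept rate
```

Sweeping the threshold instead of fixing it at 0 traces out the full FAR/FRR trade-off curve, which is how such verifiers are usually characterised.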

  14. Applicability of surface complexation modelling in TVO's studies on sorption of radionuclides

    International Nuclear Information System (INIS)

    Carlsson, T.

    1994-03-01

    The report focuses on the possibility of applying surface complexation theories to the conditions at a potential repository site in Finland and of doing proper experimental work in order to determine the necessary constants for the models. The report provides background information on: (1) what types of experiments should be carried out in order to produce data for surface complexation modelling of sorption phenomena under potential Finnish repository conditions, and (2) how to properly design and perform such experiments in order to gather data, develop models, or both. The report does not describe in detail how proper surface complexation experiments or modelling should be carried out. The work contains several examples of information that may be valuable in both modelling and experimental work. (51 refs., 6 figs., 4 tabs.)

  15. Compositional Verification of Interlocking Systems for Large Stations

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    …for networks of large size, due to the exponential computation time and resources needed. Some recent attempts to address this challenge adopt a compositional approach, targeted to track layouts that are easily decomposable into sub-networks such that a route is almost fully contained in a sub-network: in this way, granting access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method for sub-networks that are independent to some degree, we study how the division of a complex network into sub-networks, using stub elements to abstract all the routes that are common between sub-networks, may still guarantee compositionality of verification of safety properties.

  16. Hypervelocity Wind Tunnel No. 9 Mach 7 Thermal Structural Facility Verification and Calibration

    National Research Council Canada - National Science Library

    Lafferty, John

    1996-01-01

    This report summarizes the verification and calibration of the new Mach 7 Thermal Structural Facility located at the White Oak, Maryland, site of the Dahlgren Division, Naval Surface Warfare Center...

  17. Verification and optimization of HDR surface mould brachytherapy plans using GAFCHROMIC EBT2 film: the ideal geometric case

    International Nuclear Information System (INIS)

    Sobolewski, Matthew; Haque, Mamoon

    2011-01-01

    Full text: Surface mould brachytherapy is used to treat superficial cancers due to conformal dose distributions and rapid dose fall-off with depth. In this work, we determine the effect of varying catheter number and prescription distance on dose distributions for surface mould plans using radiochromic film. Eight surface mould plans were generated using PLATO BPS (Version 14.3.2). Measurements were taken with Gafchromic EBT2 film over depths of 5-30 mm with an Ir-192 HDR source. Films were scanned using an Epson Expression 10000 XL flatbed scanner and analysed using RIT 113 software. The EBT2 films showed good agreement with an average difference of 2.8% compared to the planning system. The dose gradient in the interval ranging ±5 mm from the prescription point showed an 80% increase from the plan with the maximum number of catheters (11) to the minimum (3). The size and extent of local dose maxima increased when fewer catheters were used. Increasing prescription distance decreased the dose gradient, with a 20% reduction in dose occurring 4 mm superficially to the prescription point when the prescription distance increased from 5 to 20 mm. Gafchromic EBT2 was used successfully to evaluate surface mould brachytherapy plans and is a useful tool for dose verification checks. High dose regions near the catheter plane can be reduced by using a larger number of catheters, and the prescription distance should be adjusted as a function of treatment depth varied by mould thickness.
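
A point-by-point film-versus-plan comparison of the kind reported (the 2.8% average difference) can be sketched as follows; all dose values are synthetic, not the paper's measurements:

```python
import numpy as np

# Plan-vs-film dose comparison at discrete depths (synthetic numbers).
depth_mm = np.array([5, 10, 15, 20, 25, 30])
planned = np.array([100.0, 78.0, 60.0, 47.0, 37.0, 29.0])  # % of prescription
# pretend the film read these planned doses with small point errors (%):
errors_pct = np.array([1.2, -0.8, 2.0, 1.5, -1.0, 0.5])
measured = planned * (1 + 0.01 * errors_pct)

pct_diff = 100.0 * (measured - planned) / planned
mean_abs_diff = np.mean(np.abs(pct_diff))       # cf. the average reported
pass_3pct = np.mean(np.abs(pct_diff) <= 3.0)    # fraction of points within 3%
```

Clinical workflows typically extend this to 2D gamma analysis, but the per-point percent difference above is the core of the check.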

  18. A method of reconstructing complex stratigraphic surfaces with multitype fault constraints

    Science.gov (United States)

    Deng, Shi-Wu; Jia, Yu; Yao, Xing-Miao; Liu, Zhi-Ning

    2017-06-01

    The construction of complex stratigraphic surfaces is widely employed in many fields, such as petroleum exploration, geological modeling, and geological structure analysis. It also serves as an important foundation for data visualization and visual analysis in these fields. The existing surface construction methods have several deficiencies and face various difficulties, such as the presence of multitype faults and roughness of resulting surfaces. In this paper, a surface modeling method that uses geometric partial differential equations (PDEs) is introduced for the construction of stratigraphic surfaces. It effectively solves the problem of surface roughness caused by the irregularity of stratigraphic data distribution. To cope with the presence of multitype complex faults, a two-way projection algorithm between three-dimensional space and a two-dimensional plane is proposed. Using this algorithm, a unified method based on geometric PDEs is developed for dealing with multitype faults. Moreover, the corresponding geometric PDE is derived, and an algorithm based on an evolutionary solution is developed. Tests of the proposed algorithm on real data verify its computational efficiency and its ability to handle irregular data distributions. In particular, it can reconstruct faulty surfaces, especially those with overthrust faults.
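
The smoothing role played by the geometric PDE can be illustrated with a toy discrete Laplace relaxation that pins scattered elevation data; the fault handling and 3D-2D projection of the paper are omitted, and the data points are invented:

```python
import numpy as np

# Relax a discrete Laplace (heat) equation over a height grid while pinning
# cells that hold scattered elevation data (hypothetical "well picks"), so
# the surface interpolates smoothly between irregular samples.
n = 21
z = np.zeros((n, n))
known = np.zeros((n, n), dtype=bool)
for i, j, h in [(2, 3, 1.0), (10, 10, 2.0), (18, 5, 0.5), (5, 17, 1.5)]:
    z[i, j] = h
    known[i, j] = True

for _ in range(2000):               # Jacobi iteration (periodic boundaries)
    avg = 0.25 * (np.roll(z, 1, 0) + np.roll(z, -1, 0)
                  + np.roll(z, 1, 1) + np.roll(z, -1, 1))
    z = np.where(known, z, avg)     # data points stay fixed; rest relaxes
```

The discrete maximum principle keeps the relaxed surface bounded by the data extremes, which is the kind of smoothness guarantee the PDE formulation buys.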

  19. Uncertainty analysis of point by point sampling complex surfaces using touch probe CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele; Tosello, Guido; De Chiffre, Leonardo

    2007-01-01

    The paper describes a study concerning point by point scanning of complex surfaces using tactile CMMs. A four factors-two level full factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, combined in a singl...

  20. Total skin electron therapy treatment verification: Monte Carlo simulation and beam characteristics of large non-standard electron fields

    International Nuclear Information System (INIS)

    Pavon, Ester Carrasco; Sanchez-Doblado, Francisco; Leal, Antonio; Capote, Roberto; Lagares, Juan Ignacio; Perucha, Maria; Arrans, Rafael

    2003-01-01

    Total skin electron therapy (TSET) is a complex technique which requires non-standard measurements and dosimetric procedures. This paper investigates an essential first step towards TSET Monte Carlo (MC) verification. The non-standard 6 MeV 40 × 40 cm² electron beam at a source to surface distance (SSD) of 100 cm, as well as its horizontal projection behind a polymethylmethacrylate (PMMA) screen to SSD = 380 cm, were evaluated. The EGS4 OMEGA-BEAM code package, running on a home-made Linux cluster of 47 PCs, was used for the MC simulations. Percentage depth-dose curves and profiles were calculated and measured experimentally for the 40 × 40 cm² field at both SSD = 100 cm and patient surface SSD = 380 cm. The output factor (OF) between the reference 40 × 40 cm² open field and its horizontal projection as TSET beam at SSD = 380 cm was also measured for comparison with MC results. The accuracy of the simulated beam was validated by the good agreement, to within 2%, between measured relative dose distributions, including the beam characteristic parameters (R50, R80, R100, Rp, E0), and the MC calculated results. The energy spectrum, fluence and angular distribution at different stages of the beam (at SSD = 100 cm, at SSD = 364.2 cm, behind the PMMA beam spoiler screen, and at treatment surface SSD = 380 cm) were derived from MC simulations. Results showed a final decrease in mean energy of almost 56% from the exit window to the treatment surface. A broader angular distribution (the FWHM of the angular distribution increased from 13° at SSD = 100 cm to more than 30° at the treatment surface) was fully attributable to the PMMA beam spoiler screen. OF calculations and measurements agreed to less than 1%. The effect of changing the electron energy cut-off from 0.7 MeV to 0.521 MeV and of air density fluctuations in the bunker, which could affect the MC results, was shown to have a negligible impact on the beam fluence distributions. 
Results proved the applicability of using MC
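
The beam characteristic depths named above (R100, R80, R50) and the common mean-energy estimate E0 ≈ 2.33·R50 can be extracted from a depth-dose curve by inverse interpolation; the PDD below is a crude synthetic stand-in, not the paper's data:

```python
import numpy as np

# Extract electron-beam characteristic depths from a percentage depth-dose
# curve.  The piecewise-linear PDD here is a synthetic 6 MeV-like shape.
depth = np.linspace(0.0, 4.0, 401)                     # depth in water [cm]
pdd = np.interp(depth, [0.0, 1.3, 3.0, 4.0], [80.0, 100.0, 0.0, 0.0])

i_max = int(np.argmax(pdd))
R100 = float(depth[i_max])          # depth of dose maximum

def r_level(level):
    """Depth beyond the maximum where the PDD falls to `level` %."""
    tail_d, tail_p = depth[i_max:], pdd[i_max:]
    # np.interp needs increasing x, so invert the descending tail:
    return float(np.interp(level, tail_p[::-1], tail_d[::-1]))

R80, R50 = r_level(80.0), r_level(50.0)
E0 = 2.33 * R50     # rule-of-thumb mean energy at the surface [MeV]
```

The same extraction applied to both measured and MC-calculated curves gives the parameter-by-parameter comparison used in the validation.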

  1. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  2. Modeling uranium(VI) adsorption onto montmorillonite under varying carbonate concentrations: A surface complexation model accounting for the spillover effect on surface potential

    Science.gov (United States)

    Tournassat, C.; Tinnacher, R. M.; Grangeon, S.; Davis, J. A.

    2018-01-01

    The prediction of U(VI) adsorption onto montmorillonite clay is confounded by the complexities of: (1) the montmorillonite structure in terms of adsorption sites on basal and edge surfaces, and the complex interactions between the electrical double layers at these surfaces, and (2) U(VI) solution speciation, which can include cationic, anionic and neutral species. Previous U(VI)-montmorillonite adsorption and modeling studies have typically expanded classical surface complexation modeling approaches, initially developed for simple oxides, to include both cation exchange and surface complexation reactions. However, previous models have not taken into account the unique characteristics of electrostatic surface potentials that occur at montmorillonite edge sites, where the electrostatic surface potential of basal plane cation exchange sites influences the surface potential of neighboring edge sites ('spillover' effect). A series of U(VI) - Na-montmorillonite batch adsorption experiments was conducted as a function of pH, with variable U(VI), Ca, and dissolved carbonate concentrations. Based on the experimental data, a new type of surface complexation model (SCM) was developed for montmorillonite, that specifically accounts for the spillover effect using the edge surface speciation model by Tournassat et al. (2016a). The SCM allows for a prediction of U(VI) adsorption under varying chemical conditions with a minimum number of fitting parameters, not only for our own experimental results, but also for a number of published data sets. The model agreed well with many of these datasets without introducing a second site type or including the formation of ternary U(VI)-carbonato surface complexes. The model predictions were greatly impacted by utilizing analytical measurements of dissolved inorganic carbon (DIC) concentrations in individual sample solutions rather than assuming solution equilibration with a specific partial pressure of CO2, even when the gas phase was
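
Stripped of the electrostatic and spillover corrections that are the paper's contribution, a surface complexation reaction reduces to a mass-action balance. A one-site sketch with an invented equilibrium constant and a fixed uranyl activity:

```python
import numpy as np

# Single-site, single-reaction surface complexation sketch (no
# electrostatics, no spillover; logK and the uranyl activity are invented):
#   >SOH + UO2(2+)  <=>  >SOUO2(+) + H+
#   K = {>SOUO2+}[H+] / ({>SOH}[UO2(2+)])
logK = -2.0
U = 1e-6                       # free uranyl activity, assumed fixed
pH = np.linspace(3.0, 8.0, 51)
H = 10.0 ** (-pH)
ratio = (10.0 ** logK) * U / H           # {>SOUO2+} / {>SOH}
sorbed_fraction = ratio / (1.0 + ratio)  # fraction of sites complexed
```

Even this bare-bones version reproduces the qualitative pH edge of cation adsorption; the full SCM couples many such reactions to site balances and the edge/basal surface potentials.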

  3. Cleanup Verification Package for the 600-47 Waste Site

    International Nuclear Information System (INIS)

    Cutlip, M.J.

    2005-01-01

    This cleanup verification package documents completion of interim remedial action for the 600-47 waste site. This site consisted of several areas of surface debris and contamination near the banks of the Columbia River across from Johnson Island. Contaminated material identified in field surveys included four areas of soil, wood, nuts, bolts, and other metal debris

  4. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Science.gov (United States)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand in Internet of Things (IoT)-based applications has forced the move towards higher complexity of integrated circuits supporting SoC designs. Such an increase in complexity calls for correspondingly sophisticated validation strategies, and researchers have proposed various methodologies to overcome this problem, leading to dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs in the infancy of the SoC verification process in order to reduce time consumption and achieve fast time to market for the system. In this paper we therefore focus on verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.

  5. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  6. Technology of magnetic abrasive finishing in machining of difficult-to-machine alloy complex surface

    Directory of Open Access Journals (Sweden)

    Fujian MA

    2016-10-01

    Full Text Available The technology of magnetic abrasive finishing is one of the important finishing technologies. Combined with low-frequency vibration or ultrasonic vibration, it can attain higher precision, quality and efficiency. The characteristics and current research status of magnetic abrasive finishing, vibration-assisted magnetic abrasive finishing and ultrasonic-assisted magnetic abrasive finishing are introduced. In view of the characteristics of difficult-to-machine alloys' complex surfaces, the important problems for further study are presented for realizing the finishing of complex surfaces with magnetic abrasive finishing technology, such as increasing machining efficiency by enhancing the magnetic flux density of the machining gap and compounding magnetic energy with other energy forms, and establishing the control function during machining and the process planning method for magnetic abrasive finishing of complex surfaces under the space geometry constraints imposed by the complex surface on the magnetic pole, etc.

  7. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  8. Measurement of complex surfaces

    International Nuclear Information System (INIS)

    Brown, G.M.

    1993-05-01

    Several of the components used in coil fabrication involve complex surfaces and dimensions that are not well suited to measurement using conventional dimensional measuring equipment. Some relatively simple techniques that are in use in the SSCL Magnet Systems Division (MSD) for incoming inspection will be described, with discussion of their suitability for specific applications. Components that are submitted for MSD Quality Assurance (QA) dimensional inspection may be divided into two distinct categories. The first category involves components for which there is an approved drawing and for which all nominal dimensions are known. The second category involves parts for which 'reverse engineering' is required: the part is available, but there are no drawings or dimensions. This second category typically occurs during development of coil end parts and coil turn filler parts, where it is necessary to manually shape the part and then measure it to develop the information required to prepare a drawing for the part.

  9. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
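
Objective verification of categorical forecasts typically starts from a 2x2 contingency table; a sketch with invented event counts:

```python
# Standard 2x2 contingency-table metrics used in forecast verification;
# the event counts below are invented for illustration.
hits, misses, false_alarms, correct_negs = 42, 8, 14, 936
n = hits + misses + false_alarms + correct_negs

pod = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)   # false alarm ratio
# Heidke skill score: fraction correct relative to chance agreement
expected = ((hits + misses) * (hits + false_alarms)
            + (correct_negs + misses) * (correct_negs + false_alarms)) / n
hss = (hits + correct_negs - expected) / (n - expected)
```

Skill scores such as HSS are what let forecasters and end users compare model guidance against a no-skill baseline rather than raw accuracy, which is inflated by the many correct negatives of rare events.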

  10. Verification report for SIMREP 1.1

    International Nuclear Information System (INIS)

    Tarapore, P.S.

    1987-06-01

    SIMREP 1.1 is a discrete event computer simulation of repository operations in the surface waste-handling facility. The logic for this model is provided by Fluor Technology, Inc., the Architect/Engineer of the salt repository. The verification methods included a line-by-line review of the code, a detailed examination of a generated trace of all simulated events over a given period of operations, and a comparison of the simulation output results with expected values. SIMREP 1.1 performs in the required manner under the given range of input conditions
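
The verification strategy described (trace inspection plus comparison with expected values) is easiest to see on a deterministic toy: the event-driven skeleton below uses hypothetical arrival and service times, not the facility's parameters, so its departure trace can be checked by hand:

```python
import heapq

# Minimal discrete-event simulation of a single handling cell with a queue.
# Deterministic arrivals and a fixed service time make the generated event
# trace directly checkable against hand-computed expected values.
ARRIVALS = [0.0, 2.0, 4.0, 9.0]     # package arrival times [h] (hypothetical)
SERVICE = 3.0                        # fixed handling time per package [h]

events = [(t, "arrive", i) for i, t in enumerate(ARRIVALS)]
heapq.heapify(events)
free_at = 0.0                        # when the handling cell frees up
trace = []
while events:
    t, kind, i = heapq.heappop(events)
    if kind == "arrive":
        start = max(t, free_at)      # queue if the cell is busy
        free_at = start + SERVICE
        heapq.heappush(events, (free_at, "depart", i))
    trace.append((t, kind, i))

departures = [t for t, kind, _ in trace if kind == "depart"]
# hand-checked expected departures: 3, 6, 9, 12
```

Diffing such a trace against expected values is exactly the kind of check described for SIMREP 1.1, just on a miniature scale.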

  11. Surface complexation of carbonate on goethite: IR spectroscopy, structure & charge distribution

    NARCIS (Netherlands)

    Hiemstra, T.; Rahnemaie, R.; Riemsdijk, van W.H.

    2004-01-01

    The adsorption of carbonate on goethite has been evaluated, focussing on the relation between the structure of the surface complex and corresponding adsorption characteristics, like pH dependency and proton co-adsorption. The surface structure of adsorbed CO3-2 has been assessed with (1) a

  12. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
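
Of the methodologies listed, Guyan reduction has a compact closed form. A sketch on a toy 3-DOF spring chain (not an SLS model):

```python
import numpy as np

# Guyan (static) condensation onto retained "master" DOFs.
# Partition K into master (m) and slave (s) DOFs, then
#   T = [ I ; -Kss^{-1} Ksm ],   K_red = T' Kp T,   M_red = T' Mp T
k = 1000.0
K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])
M = np.eye(3)
m_idx, s_idx = [0, 2], [1]           # keep DOFs 0 and 2, condense DOF 1

Ksm = K[np.ix_(s_idx, m_idx)]
Kss = K[np.ix_(s_idx, s_idx)]
T = np.vstack([np.eye(len(m_idx)), -np.linalg.solve(Kss, Ksm)])
order = m_idx + s_idx                # reorder as [masters; slaves]
Kp = K[np.ix_(order, order)]
Mp = M[np.ix_(order, order)]
K_red = T.T @ Kp @ T
M_red = T.T @ Mp @ T

# Static loads on master DOFs are reproduced exactly by the reduced model:
u_full = np.linalg.solve(K, np.array([1.0, 0.0, 0.0]))
u_red = np.linalg.solve(K_red, np.array([1.0, 0.0]))
```

Static condensation is exact for static loads on the masters but only approximate dynamically, which is why the paper pairs it with harmonic reduction and the other methodologies.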

  13. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Directory of Open Access Journals (Sweden)

    Muhammad Nurul Zhafirah

    2017-01-01

    Full Text Available Increased demand in Internet of Things (IoT)-based applications has forced the move towards higher complexity of integrated circuits supporting SoC designs. Such an increase in complexity calls for correspondingly sophisticated validation strategies, and researchers have proposed various methodologies to overcome this problem, leading to dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs in the infancy of the SoC verification process in order to reduce time consumption and achieve fast time to market for the system. In this paper we therefore focus on verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.

  14. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. 
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  15. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  16. Container Verification Using Optically Stimulated Luminescence

    International Nuclear Information System (INIS)

    Tanner, Jennifer E.; Miller, Steven D.; Conrady, Matthew M.; Simmons, Kevin L.; Tinker, Michael R.

    2008-01-01

    Containment verification is a high priority for safeguards containment and surveillance. Nuclear material containers, safeguards equipment cabinets, camera housings, and detector cable conduit are all vulnerable to tampering. Even with a high security seal on a lid or door, custom-built hinges and interfaces, and special colors and types of finishes, the surfaces of enclosures can be tampered with and any penetrations repaired and covered over. With today's technology, these repairs would not be detected during a simple visual inspection. Several suggested solutions have been to develop complicated networks of wires, fiber-optic cables, lasers or other sensors that line the inside of a container and alarm when the network is disturbed. This results in an active system with real time evidence of tampering but is probably not practical for most safeguards applications. A more practical solution would be to use a passive approach where an additional security feature was added to surfaces which would consist of a special coating or paint applied to the container or enclosure. One type of coating would incorporate optically stimulated luminescent (OSL) material. OSL materials are phosphors that luminesce in proportion to the ionizing radiation dose when stimulated with the appropriate optical wavelengths. The OSL fluoresces at a very specific wavelength when illuminated at another, very specific wavelength. The presence of the pre-irradiated OSL material in the coating is confirmed using a device that interrogates the surface of the enclosure using the appropriate optical wavelength and then reads the resulting luminescence. The presence of the OSL indicates that the integrity of the surface is intact. The coating itself could be transparent which would allow the appearance of the container to remain unchanged or the OSL material could be incorporated into certain paints or epoxies used on various types of containers. The coating could be applied during manufacturing

  17. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  18. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  19. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes, used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems, can be readily deployed and optimized for changing application scenarios. Developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling way to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements

  20. Surface Complexation of Neptunium(V) with Goethite

    International Nuclear Information System (INIS)

    Jerden, James L.; Kropf, A. Jeremy

    2007-01-01

    Batch adsorption experiments in which neptunium-bearing solutions were reacted with goethite (alpha-FeOOH) have been performed to study uptake mechanisms in sodium chloride and calcium-bearing sodium silicate solutions. This paper presents results identifying and quantifying the mechanisms by which neptunium is adsorbed as a function of pH and reaction time (aging). Also presented are results from tests in which neptunium is reacted with goethite in the presence of other cations (uranyl and calcium) that may compete with neptunium for sorption sites. The desorption of neptunium from goethite has been studied by re-suspending the neptunium-loaded goethite samples in solutions containing no neptunium. Selected reacted sorbent samples were analyzed by x-ray absorption spectroscopy (XAS) to determine the oxidation state and molecular speciation of the adsorbed neptunium. Results have been used to establish the pH adsorption edge of neptunium on goethite in sodium chloride and calcium-bearing sodium silicate solutions. The results indicate that neptunium uptake on goethite reaches 95% at a pH of approximately 7 and begins to decrease at pH values greater than 8.5. Distribution coefficients for neptunium sorption range from less than 1000 (moles/kg) sorbed / (moles/kg) solution at pH less than 5.0 to greater than 10,000 (moles/kg) sorbed / (moles/kg) solution at pH greater than 7.0. Distribution coefficients as high as 100,000 (moles/kg) sorbed / (moles/kg) solution were recorded for the tests done in calcite-equilibrated sodium silicate solutions. XAS results show that neptunium complexes with the goethite surface mainly as Np(V) (although Np(IV) is prevalent in some of the longer-duration sorption tests). The neptunium adsorbed to goethite shows a Np-O bond length of approximately 1.8 angstroms, which is representative of the Np-O axial bond in the neptunyl(V) complex. This neptunyl(V) ion is coordinated to 5 or 6 equatorial oxygens with Np-O bond lengths of 2

  1. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in more and more applications, knowledge base verification takes on an important role. The conventional Petri net approach, studied recently for knowledge base verification, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Verification generally requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. Applying this tool to the knowledge base of a nuclear power plant showed that it can successfully check most of the anomalies that can occur in a knowledge base
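
    One class of anomalies such verification tools detect is a circular inference chain in the rule base. As a minimal illustration (plain Python with a hypothetical rule representation, not the Design/CPN tool or its Petri net model), a cycle check over the fact-dependency graph might look like:

```python
def find_circular_rules(rules):
    """rules maps each derivable fact to the set of antecedent facts
    its rule depends on.  Returns True if the rule base contains a
    circular inference chain, a classic knowledge-base anomaly."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}

    def visit(fact):
        color[fact] = GRAY
        for dep in rules.get(fact, ()):
            c = color.get(dep, WHITE)
            if c == GRAY:                  # back edge: cycle found
                return True
            if c == WHITE and visit(dep):
                return True
        color[fact] = BLACK
        return False

    return any(visit(f) for f in rules if color.get(f, WHITE) == WHITE)
```

    A real tool would also check for redundant, conflicting, and unreachable rules; this sketch covers only circularity.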

  2. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    Science.gov (United States)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear. Even simplifications of the equations describing nature usually end up being nonlinear partial differential equations. The advection-diffusion-reaction (ADR) transport equation is pivotal in atmospheric sciences and water quality modeling. This nonlinear equation must be solved numerically for practical purposes, so academics and engineers rely heavily on numerical codes. Thus, numerical codes require verification before they are utilized for multiple applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is properly solved as described in the design document. CFD verification is not a straightforward and well-defined process. Only a complete test suite can uncover all the limitations and bugs, and results need to be assessed to distinguish bug-induced defects from the innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure; sometimes novel tricks work out. This study conveys a synopsis of the experiences we gained during a comprehensive verification process for a transport solver. A test suite was designed including unit tests and algorithmic tests, layered in complexity in several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial-shape preservation. At the beginning, a mesh convergence study, the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs which were concealed during the mesh convergence
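
    The mesh convergence study at the heart of such a verification effort reduces to standard formulas: from solutions on three systematically refined grids one computes the observed order of accuracy and a Richardson-extrapolated estimate of the grid-independent value. A minimal sketch with a synthetic second-order model error (the function f below is an assumed stand-in for a real solver):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from three solutions on grids with
    uniform refinement ratio r (coarse -> medium -> fine)."""
    return (math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine))
            / math.log(r))

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Richardson-extrapolated estimate of the grid-independent value."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# synthetic example: a second-order scheme with f(h) = exact + 0.5 * h**2
exact = 1.0
f = lambda h: exact + 0.5 * h ** 2
f1, f2, f3 = f(0.4), f(0.2), f(0.1)   # coarse, medium, fine; r = 2
p = observed_order(f1, f2, f3, 2.0)   # should be close to 2
est = richardson_extrapolate(f2, f3, 2.0, p)
```

    If the observed order disagrees with the formal order of the scheme, that discrepancy is exactly the kind of symptom that leads to the concealed bugs mentioned above.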

  3. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  4. Verification of the ECMWF ensemble forecasts of wind speed against analyses and observations

    DEFF Research Database (Denmark)

    Pinson, Pierre; Hagedorn, Renate

    2012-01-01

    A framework for the verification of ensemble forecasts of near-surface wind speed is described. It is based on existing scores and diagnostic tools, though considering observations from synoptic stations as reference instead of the analysis. This approach is motivated by the idea of having a user-oriented view of verification, for instance with wind power applications in mind. The verification framework is specifically applied to the case of ECMWF ensemble forecasts over Europe. Dynamic climatologies are derived at the various stations, serving as a benchmark. The impact of observational uncertainty on scores and diagnostic tools is also considered. The interest of this framework is demonstrated from its application to the routine evaluation of ensemble forecasts and to the assessment of the quality improvements brought in by the recent change in horizontal resolution of the ECMWF ensemble...
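
    One standard score in such an ensemble verification framework is the continuous ranked probability score (CRPS). A minimal sketch of its common kernel-form ensemble estimator (illustrative only, not the ECMWF verification code):

```python
def crps_ensemble(members, obs):
    """CRPS of one ensemble forecast against one scalar observation,
    using the standard kernel estimator:
        CRPS = mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|
    Lower is better; for a single member it reduces to absolute error."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * m)
    return term1 - term2
```

    Averaging this score over stations and lead times, against a dynamic station climatology as benchmark, gives the kind of skill assessment described above.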

  5. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  6. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    Full Text Available With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for detecting counterfeit banknotes are needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing tasks on these phones. In addition, the number of smartphone users has grown greatly and continues to increase. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes differ greatly in terms of the printing process, the ingredients used in the preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.

  7. Modeling the surface tension of complex, reactive organic-inorganic mixtures

    Science.gov (United States)

    Schwier, A. N.; Viglione, G. A.; Li, Z.; McNeill, V. Faye

    2013-11-01

    Atmospheric aerosols can contain thousands of organic compounds that impact aerosol surface tension, affecting aerosol properties such as heterogeneous reactivity, ice nucleation, and cloud droplet formation. We present new experimental data for the surface tension of complex, reactive organic-inorganic aqueous mixtures mimicking tropospheric aerosols. Each solution contained 2-6 organic compounds, including methylglyoxal, glyoxal, formaldehyde, acetaldehyde, oxalic acid, succinic acid, leucine, alanine, glycine, and serine, with and without ammonium sulfate. We test two semi-empirical surface tension models and find that most reactive, complex, aqueous organic mixtures which do not contain salt are well described by a weighted Szyszkowski-Langmuir (S-L) model which was first presented by Henning et al. (2005). Two approaches for modeling the effects of salt were tested: (1) the Tuckermann approach (an extension of the Henning model with an additional explicit salt term), and (2) a new implicit method proposed here which employs experimental surface tension data obtained for each organic species in the presence of salt used with the Henning model. We recommend the use of method (2) for surface tension modeling of aerosol systems because the Henning model (using data obtained from organic-inorganic systems) and Tuckermann approach provide similar modeling results and goodness-of-fit (χ2) values, yet the Henning model is a simpler and more physical approach to modeling the effects of salt, requiring fewer empirically determined parameters.
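
    A minimal sketch of the Szyszkowski-Langmuir form underlying these models, together with a schematic concentration-weighted multi-solute variant (the exact weighting used by Henning et al. and in this paper differs in detail; all parameter values below are purely illustrative):

```python
import math

def szyszkowski(C, sigma_w, a, b):
    """Single-solute Szyszkowski equation:
    sigma = sigma_w - a * ln(1 + C / b), where sigma_w is the
    solute-free surface tension and a, b are empirical fit parameters."""
    return sigma_w - a * math.log(1.0 + C / b)

def weighted_szyszkowski(concs, params, sigma_w=71.8):
    """Schematic weighted multi-solute form: each solute's depression
    term is weighted by its fraction of the total organic concentration."""
    C_total = sum(concs.values())
    if C_total == 0.0:
        return sigma_w
    depression = 0.0
    for name, C in concs.items():
        a, b = params[name]
        depression += (C / C_total) * a * math.log(1.0 + C_total / b)
    return sigma_w - depression
```

    For a single solute the two forms coincide, and adding organics always lowers the predicted surface tension, as the S-L model requires.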

  8. Toroidal surface complexes of bacteriophage φ12 are responsible for host-cell attachment

    International Nuclear Information System (INIS)

    Leo-Macias, Alejandra; Katz, Garrett; Wei Hui; Alimova, Alexandra; Katz, A.; Rice, William J.; Diaz-Avalos, Ruben; Hu Guobin; Stokes, David L.; Gottlieb, Paul

    2011-01-01

    Cryo-electron tomography and subtomogram averaging are utilized to determine that the bacteriophage φ12, a member of the Cystoviridae family, contains surface complexes that are toroidal in shape, are composed of six globular domains with six-fold symmetry, and have a discrete density connecting them to the virus membrane-envelope surface. The lack of this kind of spike in a reassortant of φ12 demonstrates that the gene for the hexameric spike is located in φ12's medium-length genome segment, likely in the P3 open reading frames, which encode the proteins involved in viral-host cell attachment. Based on this and on protein mass estimates derived from the obtained averaged structure, it is suggested that each of the globular domains is most likely composed of a total of four copies of P3a and/or P3c proteins. Our findings may have implications in the study of the evolution of the cystovirus species in regard to their host specificity. - Research Highlights: → Subtomogram averaging reveals enhanced detail of a φ12 cystovirus surface protein complex. → The surface protein complex has a toroidal shape and six-fold symmetry. → It is encoded by the medium-size genome segment. → The proteins of the surface complex most likely are one copy of P3a and three copies of P3c.

  9. Building a Simulated Environment for the Study of Multilateral Approaches to Nuclear Materials Verification

    International Nuclear Information System (INIS)

    Moul, R.; Persbo, A.; Keir, D.

    2015-01-01

    Verification research can be resource-intensive, particularly when it relies on practical or field exercises. These exercises can also involve substantial logistical preparations and are difficult to run in an iterative manner to produce data sets that can be later utilized in verification research. This paper presents the conceptual framework, methodology and preliminary findings from part of a multi-year research project, led by VERTIC. The multi-component simulated environment that we have generated, using existing computer models for nuclear reactors and other components of fuel cycles, can be used to investigate options for future multilateral nuclear verification, at a variety of locations and time points in a nuclear complex. We have constructed detailed fuel cycle simulations for two fictional, and very different, states. In addition to these mass-flow models, a 3-dimensional, avatar-based simulation of a nuclear facility is under development. We have also developed accompanying scenarios that provide the legal and procedural assumptions controlling the process of our fictional verification solutions. These tools have all been produced using open source information and software. While these tools are valuable for research purposes, they can also play an important role in support of training and education in the field of nuclear materials verification, in a variety of settings and circumstances. (author)

  10. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    Science.gov (United States)

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  11. Surface complexation modeling calculation of Pb(II) adsorption onto the calcined diatomite

    Science.gov (United States)

    Ma, Shu-Cui; Zhang, Ji-Lin; Sun, De-Hui; Liu, Gui-Xia

    2015-12-01

    Removal of noxious heavy metal ions (e.g. Pb(II)) by surface adsorption onto minerals (e.g. diatomite) is an important means of controlling aqueous pollution in the environment. Thus, it is essential to understand the surface adsorption behavior and mechanism. In this work, the apparent surface complexation reaction equilibrium constants of Pb(II) on the calcined diatomite and the distributions of Pb(II) surface species were investigated through modeling calculations based on a diffuse double layer model (DLM) with three amphoteric sites. Batch experiments were used to study the adsorption of Pb(II) onto the calcined diatomite as a function of pH (3.0-7.0) and ionic strength (0.05 and 0.1 mol L-1 NaCl) under ambient atmosphere. Adsorption of Pb(II) is well described by Freundlich isotherm models. The apparent surface complexation equilibrium constants (log K) were obtained by fitting the batch experimental data using the PEST 13.0 and PHREEQC 3.1.2 codes, with good agreement between measured and predicted data. The distribution of Pb(II) surface species on the diatomite calculated by the PHREEQC 3.1.2 program indicates that impurity cations (e.g. Al3+, Fe3+, etc.) in the diatomite play a leading role in the Pb(II) adsorption, and that complex formation together with additional electrostatic interaction is the main adsorption mechanism of Pb(II) on the diatomite under weakly acidic conditions.
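
    The Freundlich isotherm invoked here has the form q = K_f * C^(1/n), and its parameters are commonly recovered by a linear fit in log-log space. A minimal sketch with synthetic, noise-free data (illustrative values, not the paper's measurements):

```python
import math

def freundlich(C, Kf, n):
    """Freundlich isotherm: sorbed amount q = Kf * C**(1/n)."""
    return Kf * C ** (1.0 / n)

def fit_freundlich(C_data, q_data):
    """Recover (Kf, n) by ordinary least squares on the linearized
    form log q = log Kf + (1/n) * log C."""
    xs = [math.log(c) for c in C_data]
    ys = [math.log(q) for q in q_data]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), 1.0 / slope   # Kf, n

# synthetic example generated with Kf = 2.5, n = 2.0
C_data = [0.1, 0.5, 1.0, 2.0, 5.0]
q_data = [freundlich(c, 2.5, 2.0) for c in C_data]
Kf_fit, n_fit = fit_freundlich(C_data, q_data)
```

    On real batch data the fit would of course not be exact, and codes such as PEST/PHREEQC estimate the surface-complexation constants rather than this empirical isotherm.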

  12. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  13. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism...
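
    For intuition, the data-flow idea behind bytecode verification is abstract interpretation over types instead of values. A toy sketch for a straight-line stack machine (hypothetical opcodes; real JVM verification also handles branches and merge points, which is exactly what the SSA-based alternative targets):

```python
def verify_bytecode(code):
    """Minimal single-pass type check of a toy, straight-line stack
    bytecode.  Each instruction is a tuple; the abstract state is a
    stack of type names.  Returns True iff the program is well-typed."""
    stack = []
    for op, *args in code:
        if op == "push_int":
            stack.append("int")
        elif op == "push_ref":
            stack.append("ref")
        elif op == "iadd":                       # needs two ints on top
            if stack[-2:] != ["int", "int"]:
                return False
            stack.pop(); stack.pop()
            stack.append("int")
        elif op == "getfield":                   # needs a reference on top
            if not stack or stack.pop() != "ref":
                return False
            stack.append("int")                  # assume an int field
        else:
            return False                         # unknown opcode
    return True
```

    Iterating such transfer functions to a fixpoint over all control-flow paths is what makes the traditional analysis expensive, motivating the SSA-based alternative in the title.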

  14. Self-Organization during Friction in Complex Surface Engineered Tribosystems

    Directory of Open Access Journals (Sweden)

    Ben D. Beake

    2010-02-01

    Full Text Available Self-organization during friction in complex surface engineered tribosystems is investigated. The probability of self-organization in these complex tribosystems is studied on the basis of the theoretical concepts of irreversible thermodynamics. It is shown that a higher number of interrelated processes within the system results in an increased probability of self-organization. The results of this thermodynamic model are confirmed by the investigation of the wear performance of a novel Ti0.2Al0.55Cr0.2Si0.03Y0.02N/Ti0.25Al0.65Cr0.1N (PVD) coating with a complex nano-multilayered structure under extreme tribological conditions of dry high-speed end milling of hardened H13 tool steel.

  15. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
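
    The flavor of such fault-tolerant synchronization can be conveyed by a simple averaging rule: sort the clock readings, discard the f most extreme on each side, and average the rest (a Welch-Lynch-style rule shown purely for illustration; it is not the specific algorithm verified with EHDM in the paper):

```python
def fault_tolerant_average(readings, f):
    """Fault-tolerant average of clock readings: discard the f smallest
    and f largest values (which may come from Byzantine-faulty clocks)
    and average the remainder.  Requires more than 2f readings to
    tolerate f faults."""
    if len(readings) <= 2 * f:
        raise ValueError("need more than 2f readings to tolerate f faults")
    trimmed = sorted(readings)[f:len(readings) - f]
    return sum(trimmed) / len(trimmed)
```

    With five readings and one wildly faulty clock, the faulty value cannot drag the correction outside the range of the honest clocks, which is the intuition behind the convergence arguments such verifications make precise.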

  16. Cork-resin ablative insulation for complex surfaces and method for applying the same

    Science.gov (United States)

    Walker, H. M.; Sharpe, M. H.; Simpson, W. G. (Inventor)

    1980-01-01

    A method of applying cork-resin ablative insulation material to complex curved surfaces is disclosed. The material is prepared by mixing finely divided cork with a B-stage curable thermosetting resin, forming the resulting mixture into a block, B-stage curing the resin-containing block, and slicing the block into sheets. The B-stage cured sheet is shaped to conform to the surface being insulated, and further curing is then performed. Curing of the resins only to B-stage before shaping enables application of sheet material to complex curved surfaces and avoids limitations and disadvantages presented in handling of fully cured sheet material.

  17. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States (India, Pakistan and Israel) from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  18. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  19. Surface complexation of neptunium (V) onto whole cells and cell components of Shewanella alga

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Donald Timothy [Los Alamos National Laboratory; Deo, Randhir P [ASU; Rittmann, Bruce E [ASU; Songkasiri, Warinthorn [UNAFFILIATED

    2008-01-01

    We systematically quantified surface complexation of neptunium(V) onto whole cells of Shewanella alga strain BrY and onto cell wall and extracellular polymeric substances (EPS) of S. alga. We first performed acid and base titrations and used the mathematical model FITEQL with constant-capacitance surface-complexation to determine the concentrations and deprotonation constants of specific surface functional groups. Deprotonation constants most likely corresponded to a carboxyl site associated with amino acids (pK{sub a} {approx} 2.4), a carboxyl group not associated with amino acids (pK{sub a} {approx} 5), a phosphoryl site (pK{sub a} {approx} 7.2), and an amine site (pK{sub a} > 10). We then carried out batch sorption experiments with Np(V) and each of the S. alga components at different pHs. Results show that solution pH influenced the speciation of Np(V) and each of the surface functional groups. We used the speciation sub-model of the biogeochemical model CCBATCH to compute the stability constants for Np(V) complexation to each surface functional group. The stability constants were similar for each functional group on S. alga bacterial whole cells, cell walls, and EPS, and they explain the complicated sorption patterns when they are combined with the aqueous-phase speciation of Np(V). For pH < 8, NpO{sub 2}{sup +} was the dominant form of Np(V), and its log K values for the low-pK{sub a} carboxyl, other carboxyl, and phosphoryl groups were 1.75, 1.75, and 2.5 to 3.1, respectively. For pH greater than 8, the key surface ligand was amine >XNH3+, which complexed with NpO{sub 2}(CO{sub 3}){sub 3}{sup 5-}. The log K for NpO{sub 2}(CO{sub 3}){sub 3}{sup 5-} complexed onto the amine groups was 3.1 to 3.6. All of the log K values are similar to those of Np(V) complexes with aqueous carboxyl and N-containing carboxyl ligands. These results point towards the important role of surface complexation in defining key actinide-microbiological interactions in the subsurface.
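
    The deprotonation constants above translate directly into site speciation as a function of pH via the Henderson-Hasselbalch relation. A minimal sketch (pKa values taken from the abstract; the amine pKa, reported only as > 10, is set to 10.5 purely for illustration):

```python
def fraction_deprotonated(pH, pKa):
    """Henderson-Hasselbalch: fraction of an acidic surface site in its
    deprotonated form at a given pH."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

# pKa values as reported in the abstract; amine value is illustrative
sites = {"carboxyl (amino acid)": 2.4, "carboxyl": 5.0,
         "phosphoryl": 7.2, "amine": 10.5}
at_pH_7 = {name: fraction_deprotonated(7.0, pKa)
           for name, pKa in sites.items()}
```

    At circumneutral pH the carboxyl groups are almost fully deprotonated, the phosphoryl group is partially deprotonated, and the amine is still protonated, consistent with the pH-dependent sorption patterns described above.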

  20. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
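
    The manufactured-solution benchmarks recommended here can be illustrated on a toy problem: choose an exact solution, derive its forcing term analytically, and check that the observed order of accuracy matches the scheme's theoretical order (a sketch of the technique, not one of the paper's benchmarks):

    ```python
    import math

    # Method of Manufactured Solutions for -u'' = f on (0,1), u(0)=u(1)=0.
    # Manufactured solution u(x) = sin(pi x) gives f(x) = pi^2 sin(pi x).
    def solve_poisson(n: int) -> float:
        """Second-order finite-difference solve with n interior points; return max error."""
        h = 1.0 / (n + 1)
        x = [(i + 1) * h for i in range(n)]
        d = [math.pi ** 2 * math.sin(math.pi * xi) * h * h for xi in x]
        # Thomas algorithm for the tridiagonal system -u[i-1] + 2u[i] - u[i+1] = h^2 f[i]
        a, b, c = -1.0, 2.0, -1.0
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c / b, d[0] / b
        for i in range(1, n):
            m = b - a * cp[i - 1]
            cp[i] = c / m
            dp[i] = (d[i] - a * dp[i - 1]) / m
        u = [0.0] * n
        u[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            u[i] = dp[i] - cp[i] * u[i + 1]
        return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n))

    e1, e2 = solve_poisson(32), solve_poisson(64)
    order = math.log2(e1 / e2)   # observed order of accuracy for a grid refinement
    ```

    Code verification in this sense asks only whether `order` approaches the scheme's theoretical value of 2; agreement with physical data is a separate validation question.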

  1. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  2. Verification steps for the CMS event-builder software

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The CMS event-builder software is used to assemble event fragments into complete events at 100 kHz. The data originates at the detector front-end electronics, passes through several computers and is transported from the underground to the high-level trigger farm on the surface. I will present the testing and verification steps a new software version has to pass before it is deployed in production. I will discuss the current practice and possible improvements.

  3. Is flow verification necessary?

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
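
    The accountancy statistics discussed here are built on the basic material balance; a minimal sketch of the widely known MUF (material unaccounted for) statistic and a standardized alarm test (the numbers and the 3-sigma threshold are illustrative, not from the paper):

    ```python
    def muf(begin_inv: float, receipts: float, shipments: float, end_inv: float) -> float:
        """Material Unaccounted For: book inventory minus measured ending inventory."""
        return begin_inv + receipts - shipments - end_inv

    # Hypothetical balance (kg of nuclear material) for one material balance area,
    # with an assumed combined measurement standard deviation sigma_muf.
    m = muf(begin_inv=102.1, receipts=48.3, shipments=47.9, end_inv=101.8)
    sigma_muf = 0.4
    z = m / sigma_muf          # standardized test statistic
    alarm = abs(z) > 3.0       # e.g. a 3-sigma alarm criterion
    ```

    The paper's point is that such operator-side statistics can be chosen so that diversion remains detectable even without inspector verification of the receipt and shipment flows.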

  4. Adsorption of uranium(VI) to manganese oxides: X-ray absorption spectroscopy and surface complexation modeling.

    Science.gov (United States)

    Wang, Zimeng; Lee, Sung-Woo; Catalano, Jeffrey G; Lezama-Pacheco, Juan S; Bargar, John R; Tebo, Bradley M; Giammar, Daniel E

    2013-01-15

    The mobility of hexavalent uranium in soil and groundwater is strongly governed by adsorption to mineral surfaces. As strong naturally occurring adsorbents, manganese oxides may significantly influence the fate and transport of uranium. Models for U(VI) adsorption over a broad range of chemical conditions can improve predictive capabilities for uranium transport in the subsurface. This study integrated batch experiments of U(VI) adsorption to synthetic and biogenic MnO₂, surface complexation modeling, ζ-potential analysis, and molecular-scale characterization of adsorbed U(VI) with extended X-ray absorption fine structure (EXAFS) spectroscopy. The surface complexation model included inner-sphere monodentate and bidentate surface complexes and a ternary uranyl-carbonato surface complex, which was consistent with the EXAFS analysis. The model could successfully simulate adsorption results over a broad range of pH and dissolved inorganic carbon concentrations. U(VI) adsorption to synthetic δ-MnO₂ appears to be stronger than to biogenic MnO₂, and the differences in adsorption affinity and capacity are not associated with any substantial difference in U(VI) coordination.

  5. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
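
    The ETAS-style smoothing can be sketched as a power-law kernel that spreads each simulated epicenter's rate over all grid cells, decaying with distance (the kernel exponent, offset, and grid here are illustrative, not the paper's calibrated values):

    ```python
    import math

    def powerlaw_rate_map(events, grid, d0=1.0, q=1.5):
        """Smooth epicenters onto grid cells with weight ~ (d + d0)**(-q),
        normalized so each event contributes unit total rate."""
        rates = [0.0] * len(grid)
        for ex, ey in events:
            w = [(math.hypot(gx - ex, gy - ey) + d0) ** (-q) for gx, gy in grid]
            s = sum(w)
            for i, wi in enumerate(w):
                rates[i] += wi / s
        return rates

    # Two fault-confined simulated epicenters smoothed over a coarse 3x3 grid (km)
    grid = [(x, y) for x in (0, 10, 20) for y in (0, 10, 20)]
    rates = powerlaw_rate_map([(0.0, 0.0), (20.0, 20.0)], grid)
    ```

    The resulting rate map assigns nonzero probability everywhere, including off-fault cells, which is what allows comparison against observed epicenters that do not fall on modeled faults.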

  6. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences, both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)
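
    Assertion-based verification expresses design intent as declarative temporal properties checked continuously during simulation. The flavor of such a property can be sketched outside an HDL as a checker over a recorded signal trace (the signals and the request/acknowledge property here are invented for illustration, not taken from the PFBR systems):

    ```python
    def check_pulse_follows(trace, trigger, response, max_lag):
        """Check that every cycle where `trigger` is high is followed by a
        `response` high within `max_lag` cycles; return failing cycle indices.
        This mimics a temporal assertion such as SVA's |-> ##[1:max_lag]."""
        failures = []
        for t, cycle in enumerate(trace):
            if cycle[trigger]:
                window = trace[t + 1 : t + 1 + max_lag]
                if not any(c[response] for c in window):
                    failures.append(t)
        return failures

    # Hypothetical 4-cycle trace: the second request is never acknowledged
    trace = [
        {"req": 1, "ack": 0},
        {"req": 0, "ack": 1},
        {"req": 1, "ack": 0},
        {"req": 0, "ack": 0},
    ]
    bad = check_pulse_follows(trace, "req", "ack", max_lag=2)   # [2]
    ```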

  7. A Verification Study on the Loop-Breaking Logic of FTREX

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2008-01-01

    The logical loop problem in fault tree analysis (FTA) has been solved by manually or automatically breaking the circular logic. The breaking of logical loops is one of the sources of uncertainty in fault tree analyses. A practical method which can verify fault tree analysis results was developed by Choi. The method has the capability to handle logical loop problems. It has been implemented in a FORTRAN program called the VETA (Verification and Evaluation of fault Tree Analysis results) code. FTREX, a well-known fault tree quantifier developed by KAERI, has an automatic loop-breaking logic. In order to make certain of the correctness of the loop-breaking logic of FTREX, some typical trees with complex loops were developed and applied in this study. This paper presents some verification results of the loop-breaking logic tested by the VETA code
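
    The circular logic that loop-breaking must remove can be detected mechanically; a minimal sketch of finding one loop in a gate graph by depth-first search (the gate names are hypothetical, and this is not FTREX's actual algorithm):

    ```python
    def find_loop(gates, start):
        """Return one cyclic path of gate names, or None.
        `gates` maps each gate to the list of its input gates/events."""
        path, seen = [], set()

        def dfs(g):
            if g in path:                     # back-edge: a logical loop
                return path[path.index(g):] + [g]
            if g in seen or g not in gates:   # already explored, or a basic event
                return None
            path.append(g)
            seen.add(g)
            for child in gates[g]:
                loop = dfs(child)
                if loop:
                    return loop
            path.pop()
            return None

        return dfs(start)

    # Hypothetical support-system loop: electric power needs cooling,
    # and the cooling system needs electric power.
    gates = {"TOP": ["EPS", "B1"], "EPS": ["CCW"], "CCW": ["EPS"]}
    loop = find_loop(gates, "TOP")            # ['EPS', 'CCW', 'EPS']
    ```

    Breaking the loop then amounts to cutting one edge of the returned cycle (e.g. conditioning CCW on EPS being available), and different choices of where to cut are exactly the uncertainty source the abstract mentions.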

  8. Zinc surface complexes on birnessite: A density functional theory study

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kideok D.; Refson, Keith; Sposito, Garrison

    2009-01-05

    Biogeochemical cycling of zinc is strongly influenced by sorption on birnessite minerals (layer-type MnO₂), which are found in diverse terrestrial and aquatic environments. Zinc has been observed to form both tetrahedral (Zn^IV) and octahedral (Zn^VI) triple-corner-sharing surface complexes (TCS) at Mn(IV) vacancy sites in hexagonal birnessite. The octahedral complex is expected to be similar to that of Zn in the Mn oxide mineral chalcophanite (ZnMn₃O₇·3H₂O), but the reason for the occurrence of the four-coordinate Zn surface species remains unclear. We address this issue computationally using spin-polarized Density Functional Theory (DFT) to examine the Zn^IV-TCS and Zn^VI-TCS species. Structural parameters obtained by DFT geometry optimization were in excellent agreement with available experimental data on Zn-birnessites. Total energy, magnetic moments, and electron-overlap populations obtained by DFT for isolated Zn^IV-TCS revealed that this species is stable in birnessite without a need for Mn(III) substitution in the octahedral sheet and that it is more effective in reducing undersaturation of surface O at a Mn vacancy than is Zn^VI-TCS. Comparison between geometry-optimized ZnMn₃O₇·3H₂O (chalcophanite) and the hypothetical monohydrate mineral ZnMn₃O₇·H₂O, which contains only tetrahedral Zn, showed that the hydration state of Zn significantly affects birnessite structural stability. Finally, our study also revealed that, relative to their positions in an ideal vacancy-free MnO₂, Mn nearest to Zn in a TCS surface complex move toward the vacancy by 0.08-0.11 Å, while surface O bordering the vacancy move away from it by 0.16-0.21 Å, in agreement with recent X-ray absorption spectroscopic analyses.

  9. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, a user's fingerprint is matched against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  10. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

    Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where compactness in the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a great example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard, with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level, without explicit instructions. In the end, the success of the design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these are complex hardware designs, complete functional verification is a difficult task. From the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  11. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core in traditional physics testing programs. It also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  12. Surface complexation of selenite on goethite: MO/DFT geometry and charge distribution

    NARCIS (Netherlands)

    Hiemstra, T.; Rietra, R.P.J.J.; Riemsdijk, van W.H.

    2007-01-01

    The adsorption of selenite on goethite (alpha-FeOOH) has been analyzed with the charge distribution (CD) and the multi-site surface complexation (MUSIC) model being combined with an extended Stern (ES) layer model option. The geometry of a set of different types of hydrated iron-selenite complexes

  13. Surface Complexation Modeling in Variable Charge Soils: Charge Characterization by Potentiometric Titration

    Directory of Open Access Journals (Sweden)

    Giuliano Marchi

    2015-10-01

    Full Text Available ABSTRACT Intrinsic equilibrium constants of 17 representative Brazilian Oxisols were estimated from potentiometric titration measuring the adsorption of H+ and OH− on amphoteric surfaces in suspensions of varying ionic strength. Equilibrium constants were fitted to two surface complexation models: diffuse layer and constant capacitance. The former was fitted by calculating total site concentration from curve-fitting estimates and pH-extrapolation of the intrinsic equilibrium constants to the PZNPC (hand calculation, considering one and two reactive sites), and by the FITEQL software. The latter was fitted only by FITEQL, with one reactive site. Soil chemical and physical properties were correlated to the intrinsic equilibrium constants. Both surface complexation models satisfactorily fit our experimental data, but for results at low ionic strength, optimization did not converge in FITEQL. Data were incorporated into Visual MINTEQ, providing a modeling system that can predict protonation-dissociation reactions at the soil surface under changing environmental conditions.

  14. A verification of quantum field theory – measurement of Casimir force

    Indian Academy of Sciences (India)

    Journal of physics, Feb. & Mar. 2001, pp. 239–243. A verification of quantum field theory … aluminum coated a sphere and flat plate using an atomic force microscope. … where R is the radius of curvature of the spherical surface. The finite … (measured by AFM) of 60% Au/40% Pd, to form a nonreactive and conductive top layer. For …
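
    The force measured in such sphere-plate AFM experiments is usually compared with the proximity-force approximation of the Casimir result, F(d) = π³ħcR/(360d³), where R is the sphere radius and d the separation; a quick numerical sketch with illustrative geometry (not the paper's exact parameters):

    ```python
    import math

    HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
    C = 2.997_924_58e8         # speed of light, m/s

    def casimir_sphere_plate(R: float, d: float) -> float:
        """Magnitude of the attractive Casimir force (N) between a sphere of
        radius R and a flat plate at separation d, in the proximity-force
        approximation: F = pi^3 * hbar * c * R / (360 * d^3)."""
        return math.pi ** 3 * HBAR * C * R / (360.0 * d ** 3)

    # Illustrative geometry in the range used by AFM experiments:
    # R ~ 100 micrometers, d ~ 100 nanometers.
    F = casimir_sphere_plate(R=100e-6, d=100e-9)
    ```

    For these values the force is a few hundred piconewtons, which is why an atomic force microscope is the natural instrument for the measurement.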

  15. Chromate adsorption on selected soil minerals: Surface complexation modeling coupled with spectroscopic investigation

    Energy Technology Data Exchange (ETDEWEB)

    Veselská, Veronika, E-mail: veselskav@fzp.czu.cz [Department of Environmental Geosciences, Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcka 129, CZ-16521, Prague (Czech Republic); Fajgar, Radek [Department of Analytical and Material Chemistry, Institute of Chemical Process Fundamentals of the CAS, v.v.i., Rozvojová 135/1, CZ-16502, Prague (Czech Republic); Číhalová, Sylva [Department of Environmental Geosciences, Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcka 129, CZ-16521, Prague (Czech Republic); Bolanz, Ralph M. [Institute of Geosciences, Friedrich-Schiller-University Jena, Carl-Zeiss-Promenade 10, DE-07745, Jena (Germany); Göttlicher, Jörg; Steininger, Ralph [ANKA Synchrotron Radiation Facility, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, DE-76344, Eggenstein-Leopoldshafen (Germany); Siddique, Jamal A.; Komárek, Michael [Department of Environmental Geosciences, Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcka 129, CZ-16521, Prague (Czech Republic)

    2016-11-15

    Highlights: • Study of Cr(VI) adsorption on soil minerals over a large range of conditions. • Combined surface complexation modeling and spectroscopic techniques. • Diffuse-layer and triple-layer models used to obtain fits to experimental data. • Speciation of Cr(VI) and Cr(III) was assessed. - Abstract: This study investigates the mechanisms of Cr(VI) adsorption on natural clay (illite and kaolinite) and synthetic (birnessite and ferrihydrite) minerals, including its speciation changes, combining quantitative, thermodynamically based mechanistic surface complexation models (SCMs) with spectroscopic measurements. A series of adsorption experiments was performed at different pH values (3–10), ionic strengths (0.001–0.1 M KNO₃), sorbate concentrations (10⁻⁴, 10⁻⁵, and 10⁻⁶ M Cr(VI)), and sorbate/sorbent ratios (50–500). Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy, and X-ray absorption spectroscopy were used to determine the surface complexes, including surface reactions. Adsorption of Cr(VI) is strongly ionic-strength dependent. For ferrihydrite at pH <7, a simple diffuse-layer model provides a reasonable prediction of adsorption. For birnessite, bidentate inner-sphere complexes of chromate and dichromate resulted in a better diffuse-layer model fit. For kaolinite, outer-sphere complexation prevails mainly at lower Cr(VI) loadings. Dissolution of solid phases needs to be considered for better SCM fits. The coupled SCM and spectroscopic approach is thus useful for investigating the individual minerals responsible for Cr(VI) retention in soils, and for improving handling and remediation processes.

  16. SIMMER-III code-verification. Phase 1

    International Nuclear Information System (INIS)

    Maschek, W.

    1996-05-01

    SIMMER-III is a computer code for investigating core disruptive accidents in liquid-metal fast reactors, and it is also intended for safety-related problems in other types of advanced reactors. The code is developed by PNC in cooperation with the European partners FZK, CEA and AEA-T. SIMMER-III is a two-dimensional, three-velocity-field, multiphase, multicomponent, Eulerian fluid-dynamics code coupled with a space-, time-, and energy-dependent neutron dynamics model. In order to model complex flow situations in a postulated disrupting core, mass and energy conservation equations are solved for 27 density components and 16 energy components, respectively. Three velocity fields (two liquid and one vapor) are modeled to simulate the relative motion of different fluid components. An additional static field takes into account the structures present in a reactor (pins, hexcans, vessel structures, internal structures, etc.). The neutronics is based on the discrete ordinates method (S_N method) coupled into a quasistatic dynamic model. The code assessment and verification of the fluid-dynamic/thermohydraulic parts of the code is performed in several steps in a joint effort of all partners. The results of the FZK contributions to the first assessment and verification phase are reported. (orig.) [de

  17. Uranyl adsorption and surface speciation at the imogolite-water interface: Self-consistent spectroscopic and surface complexation models

    Science.gov (United States)

    Arai, Y.; McBeath, M.; Bargar, J.R.; Joye, J.; Davis, J.A.

    2006-01-01

    Macro- and molecular-scale knowledge of uranyl (U(VI)) partitioning reactions with soil/sediment mineral components is important in predicting U(VI) transport processes in the vadose zone and aquifers. In this study, U(VI) reactivity and surface speciation on a poorly crystalline aluminosilicate mineral, synthetic imogolite, were investigated using batch adsorption experiments, X-ray absorption spectroscopy (XAS), and surface complexation modeling. U(VI) uptake on imogolite surfaces was greatest at pH ≈ 7-8 (I = 0.1 M NaNO3 solution, suspension density = 0.4 g/L, [U(VI)]i = 0.01-30 μM, equilibration with air). Uranyl uptake decreased with increasing sodium nitrate concentration in the range from 0.02 to 0.5 M. XAS analyses show that two U(VI) inner-sphere (bidentate mononuclear coordination on outer-wall aluminol groups) and one outer-sphere surface species are present on the imogolite surface, and the distribution of the surface species is pH dependent. At pH 8.8, bis-carbonato inner-sphere and tris-carbonato outer-sphere surface species are present. At pH 7, bis- and non-carbonato inner-sphere surface species co-exist, and the fraction of bis-carbonato species increases slightly with increasing I (0.1-0.5 M). At pH 5.3, U(VI) non-carbonato bidentate mononuclear surface species predominate (69%). A triple layer surface complexation model was developed with surface species that are consistent with the XAS analyses and macroscopic adsorption data. The proton stoichiometry of surface reactions was determined both from the pH dependence of U(VI) adsorption data in pH regions of surface species predominance and from bond-valence calculations. The bis-carbonato species required a distribution of surface charge between the surface and β charge planes in order to be consistent with both the spectroscopic and macroscopic adsorption data. This research indicates that U(VI)-carbonato ternary species on poorly crystalline aluminosilicate mineral surfaces may be important in

  18. Hybrid Control and Verification of a Pulsed Welding Process

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Larsen, Jesper Abildgaard; Izadi-Zamanabadi, Roozbeh

    Systems that one wishes to control are becoming more and more complex, and classical control-theory objectives, such as stability or sensitivity, are often not sufficient to cover the control objectives of these systems. In this paper it is shown how the dynamics of a pulsed welding process can be reformulated into a timed-automaton hybrid setting, and subsequently properties such as reachability and deadlock absence are verified by the simulation and verification tool UPPAAL.

  19. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint change is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study was conducted involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
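
    The derived decision rule can be written directly as a function; a sketch mapping the stated major and minor criteria to risk categories (the category labels paraphrase the abstract):

    ```python
    def fingerprint_failure_risk(dystrophy_area_pct: float,
                                 long_horizontal_lines: bool,
                                 long_vertical_lines: bool) -> str:
        """Risk of fingerprint verification failure per the abstract's criteria."""
        if dystrophy_area_pct >= 25:                       # major criterion
            return "almost always fails"
        minors = int(long_horizontal_lines) + int(long_vertical_lines)
        if minors == 2:                                    # both minor criteria
            return "high risk"
        if minors == 1:                                    # one minor criterion
            return "low risk"
        return "almost always passes"                      # no criteria met
    ```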

  20. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna, including support and satellite structure, with an appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation…

  1. Tailored optical vector fields for ultrashort-pulse laser induced complex surface plasmon structuring.

    Science.gov (United States)

    Ouyang, J; Perrie, W; Allegre, O J; Heil, T; Jin, Y; Fearon, E; Eckford, D; Edwardson, S P; Dearden, G

    2015-05-18

    Precise tailoring of optical vector beams is demonstrated, shaping their focal electric fields and used to create complex laser micro-patterning on a metal surface. A Spatial Light Modulator (SLM) and a micro-structured S-waveplate were integrated with a picosecond laser system and employed to structure the vector fields into radial and azimuthal polarizations with and without a vortex phase wavefront as well as superposition states. Imprinting Laser Induced Periodic Surface Structures (LIPSS) elucidates the detailed vector fields around the focal region. In addition to clear azimuthal and radial plasmon surface structures, unique, variable logarithmic spiral micro-structures with a pitch Λ ∼ 1 μm, not observed previously, were imprinted on the surface, confirming unambiguously the complex 2D focal electric fields. We show clearly also how the Orbital Angular Momentum (OAM) associated with a helical wavefront induces rotation of vector fields along the optic axis of a focusing lens, confirmed by the observed surface micro-structures.


  4. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
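As a toy illustration of simulation-based verification of a safety requirement: the sketch below uses plain Monte Carlo with a one-sided normal-approximation confidence bound, not the swarm search or rare-event sequential Monte Carlo methods the FDIR team actually used, and the fault model and per-chain miss rates are invented for illustration.

```python
import math
import random

def verify_failure_probability(simulate_mission, n_trials, p_requirement, seed=0):
    """Plain Monte Carlo check that the estimated catastrophic-failure
    probability, plus a one-sided 95% normal-approximation bound,
    stays below the stated requirement."""
    rng = random.Random(seed)
    failures = sum(simulate_mission(rng) for _ in range(n_trials))
    p_hat = failures / n_trials
    # one-sided 95% upper confidence bound (normal approximation)
    bound = p_hat + 1.645 * math.sqrt(p_hat * (1.0 - p_hat) / n_trials)
    return p_hat, bound, bound < p_requirement

# Hypothetical fault model: the mission fails if any of three independent
# fault-detection chains misses an injected fault (miss rates are invented).
def toy_mission(rng):
    miss_probs = [0.001, 0.002, 0.0005]
    return int(any(rng.random() < p for p in miss_probs))

p_hat, bound, ok = verify_failure_probability(toy_mission, 200_000, 0.01)
```

For genuinely rare failures (probabilities far below what plain sampling can resolve), methods such as the sequential Monte Carlo mentioned in the abstract replace this brute-force loop.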

  5. Comparison of Degrees of Potential-Energy-Surface Anharmonicity for Complexes and Clusters with Hydrogen Bonds

    Science.gov (United States)

    Kozlovskaya, E. N.; Doroshenko, I. Yu.; Pogorelov, V. E.; Vaskivskyi, Ye. V.; Pitsevich, G. A.

    2018-01-01

    Previously calculated multidimensional potential-energy surfaces of the MeOH monomer and dimer, water dimer, malonaldehyde, formic acid dimer, free pyridine-N-oxide/trichloroacetic acid complex, and protonated water dimer were analyzed. The corresponding harmonic potential-energy surfaces near the global minima were constructed for series of clusters and complexes with hydrogen bonds of different strengths based on the behavior of the calculated multidimensional potential-energy surfaces. This enabled the introduction of a straightforward anharmonicity parameter for the calculated potential-energy surfaces. The anharmonicity parameter was analyzed as a function of the size of the analyzed area near the energy minimum, the number of points over which energies were compared, and the dimensionality of the solved vibrational problem. Anharmonicity parameters for potential-energy surfaces in complexes with strong, medium, and weak H-bonds were calculated under identical conditions. The obtained anharmonicity parameters were compared with the corresponding diagonal anharmonicity constants for stretching vibrations of the bridging protons and the lengths of the hydrogen bridges.

  6. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
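The dose-difference/distance-to-agreement criterion recommended above (e.g., 3%/3 mm) is commonly evaluated with a gamma index. A minimal 1-D sketch of a global gamma analysis, assuming uniformly spaced dose profiles (the function name and simplifications are ours, not the paper's):

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dta_mm=3.0,
                    threshold=0.1):
    """1-D global gamma analysis: for each measured point, find the minimum
    combined dose-difference/distance metric against the reference profile,
    then report the percentage of points with gamma <= 1."""
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    norm = ref.max()
    x = np.arange(len(ref)) * spacing_mm
    gammas = []
    for i, d_m in enumerate(meas):
        if d_m < threshold * norm:               # skip the low-dose region
            continue
        dd = (d_m - ref) / (dose_tol * norm)     # dose-difference term
        dta = (x[i] - x) / dta_mm                # distance-to-agreement term
        gammas.append(np.sqrt(dd**2 + dta**2).min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)
```

Real implementations work on 2-D or 3-D dose grids with sub-voxel interpolation, but the pass/fail logic is the same.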

  7. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution non-destructive contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces. It is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. This utilized sensing technique does not require a physical or chemical visibility enhancement of the fingerprint residue, thus the original trace remains unaltered for further investigations. No particular feature extraction and verification techniques have been applied to such data, yet. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.

  8. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  9. Nanofluidic structures with complex three-dimensional surfaces

    International Nuclear Information System (INIS)

    Stavis, Samuel M; Gaitan, Michael; Strychalski, Elizabeth A

    2009-01-01

    Nanofluidic devices have typically explored a design space of patterns limited by a single nanoscale structure depth. A method is presented here for fabricating nanofluidic structures with complex three-dimensional (3D) surfaces, utilizing a single layer of grayscale photolithography and standard integrated circuit manufacturing tools. This method is applied to construct nanofluidic devices with numerous (30) structure depths controlled from ∼10 to ∼620 nm with an average standard deviation of ∼1 nm. A prototype 3D nanofluidic device is demonstrated that implements size exclusion of rigid nanoparticles and variable nanoscale confinement and deformation of biomolecules.

  10. Functionalized granular activated carbon and surface complexation with chromates and bi-chromates in wastewater

    International Nuclear Information System (INIS)

    Singha, Somdutta; Sarkar, Ujjaini; Luharuka, Pallavi

    2013-01-01

    Cr(VI) is present in the aqueous medium as chromate (CrO 4 2− ) and bi-chromate (HCrO 4 − ). Functionalized granular activated carbons (FACs) are used as adsorbents in the treatment of wastewaters containing hexavalent chromium. The FACs are prepared by chemical modification of granular activated carbons (GACs) using functionalizing agents such as HNO 3 , HCl and HF. The Brunauer, Emmett and Teller surface areas of FAC-HCl (693.5 m 2 /g), FAC-HNO 3 (648.8 m 2 /g) and FAC-HF (726.2 m 2 /g) are comparable to that of the GAC (777.7 m 2 /g). However, the adsorption capacity of each of FAC-HNO 3 , FAC-HCl and FAC-HF is found to be higher than that of the GAC. The functional groups play an important role in the adsorption process, and pH has practically no role in this specific case. The FACs have hydrophilic protonated external surfaces, along with functional surface sites capable of forming complexes with the CrO 4 2− and HCrO 4 − present. Surface complex formation is maximized in the order FAC-HNO 3 > FAC-HF > FAC-HCl, in proportion to the total surface acidity. This is also confirmed by the well-known pseudo-second-order kinetic model. Physisorption equilibrium isotherms are parameterized using the standard Freundlich and Langmuir models; the Langmuir model fits better. The formation of surface complexes between the functional groups and hexavalent chromium is also revealed in the images of field emission scanning electron micrograph, energy dispersive X-ray spectroscopy and Fourier transform infrared spectroscopy analysis after adsorption. Intra-particle diffusion is not the only rate-controlling factor: Boyd's film diffusion model fits very well, with R 2 as high as 98.1% for FAC-HNO 3 . This result demonstrates that functionalization of the GAC by acid treatments increases the diffusion rate, predominantly through a boundary layer diffusion effect. - Highlights: ► Physico-chemical adsorption using functionalized activated carbon (FACs) is applied. ► FACs
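As an illustration of how the Langmuir parameterization mentioned above is typically carried out, here is a sketch that fits the linearized Langmuir isotherm Ce/qe = Ce/qmax + 1/(KL·qmax) to equilibrium data. The numbers are synthetic, invented for illustration, and are not values from the study.

```python
import numpy as np

# Hypothetical equilibrium data: Ce (mg/L) and qe (mg/g), generated here
# from a known Langmuir curve so the fit can be checked.
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
qmax_true, KL_true = 50.0, 0.15
qe = qmax_true * KL_true * Ce / (1.0 + KL_true * Ce)

# Linearized Langmuir: Ce/qe = (1/qmax) * Ce + 1/(KL * qmax)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax = 1.0 / slope          # maximum adsorption capacity (mg/g)
KL = slope / intercept      # Langmuir affinity constant (L/mg)
```

With real scatter in the data, comparing the linearized-fit R² of the Langmuir and Freundlich forms is what supports a "Langmuir fits better" conclusion.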

  11. Verification and synthesis of optimal decision strategies for complex systems

    International Nuclear Information System (INIS)

    Summers, S. J.

    2013-01-01

    Complex systems make a habit of disagreeing with the mathematical models strategically designed to capture their behavior. A recursive process ensues where data is used to gain insight into the disagreement. A simple model may give way to a model with hybrid dynamics. A deterministic model may give way to a model with stochastic dynamics. In many cases, the modeling framework that sufficiently characterises the system is both hybrid and stochastic; these systems are referred to as stochastic hybrid systems. This dissertation considers the stochastic hybrid system framework for modeling complex systems and provides mathematical methods for analysing, and synthesizing decision laws for, such systems. We first propose a stochastic reach-avoid problem for discrete time stochastic hybrid systems. In particular, we present a dynamic programming based solution to a probabilistic reach-avoid problem for a controlled discrete time stochastic hybrid system. We address two distinct interpretations of the reach-avoid problem via stochastic optimal control. In the first case, a sum-multiplicative cost function is introduced along with a corresponding dynamic recursion that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an unsafe set at all preceding time steps. In the second case, we introduce a multiplicative cost function and a dynamic recursion that quantifies the probability of hitting a target set at the terminal time, while avoiding an unsafe set at all preceding time steps. In each case, optimal reach-avoid control policies are derived as the solution to an optimal control problem via dynamic programming. We next introduce an extension of the reach-avoid problem where we consider the verification of discrete time stochastic hybrid systems when there exists uncertainty in the reachability specifications themselves. A sum multiplicative cost function is introduced along with a corresponding dynamic recursion
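One standard sum-multiplicative formulation of the reach-avoid probability and its backward dynamic-programming recursion reads as follows. This is a hedged reconstruction in the notation of the general stochastic-reachability literature; the dissertation's exact symbols may differ.

```latex
% Probability of hitting target set K during the horizon while
% remaining in safe set A at all preceding steps:
r_{x_0}(K, A) = \mathbb{E}\left[ \sum_{j=0}^{N}
  \left( \prod_{i=0}^{j-1} \mathbf{1}_{A \setminus K}(x_i) \right)
  \mathbf{1}_{K}(x_j) \right]
% Backward recursion with stochastic transition kernel Q(dy \mid x, u):
V_N(x) = \mathbf{1}_{K}(x), \qquad
V_k(x) = \mathbf{1}_{K}(x)
       + \mathbf{1}_{A \setminus K}(x) \int_{A} V_{k+1}(y)\, Q(dy \mid x, u_k(x))
```

Maximizing V_0(x_0) over control policies yields the optimal reach-avoid policy referred to in the abstract; the terminal-time variant replaces the sum with a single product-weighted indicator at step N.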

  12. Verification and synthesis of optimal decision strategies for complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Summers, S. J.

    2013-07-01

    Complex systems make a habit of disagreeing with the mathematical models strategically designed to capture their behavior. A recursive process ensues where data is used to gain insight into the disagreement. A simple model may give way to a model with hybrid dynamics. A deterministic model may give way to a model with stochastic dynamics. In many cases, the modeling framework that sufficiently characterises the system is both hybrid and stochastic; these systems are referred to as stochastic hybrid systems. This dissertation considers the stochastic hybrid system framework for modeling complex systems and provides mathematical methods for analysing, and synthesizing decision laws for, such systems. We first propose a stochastic reach-avoid problem for discrete time stochastic hybrid systems. In particular, we present a dynamic programming based solution to a probabilistic reach-avoid problem for a controlled discrete time stochastic hybrid system. We address two distinct interpretations of the reach-avoid problem via stochastic optimal control. In the first case, a sum-multiplicative cost function is introduced along with a corresponding dynamic recursion that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an unsafe set at all preceding time steps. In the second case, we introduce a multiplicative cost function and a dynamic recursion that quantifies the probability of hitting a target set at the terminal time, while avoiding an unsafe set at all preceding time steps. In each case, optimal reach-avoid control policies are derived as the solution to an optimal control problem via dynamic programming. We next introduce an extension of the reach-avoid problem where we consider the verification of discrete time stochastic hybrid systems when there exists uncertainty in the reachability specifications themselves. A sum multiplicative cost function is introduced along with a corresponding dynamic recursion

  13. IAEA inspectors complete verification of nuclear material in Iraq

    International Nuclear Information System (INIS)

    2004-01-01

    Full text: At the request of the Government of Iraq and pursuant to the NPT Safeguards Agreement with Iraq, a team of IAEA safeguards inspectors has completed the annual Physical Inventory Verification of declared nuclear material in Iraq, and is returning to Vienna. The material - natural or low-enriched uranium - is not sensitive from a proliferation perspective and is consolidated at a storage facility near the Tuwaitha complex, south of Baghdad. This inspection was conducted with the logistical and security assistance of the Multinational Force and the Office of the UN Security Coordinator. Inspections such as this are required by safeguards agreements with every non-nuclear-weapon state party to the NPT that has declared holdings of nuclear material, to verify the correctness of the declaration, and that material has not been diverted to any undeclared activity. Such inspections have been performed in Iraq on a continuing basis. The most recent took place in June 2003, following reports of looting of nuclear material at the Tuwaitha complex; IAEA inspectors recovered, repackaged and resealed all but a minute amount of material. NPT safeguards inspections are limited in scope and coverage as compared to the verification activities carried out in 1991-98 and 2002-03 by the IAEA under Security Council resolution 687 and related resolutions. 'This week's mission was a good first step,' IAEA Director General Mohamed ElBaradei said. 'Now we hope to be in a position to complete the mandate entrusted to us by the Security Council, to enable the Council over time to remove all sanctions and restrictions imposed on Iraq - so that Iraq's rights as a full-fledged member of the international community can be restored.' The removal of remaining sanctions is dependent on completion of the verification process by the IAEA and the UN Monitoring, Verification and Inspection Commission (UNMOVIC). It should be noted that IAEA technical assistance to Iraq has been resumed over

  14. Surface-illuminant ambiguity and color constancy: effects of scene complexity and depth cues.

    Science.gov (United States)

    Kraft, James M; Maloney, Shannon I; Brainard, David H

    2002-01-01

    Two experiments were conducted to study how scene complexity and cues to depth affect human color constancy. Specifically, two levels of scene complexity were compared. The low-complexity scene contained two walls with the same surface reflectance and a test patch which provided no information about the illuminant. In addition to the surfaces visible in the low-complexity scene, the high-complexity scene contained two rectangular solid objects and 24 paper samples with diverse surface reflectances. Observers viewed illuminated objects in an experimental chamber and adjusted the test patch until it appeared achromatic. Achromatic settings made under two different illuminants were used to compute an index that quantified the degree of constancy. Two experiments were conducted: one in which observers viewed the stimuli directly, and one in which they viewed the scenes through an optical system that reduced cues to depth. In each experiment, constancy was assessed for two conditions. In the valid-cue condition, many cues provided valid information about the illuminant change. In the invalid-cue condition, some image cues provided invalid information. Four broad conclusions are drawn from the data: (a) constancy is generally better in the valid-cue condition than in the invalid-cue condition; (b) for the stimulus configuration used, increasing image complexity has little effect in the valid-cue condition but leads to increased constancy in the invalid-cue condition; (c) for the stimulus configuration used, reducing cues to depth has little effect for either constancy condition; and (d) there is moderate individual variation in the degree of constancy exhibited, particularly in the degree to which the complexity manipulation affects performance.

  15. Modeling and simulation for fewer-axis grinding of complex surface

    Science.gov (United States)

    Li, Zhengjian; Peng, Xiaoqiang; Song, Ci

    2017-10-01

    As the basis of fewer-axis grinding of complex surface, the grinding mathematical model is of great importance. A mathematical model of the grinding wheel was established, and then coordinate and normal vector of the wheel profile could be calculated. Through normal vector matching at the cutter contact point and the coordinate system transformation, the grinding mathematical model was established to work out the coordinate of the cutter location point. Based on the model, interference analysis was simulated to find out the right position and posture of workpiece for grinding. Then positioning errors of the workpiece including the translation positioning error and the rotation positioning error were analyzed respectively, and the main locating datum was obtained. According to the analysis results, the grinding tool path was planned and generated to grind the complex surface, and good form accuracy was obtained. The grinding mathematical model is simple, feasible and can be widely applied.
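The "normal vector matching at the cutter contact point" step can be sketched as computing the rotation that aligns the tool normal with the surface normal, then offsetting along that normal to obtain a cutter-location point. A minimal sketch under our own simplifying assumptions (Rodrigues' rotation formula and a simple spherical-offset cutter-location rule; the paper's actual wheel model is more detailed):

```python
import numpy as np

def rotation_between(n_tool, n_surf):
    """Rotation matrix taking unit vector n_tool onto n_surf
    (Rodrigues' formula) -- the normal-vector-matching step."""
    a = n_tool / np.linalg.norm(n_tool)
    b = n_surf / np.linalg.norm(n_surf)
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    if np.isclose(c, -1.0):
        # Antiparallel normals: rotate 180 deg about any axis orthogonal to a.
        axis = np.eye(3)[np.argmin(np.abs(a))]
        v = np.cross(a, axis)
        v /= np.linalg.norm(v)
        return 2.0 * np.outer(v, v) - np.eye(3)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def cutter_location(p_contact, n_surf, wheel_radius):
    """Hypothetical simplification: offset the cutter-contact point along
    the matched surface normal by the wheel radius."""
    n = n_surf / np.linalg.norm(n_surf)
    return p_contact + wheel_radius * n
```

In the full model, the offset direction and magnitude depend on the wheel profile at the contact point rather than a single radius, but the normal-matching rotation is the same.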

  16. Use of polyampholyte complexes of ethyl aminocrotonate/acrylic acid with surface-active materials for radionuclide extraction

    International Nuclear Information System (INIS)

    Kabdyrakova, A.M.; Artem'ev, O.I.; Protskij, A.V.; Bimendina, L.A.; Yashkarova, M.G.; Orazzhanova, L.K.

    2005-01-01

    A polyampholyte of betaine structure was synthesized on the basis of 3-aminocrotonate and acrylic acid. The composition of the polyampholyte and of its complexes with an anionic surface-active material (sodium lauryl sulfate) was determined. Complex formation was found to occur at a [polyampholyte]:[surface-active material] ratio of 1:1 and to be accompanied by a significant decrease in the characteristic viscosity of the system. The paper presents the results of an experimental investigation of applying the [polyampholyte]:[surface-active material] complex to directed radionuclide migration in soil. (author)

  17. Mixed DNA/Oligo(ethylene glycol) Functionalized Gold Surfaces Improve DNA Hybridization in Complex Media

    International Nuclear Information System (INIS)

    Lee, C.; Gamble, L.; Grainger, D.; Castner, D.

    2006-01-01

    Reliable, direct 'sample-to-answer' capture of nucleic acid targets from complex media would greatly improve existing capabilities of DNA microarrays and biosensors. This goal has proven elusive for many current nucleic acid detection technologies attempting to produce assay results directly from complex real-world samples, including food, tissue, and environmental materials. In this study, we have investigated mixed self-assembled thiolated single-strand DNA (ssDNA) monolayers containing a short thiolated oligo(ethylene glycol) (OEG) surface diluent on gold surfaces to improve the specific capture of DNA targets from complex media. Both surface composition and orientation of these mixed DNA monolayers were characterized with x-ray photoelectron spectroscopy (XPS) and near-edge x-ray absorption fine structure (NEXAFS). XPS results from sequentially adsorbed ssDNA/OEG monolayers on gold indicate that thiolated OEG diluent molecules first incorporate into the thiolated ssDNA monolayer and, upon longer OEG exposures, competitively displace adsorbed ssDNA molecules from the gold surface. NEXAFS polarization dependence results (followed by monitoring the N 1s→π* transition) indicate that adsorbed thiolated ssDNA nucleotide base-ring structures in the mixed ssDNA monolayers are oriented more parallel to the gold surface compared to DNA bases in pure ssDNA monolayers. This supports ssDNA oligomer reorientation towards a more upright position upon OEG mixed adlayer incorporation. DNA target hybridization on mixed ssDNA probe/OEG monolayers was monitored by surface plasmon resonance (SPR). Improvements in specific target capture for these ssDNA probe surfaces due to incorporation of the OEG diluent were demonstrated using two model biosensing assays, DNA target capture from complete bovine serum and from salmon genomic DNA mixtures. SPR results demonstrate that OEG incorporation into the ssDNA adlayer improves surface resistance to both nonspecific DNA and protein

  18. Comparison of 3D anatomical dose verification and 2D phantom dose verification of IMRT/VMAT treatments for nasopharyngeal carcinoma

    International Nuclear Information System (INIS)

    Lin, Hailei; Huang, Shaomin; Deng, Xiaowu; Zhu, Jinhan; Chen, Lixin

    2014-01-01

    The two-dimensional phantom dose verification (2D-PDV) using a hybrid plan and planar dose measurement has been widely used for IMRT treatment QA. Because it provides no information about the correlation between the verification results and the anatomical structure of the patient, it is inadequate for clinical evaluation. A three-dimensional anatomical dose verification (3D-ADV) method was used in this study to evaluate IMRT/VMAT treatment delivery for nasopharyngeal carcinoma and compared with 2D-PDV. Twenty nasopharyngeal carcinoma (NPC) patients treated with IMRT/VMAT were recruited in the study. A 2D ion-chamber array was used for the 2D-PDV in both single-gantry-angle composite (SGAC) and multi-gantry-angle composite (MGAC) verifications. Differences in the gamma pass rate between the two verification methods were assessed. Based on measurement of the irradiation dose fluence, the 3D dose distribution was reconstructed for 3D-ADV in the above cases. The reconstructed dose homogeneity index (HI) and conformity index (CI) of the planning target volume (PTV) were calculated. The gamma pass rate and deviations in the dose-volume histogram (DVH) of each PTV and organ at risk (OAR) were analyzed. In 2D-PDV, the gamma pass rate (3%, 3 mm) of SGAC (99.55% ± 0.83%) was significantly higher than that of MGAC (92.41% ± 7.19%). In 3D-ADV, the gamma pass rates (3%, 3 mm) were 99.75% ± 0.21% globally, 83.82% ± 16.98% to 93.71% ± 6.22% in the PTVs and 45.12% ± 32.78% to 98.08% ± 2.29% in the OARs. The maximum HI increment in PTVnx was 19.34%, while the maximum CI decrements in PTV1 and PTV2 were -32.45% and -6.93%, respectively. Deviations in the dose volumes of the PTVs were all within ±5%. D2% of the brainstem, spinal cord and left/right optic nerves, and the mean doses to the left/right parotid glands, maximally increased by 3.5%, 6.03%, 31.13%/26.90% and 4.78%/4.54%, respectively. The 2D-PDV and global gamma pass rate might be insufficient to provide an accurate assessment for
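The HI and CI figures quoted above can be computed from per-voxel dose arrays. This sketch uses one common pair of definitions (an ICRU-83-style HI = (D2% − D98%)/D50% and a Paddick-style CI); the paper may use different formulations, so treat these as illustrative assumptions.

```python
import numpy as np

def dvh_dose_at(doses, pct):
    """D_pct from a cumulative DVH: the dose received by at least
    `pct` percent of the structure's voxels."""
    return float(np.percentile(doses, 100.0 - pct))

def homogeneity_index(ptv_doses):
    # ICRU-83-style: HI = (D2% - D98%) / D50%; 0 means perfectly uniform.
    return (dvh_dose_at(ptv_doses, 2) - dvh_dose_at(ptv_doses, 98)) \
           / dvh_dose_at(ptv_doses, 50)

def conformity_index(ptv_doses, body_doses, d_presc):
    # Paddick-style CI = (TV_PI)^2 / (TV * V_PI), where TV_PI is the target
    # volume covered by the prescription isodose, TV the target volume,
    # and V_PI the total volume covered by the prescription isodose.
    tv = ptv_doses.size
    tv_pi = int((ptv_doses >= d_presc).sum())
    v_pi = int((body_doses >= d_presc).sum())
    return tv_pi**2 / (tv * v_pi)
```

Comparing these indices computed from the reconstructed 3D dose against the planned values is what yields the HI/CI increments and decrements reported above.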

  19. Z-2 Architecture Description and Requirements Verification Results

    Science.gov (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard

    2016-01-01

    The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag

  20. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  1. Technology for bolus verification in proton therapy

    Science.gov (United States)

    Shipulin, K. N.; Mytsin, G. V.; Agapov, A. V.

    2015-01-01

    To ensure a conformal depth-dose distribution of the proton beam within the target volume, complex-shaped range shifters (so-called boluses), which account for the heterogeneous structure of patient tissues and organs in the beam path, were calculated and manufactured. The precise manufacturing of proton compensators used for patient treatment is a vital step in quality assurance in proton therapy. In this work a software-hardware complex that verifies the quality and precision of bolus manufacturing at the Medico-Technical Complex (MTC) was developed. It consists of a positioning system with two photoelectric sensors. We evaluated 20 boluses used in the proton therapy of five patients. A total of 2562 experimental points were measured, of which only two had values that differed from the calculated value by more than 0.5 mm; all other points deviated by within ±0.5 mm of the calculated value. The technology for bolus verification developed in this work can be used for high-precision testing of the geometrical parameters of proton compensators in radiotherapy.
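The ±0.5 mm acceptance check described above amounts to counting deviation outliers between measured and calculated surface points; a trivial sketch (the function name and data are ours):

```python
def bolus_outliers(measured_mm, calculated_mm, tol_mm=0.5):
    """Count measured bolus surface points whose deviation from the
    calculated profile exceeds the +/- tol_mm acceptance band."""
    deviations = [m - c for m, c in zip(measured_mm, calculated_mm)]
    outliers = [d for d in deviations if abs(d) > tol_mm]
    return len(outliers), len(deviations)
```

Applied to the full measurement set reported above, such a check would flag 2 outliers among 2562 points.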

  2. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities for the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are considered in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; for example, criteria connected with selecting the best place to perform the verification measurements have not yet been established. Options for the verification location are considered in this report. One option for a verification measurement location is the intermediate storage; the other is the encapsulation plant. Crucial considerations include which one offers the best practical possibilities to perform the measurements effectively and which is preferable from a safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  3. A hybrid 3D SEM reconstruction method optimized for complex geologic material surfaces.

    Science.gov (United States)

    Yan, Shang; Adegbule, Aderonke; Kibbey, Tohren C G

    2017-08-01

    Reconstruction methods are widely used to extract three-dimensional information from scanning electron microscope (SEM) images. This paper presents a new hybrid reconstruction method that combines stereoscopic reconstruction with shape-from-shading calculations to generate highly-detailed elevation maps from SEM image pairs. The method makes use of an imaged glass sphere to determine the quantitative relationship between observed intensity and angles between the beam and surface normal, and the detector and surface normal. Two specific equations are derived to make use of image intensity information in creating the final elevation map. The equations are used together, one making use of intensities in the two images, the other making use of intensities within a single image. The method is specifically designed for SEM images captured with a single secondary electron detector, and is optimized to capture maximum detail from complex natural surfaces. The method is illustrated with a complex structured abrasive material, and a rough natural sand grain. Results show that the method is capable of capturing details such as angular surface features, varying surface roughness, and surface striations. Copyright © 2017 Elsevier Ltd. All rights reserved.
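The sphere-calibration idea can be sketched as follows: for a sphere of known radius the surface normal at each pixel is known analytically, so observed intensity can be tabulated against the beam/normal angle. The code below uses a synthetic image and an assumed roughly 1/cos secondary-electron yield law purely for illustration; the paper's actual calibration uses real SEM sphere images:

```python
import math
from collections import defaultdict

# Hedged sketch: build an intensity-vs-angle lookup from a sphere of known
# radius, whose surface normals are known analytically. Synthetic data only.

def sphere_normal_angle(x, y, radius):
    """Angle (degrees) between an axial beam and the sphere normal at (x, y)."""
    r2 = (x * x + y * y) / (radius * radius)
    if r2 >= 1.0:
        return None  # pixel lies outside the sphere silhouette
    nz = math.sqrt(1.0 - r2)  # z-component of the unit surface normal
    return math.degrees(math.acos(nz))

def calibrate(image, radius, bin_width=10):
    """Map angle bins (degrees) to mean observed intensity."""
    bins = defaultdict(list)
    for (x, y), intensity in image.items():
        angle = sphere_normal_angle(x, y, radius)
        if angle is not None:
            bins[int(angle // bin_width) * bin_width].append(intensity)
    return {b: sum(v) / len(v) for b, v in sorted(bins.items())}

# Synthetic sphere "image": intensity rises roughly as 1/cos(angle),
# a common first-order assumption for secondary-electron emission.
R = 50.0
image = {}
for x in range(-50, 51, 5):
    for y in range(-50, 51, 5):
        a = sphere_normal_angle(x, y, R)
        if a is not None:
            image[(x, y)] = 100.0 / math.cos(math.radians(a))

lut = calibrate(image, R)
print(lut[0], lut[40])  # mean intensity should rise with angle
```

In the actual method this lookup, built once from the imaged sphere, supplies the quantitative intensity-angle relationship used by the shape-from-shading equations.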

  4. [DNA complexes, formed on aqueous phase surfaces: new planar polymeric and composite nanostructures].

    Science.gov (United States)

    Antipina, M N; Gaĭnutdinov, R V; Rakhnianskaia, A A; Sergeev-Cherenkov, A N; Tolstikhina, A L; Iurova, T V; Kislov, V V; Khomutov, G B

    2003-01-01

    The formation of DNA complexes with Langmuir monolayers of the cationic lipid octadecylamine (ODA) and the new amphiphilic polycation poly-4-vinylpyridine with 16% of cetylpyridinium groups (PVP-16) on the surface of an aqueous solution of native DNA of low ionic strength was studied. Topographic images of Langmuir-Blodgett films of DNA/ODA and DNA/PVP-16 complexes applied to micaceous substrates were investigated by the method of atomic force microscopy. It was found that films of the amphiphilic polycation have an ordered planar polycrystalline structure. The morphology of planar DNA complexes with the amphiphilic cation substantially depended on the incubation time and the phase state of the monolayer on the surface of the aqueous DNA solution. Complex structures and individual DNA molecules were observed on the surface of the amphiphilic monolayer. Along with quasi-linear individual bound DNA molecules, characteristic extended net-like structures and quasi-circular toroidal condensed conformations of planar DNA complexes were detected. Mono- and multilayer films of DNA/PVP-16 complexes were used as templates and nanoreactors for the synthesis of inorganic nanostructures via the binding of metal cations from the solution and subsequent generation of the inorganic phase. As a result, ultrathin polymeric composite films with integrated DNA building blocks and quasi-linear arrays of inorganic semiconductor (CdS) and iron oxide nanoparticles and nanowires were obtained. The nanostructures obtained were characterized by scanning probe microscopy and transmission electron microscopy techniques. The methods developed are promising for investigating the mechanisms of structural organization and transformation in DNA and polyelectrolyte complexes at the gas-liquid interface and for the design of new extremely thin highly ordered planar polymeric and composite materials, films, and coatings with controlled ultrastructure for applications in nanoelectronics and

  5. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  6. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  7. Functionalized granular activated carbon and surface complexation with chromates and bi-chromates in wastewater

    Energy Technology Data Exchange (ETDEWEB)

    Singha, Somdutta; Sarkar, Ujjaini, E-mail: usarkar@chemical.jdvu.ac.in; Luharuka, Pallavi

    2013-03-01

    Cr(VI) is present in the aqueous medium as chromate (CrO{sub 4}{sup 2−}) and bi-chromate (HCrO{sub 4}{sup −}). Functionalized granular activated carbons (FACs) are used as adsorbents in the treatment of wastewaters containing hexavalent chromium. The FACs are prepared by chemical modification of granular activated carbons (GACs) using functionalizing agents such as HNO{sub 3}, HCl and HF. The Brunauer, Emmett and Teller surface areas of FAC-HCl (693.5 m{sup 2}/g), FAC-HNO{sub 3} (648.8 m{sup 2}/g) and FAC-HF (726.2 m{sup 2}/g) are comparable to that of the GAC (777.7 m{sup 2}/g), but the adsorption capacity of each of FAC-HNO{sub 3}, FAC-HCl and FAC-HF is found to be higher than that of the GAC. The functional groups play an important role in the adsorption process, and pH has practically no role in this specific case. The FACs have hydrophilic protonated external surfaces, along with functional surface sites capable of forming complexes with the CrO{sub 4}{sup 2−} and HCrO{sub 4}{sup −} present. Surface complex formation is maximized in the order FAC-HNO{sub 3} > FAC-HF > FAC-HCl, in proportion to the total surface acidity. This is also confirmed by the well-known pseudo-second-order kinetic model. Physisorption equilibrium isotherms are parameterized using the standard Freundlich and Langmuir models; the Langmuir model fits better. The formation of surface complexes between the functional groups and hexavalent chromium is also revealed after adsorption in field emission scanning electron micrographs and in energy-dispersive X-ray spectroscopy and Fourier transform infrared spectroscopy analyses. Intra-particle diffusion is not the only rate-controlling factor. Boyd's film diffusion model fits very well, with R{sup 2} as high as 98.1% for FAC-HNO{sub 3}. This result demonstrates that functionalization of the GAC by acid treatment increases the diffusion rate, predominantly through a boundary layer diffusion effect. - Highlights: ► Physico
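The pseudo-second-order kinetic model mentioned above is commonly fitted through its linearised form t/q_t = 1/(k·q_e²) + t/q_e. A minimal sketch with synthetic data (the parameter values are illustrative, not from the paper):

```python
# Hedged sketch: fit the pseudo-second-order kinetic model via its common
# linearised form t/qt = 1/(k*qe^2) + t/qe. Data below are synthetic.

def pseudo_second_order_fit(t, qt):
    """Least-squares fit of t/qt vs t; returns (q_e, k)."""
    y = [ti / qi for ti, qi in zip(t, qt)]
    n = len(t)
    mean_t = sum(t) / n
    mean_y = sum(y) / n
    slope = sum((ti - mean_t) * (yi - mean_y) for ti, yi in zip(t, y)) / \
            sum((ti - mean_t) ** 2 for ti in t)
    intercept = mean_y - slope * mean_t
    q_e = 1.0 / slope            # equilibrium uptake (e.g. mg/g)
    k = slope ** 2 / intercept   # rate constant (e.g. g/mg/min)
    return q_e, k

# Synthetic uptake curve generated with q_e = 20, k = 0.01 (hypothetical)
q_e_true, k_true = 20.0, 0.01
t = [5, 10, 20, 40, 80, 160]
qt = [k_true * q_e_true**2 * ti / (1 + k_true * q_e_true * ti) for ti in t]

q_e, k = pseudo_second_order_fit(t, qt)
print(round(q_e, 3), round(k, 4))  # → 20.0 0.01
```

With real kinetic data the recovered parameters would not be exact, and the quality of the linear fit (R²) is what supports or rejects the model, as in the study above.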

  8. Using advanced surface complexation models for modelling soil chemistry under forests: Solling forest, Germany

    Energy Technology Data Exchange (ETDEWEB)

    Bonten, Luc T.C., E-mail: luc.bonten@wur.nl [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Groenenberg, Jan E. [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Meesenburg, Henning [Northwest German Forest Research Station, Abt. Umweltkontrolle, Sachgebiet Intensives Umweltmonitoring, Goettingen (Germany); Vries, Wim de [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands)

    2011-10-15

    Various dynamic soil chemistry models have been developed to gain insight into impacts of atmospheric deposition of sulphur, nitrogen and other elements on soil and soil solution chemistry. Sorption parameters for anions and cations are generally calibrated for each site, which hampers extrapolation in space and time. On the other hand, recently developed surface complexation models (SCMs) have been successful in predicting ion sorption for static systems using generic parameter sets. This study reports the inclusion of an assemblage of these SCMs in the dynamic soil chemistry model SMARTml and applies this model to a spruce forest site in Solling Germany. Parameters for SCMs were taken from generic datasets and not calibrated. Nevertheless, modelling results for major elements matched observations well. Further, trace metals were included in the model, also using the existing framework of SCMs. The model predicted sorption for most trace elements well. - Highlights: > Surface complexation models can be well applied in field studies. > Soil chemistry under a forest site is adequately modelled using generic parameters. > The model is easily extended with extra elements within the existing framework. > Surface complexation models can show the linkages between major soil chemistry and trace element behaviour. - Surface complexation models with generic parameters make calibration of sorption superfluous in dynamic modelling of deposition impacts on soil chemistry under nature areas.

  9. Using advanced surface complexation models for modelling soil chemistry under forests: Solling forest, Germany

    International Nuclear Information System (INIS)

    Bonten, Luc T.C.; Groenenberg, Jan E.; Meesenburg, Henning; Vries, Wim de

    2011-01-01

    Various dynamic soil chemistry models have been developed to gain insight into impacts of atmospheric deposition of sulphur, nitrogen and other elements on soil and soil solution chemistry. Sorption parameters for anions and cations are generally calibrated for each site, which hampers extrapolation in space and time. On the other hand, recently developed surface complexation models (SCMs) have been successful in predicting ion sorption for static systems using generic parameter sets. This study reports the inclusion of an assemblage of these SCMs in the dynamic soil chemistry model SMARTml and applies this model to a spruce forest site in Solling Germany. Parameters for SCMs were taken from generic datasets and not calibrated. Nevertheless, modelling results for major elements matched observations well. Further, trace metals were included in the model, also using the existing framework of SCMs. The model predicted sorption for most trace elements well. - Highlights: → Surface complexation models can be well applied in field studies. → Soil chemistry under a forest site is adequately modelled using generic parameters. → The model is easily extended with extra elements within the existing framework. → Surface complexation models can show the linkages between major soil chemistry and trace element behaviour. - Surface complexation models with generic parameters make calibration of sorption superfluous in dynamic modelling of deposition impacts on soil chemistry under nature areas.

  10. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
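The statistical-sampling component described above can be illustrated with the standard hypergeometric detection-probability calculation: the chance that a random sample of n items drawn from a population of N containing d diverted items includes at least one of them. The numbers below are hypothetical:

```python
from math import comb

# Hedged sketch of the sampling side of inventory verification: probability
# that a random sample of n items from a population of N, of which d are
# diverted/defective, contains at least one such item (hypergeometric).

def detection_probability(N, d, n):
    """P(sample of n items contains >= 1 of the d defective items)."""
    if n > N - d:  # sample is larger than the non-defective population
        return 1.0
    return 1.0 - comb(N - d, n) / comb(N, n)

# Hypothetical example: 200 fuel elements, 10 diverted, inspector samples 40
p = detection_probability(200, 10, 40)
print(round(p, 3))
```

In practice such a sampling plan is combined with NDA measurements of the sampled items and with integral reactor-parameter measurements, as the record above describes, rather than relied on alone.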

  11. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  12. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    Science.gov (United States)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
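The onboard signal-finding idea (histogram event times, flag bins that are statistically significant against a Poisson background) can be sketched as follows. Bin counts, rates, and the sigma threshold below are illustrative choices, not the flight algorithm's actual parameters:

```python
import random
from collections import Counter

# Hedged sketch: histogram photon event times into range bins and flag bins
# whose counts exceed a Poisson-based threshold. All parameters illustrative.

random.seed(42)

NBINS = 1000
# Background noise: ~2 counts/bin, uniformly distributed over the search window
noise = [random.randrange(NBINS) for _ in range(2000)]
# Surface echoes: 200 correlated events concentrated in bins 500-501
signal = [500 + random.choice([0, 1]) for _ in range(200)]

events = noise + signal
counts = Counter(events)

mean_rate = len(events) / NBINS               # expected counts per bin
threshold = mean_rate + 6 * mean_rate ** 0.5  # ~6-sigma Poisson threshold

signal_bins = sorted(b for b, c in counts.items() if c > threshold)
print(signal_bins)  # expected to contain bins 500 and 501
```

Because the echoes are correlated in time while the noise is not, the signal bins stand far above the threshold even at modest signal rates, which is what makes the histogram approach viable onboard.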

  13. Verification Test Report for CFAST 3.1.6

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2002-01-01

    Fire is a significant hazard in most facilities that handle radioactive materials. The severity of fire varies with room arrangement, combustible loading, ventilation and protective system response. The complexity of even simple situations can be unwieldy to solve by hand calculations. Thus, computer simulation of the fire severity has become an important tool in characterizing fire risk. The Savannah River Site (SRS), a Department of Energy facility, has been using the Consolidated Model of Fire Growth and Smoke Transport (CFAST) software to complete such deterministic evaluations to better characterize the nuclear facility fire severity. To fully utilize CFAST at SRS it is necessary to demonstrate that CFAST produces valid analytic solutions over its range of use. This report describes the primary verification exercise that is required to establish that CFAST, and its user interface program FAST, produce valid analytic solutions. This verification exercise may be used to check the functionality of FAST and as a training tool to familiarize users with the software. In addition, the report consolidates the lessons learned by the SRS staff in using FAST and CFAST as fire modeling tools

  14. Data storage accounting and verification at LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Huang, C. H. [Fermilab; Lanciotti, E. [CERN; Magini, N. [CERN; Ratnikova, N. [Moscow, ITEP; Sanchez-Hernandez, A. [CINVESTAV, IPN; Serfon, C. [Munich U.; Wildish, T. [Princeton U.; Zhang, X. [Beijing, Inst. High Energy Phys.

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
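The catalog-versus-storage-dump consistency check described above reduces to a three-way diff of file maps: entries missing from storage, "dark data" absent from the catalog, and size/checksum mismatches. A minimal sketch with made-up entries:

```python
# Hedged sketch of a catalog-vs-storage-dump consistency check: diff two
# {filename: (size, checksum)} maps. File names and checksums are made up.

def check_consistency(catalog, storage_dump):
    missing = sorted(set(catalog) - set(storage_dump))   # in catalog, not on disk
    orphans = sorted(set(storage_dump) - set(catalog))   # dark data on disk
    corrupt = sorted(f for f in set(catalog) & set(storage_dump)
                     if catalog[f] != storage_dump[f])   # size/checksum mismatch
    return missing, orphans, corrupt

catalog = {"/store/a.root": (1024, "ad01"),
           "/store/b.root": (2048, "be02"),
           "/store/c.root": (4096, "ce03")}
dump    = {"/store/a.root": (1024, "ad01"),
           "/store/b.root": (2048, "ffff"),
           "/store/d.root": (512,  "dd04")}

missing, orphans, corrupt = check_consistency(catalog, dump)
print(missing, orphans, corrupt)
# → ['/store/c.root'] ['/store/d.root'] ['/store/b.root']
```

At LHC scale the same logic runs over site-provided dump files of millions of entries, which is why common tooling and regular dump intervals matter.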

  15. Profiled Deck Composite Slab Strength Verification: A Review

    Directory of Open Access Journals (Sweden)

    K. Mohammed

    2017-12-01

    Full Text Available The purpose of this article is to present an overview of alternative approaches to profiled deck composite slab (PDCS) strength verification that avoid the expensive and complex laboratory procedures needed to establish its longitudinal shear capacity. Despite several deterministic research findings leading to proposals and modifications concerning the complex shear characteristics that define PDCS strength behaviour, laboratory performance testing remains the only accurate means of PDCS strength assessment. The issue is critical and warrants further thought from perspectives other than the deterministic approaches, which are expensive and time-consuming. Hence, the development of a rational numerical test load function based on longitudinal shear capacity is needed to augment previous unsuccessful attempts to determine PDCS strength without costly laboratory procedures.

  16. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Brantley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent® and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also go to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks go to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
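The mesh refinement study's convergence claim can be checked with the standard observed-order-of-accuracy calculation on three uniformly refined grids. The temperature values below are synthetic numbers from an assumed second-order scheme, not Aria results:

```python
import math

# Hedged sketch: estimate the observed order of convergence from solutions on
# three uniformly refined meshes (refinement ratio r), as in standard
# grid-convergence verification. All values below are synthetic.

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed convergence order p from three grid solutions."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Synthetic temperatures from a 2nd-order scheme: f(h) = f_exact + C*h^2
f_exact, C = 350.0, 4.0
f1, f2, f3 = (f_exact + C * h * h for h in (0.4, 0.2, 0.1))  # coarse→fine

p = observed_order(f1, f2, f3)
print(round(p, 2))  # → 2.0
```

An observed order close to the scheme's formal order is the usual acceptance evidence in a refinement study of this kind.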

  17. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  18. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  19. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  20. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, some powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification analysis techniques, it is necessary to develop plant models that describe the plant behaviour of those systems. However, developing a plant model implies that the designer must make decisions concerning the granularity and level of abstraction of the models, the modelling approach to adopt (global or modular), and the strategies to define for the simulation and formal verification tasks. This paper highlights some aspects that can help inform those decisions. For this purpose, a case study is presented, and several important aspects of the issues outlined above are illustrated and discussed.
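A plant model of the kind discussed above, at one possible granularity, can be sketched as a finite state machine. The states and events here are illustrative design choices for a hypothetical pneumatic cylinder, not a model from the paper:

```python
# Hedged sketch: a discrete plant model (finite state machine) of the kind fed
# to simulators or model checkers. States/events are illustrative choices for
# a hypothetical pneumatic cylinder; granularity is a design decision.

PLANT = {  # state -> {event: next_state}
    "retracted":  {"extend_cmd": "moving_out"},
    "moving_out": {"reach_end": "extended"},
    "extended":   {"retract_cmd": "moving_in"},
    "moving_in":  {"reach_home": "retracted"},
}

def run(trace, start="retracted"):
    """Replay an event trace on the plant model; reject impossible behaviour."""
    state = start
    for event in trace:
        if event not in PLANT[state]:
            raise ValueError(f"event {event!r} impossible in state {state!r}")
        state = PLANT[state][event]
    return state

print(run(["extend_cmd", "reach_end", "retract_cmd", "reach_home"]))  # → retracted
```

The same transition table could be expressed in a model checker's input language (e.g. NuSMV, as used in the head records) so that temporal-logic properties of the controller-plus-plant composition can be verified; choosing a coarser or finer state set is exactly the abstraction decision the paper discusses.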

  1. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for digital modelling phase. • Importance of the product life-cycle management in the verification and validation framework. -- Abstract: This paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is written based on the results of a project “verification and validation (V and V) of ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design within a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on this survey of best industrial practices, the paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  2. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of the experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations with testing procedures at Slovak and other European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between the safety evaluation and the measured values. (Authors)

  3. Colour interceptions, thermal stability and surface morphology of polyester metal complexes

    International Nuclear Information System (INIS)

    Zohdy, M.H.

    2005-01-01

    Chelating copolymers were prepared by grafting acrylic acid (AAc) and an acrylamide/acrylic acid (AAm/AAc) comonomer mixture onto polyester microfiber fabrics (PETMF) using a gamma-radiation technique. The grafted chains (PETMF-g-AAc and PETMF-g-PAAc/PAAm) acted as chelating sites for selected transition metal ions. The graft copolymers and their metal complexes were characterized by thermogravimetric analysis (TGA), colour parameters, and surface morphology measurements. The colour interception and strength measurements showed that the metal complexation is homogeneously distributed. The thermal stability of PETMF improved after graft copolymerization and metal complexation; moreover, increasing the degree of grafting enhanced the thermal stability of the grafted and complexed copolymers by up to 25%, while the activation energy of the copolymer grafted with acrylic acid increased by up to 80%. SEM observations give further support to the homogeneous distribution of grafting and metal complexation

  4. The interaction between surface color and color knowledge: Behavioral and electrophysiological evidence

    OpenAIRE

    Bramão, I.; Faísca, L.; Forkstam, C.; Inácio, F.; Araújo, S.; Petersson, K.; Reis, A.

    2012-01-01

    In this study, we used event-related potentials (ERPs) to evaluate the contribution of surface color and color knowledge information in object identification. We constructed two color-object verification tasks – a surface and a knowledge verification task – using high color diagnostic objects; both typical and atypical color versions of the same object were presented. Continuous electroencephalogram was recorded from 26 subjects. A cluster randomization procedure was used to explore the diffe...

  5. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  6. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties. We compare the execution time of the compositional approach to that of classical verification, showing a huge difference...

  7. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  9. High resolution simulations of orographic flow over a complex terrain on the Southeast coast of Brazil

    Science.gov (United States)

    Chou, S. C.; Zolino, M. M.; Gomes, J. L.; Bustamante, J. F.; Lima-e-Silva, P. P.

    2012-04-01

    The Eta Model has been used operationally by CPTEC to produce weather forecasts over South America since 1997, and has gone through several upgrades. To prepare the model for operational higher-resolution forecasts, it was configured and tested over a region of complex topography near the coast of Southeast Brazil, with 2-km horizontal resolution and 50 layers. The Eta-2km is a second nesting: it is driven by the Eta-15km, which in turn is driven by ERA-Interim reanalyses. The model domain includes two Brazilian cities, Rio de Janeiro and Sao Paulo, as well as urban areas, preserved tropical forest, pasture fields, and complex terrain and coastline. Mountains rise to about 700 m, and the region suffers frequent floods and landslides. The objective of this work is to evaluate high-resolution simulations of wind and temperature in this complex area. Verification of the model runs uses observations taken at the nuclear power plant. Accurate near-surface wind direction and magnitude are needed for the plant's emergency plan, and winds are highly sensitive to model spatial resolution and atmospheric stability. Verification of two summer cases shows that the model has a clear diurnal-cycle signal for wind in the region. The area is characterized by weak winds, which makes the simulation more difficult. The simulated wind magnitude is about 1.5 m/s, close to the observed value of about 2 m/s; however, the observed change of wind direction at the sea-breeze onset is fast, whereas it is slow in the simulations. The nighttime katabatic flow is captured by the simulations. Comparison against Eta-5km runs shows that the valley circulation is better described at 2-km resolution. Simulated temperatures follow the observed diurnal cycle closely. Experiments improving some surface conditions, such as surface temperature and land cover, reduce simulation error and improve the diurnal cycle.
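Point verification of near-surface winds, as described in this record, typically reduces to paired model/observation statistics such as bias and root-mean-square error. A minimal sketch (the hourly values below are invented for illustration, not the station data from the study):

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias and root-mean-square error of paired model/observation
    series (e.g. near-surface wind speed at a station)."""
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical hourly wind speeds (m/s): simulated vs. tower observations.
sim = [1.4, 1.6, 1.5, 1.3, 1.7, 1.5]
obs = [2.0, 2.1, 1.9, 1.8, 2.2, 2.0]
bias, rmse = bias_and_rmse(sim, obs)
print(round(bias, 2), round(rmse, 2))   # negative bias: model underestimates
```

For wind direction, the same comparison would need circular differences (wrapping errors into ±180°), which this sketch omits.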

  10. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

    Verification and validation (V&V) is an important approach to software quality assurance for codes in complex engineering applications, and a well-designed V&V strategy can achieve twice the result with half the effort. This article introduces LAD2D, a self-developed Lagrangian adaptive hydrodynamics code in 2D space for detonation CFD with elastic-plastic structures. A V&V strategy for this detonation CFD code is presented, built on established V&V methodology for scientific software. A basic framework of module verification and function validation is proposed, which together compose the V&V strategy for the LAD2D detonation fluid dynamics model.

  11. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  12. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Science.gov (United States)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the
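Active figure control of the kind described in this record is, at its core, a least-squares problem: given actuator influence functions and a measured surface, solve for commands that minimize the residual figure error. A generic sketch, not the AMP-ReVS implementation (the influence matrix, commands, and noise level are random stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)

n_points, n_actuators = 50, 5
# Hypothetical influence matrix: surface response at each measurement
# point per unit actuator command.
A = rng.normal(size=(n_points, n_actuators))
true_cmd = np.array([0.3, -0.1, 0.7, 0.0, -0.4])
measured = A @ true_cmd + rng.normal(scale=1e-3, size=n_points)

# Least-squares estimate of the commands that best flatten the surface.
cmd, *_ = np.linalg.lstsq(A, measured, rcond=None)
residual = measured - A @ cmd
rms_before = np.sqrt(np.mean(measured**2))
rms_after = np.sqrt(np.mean(residual**2))
print(rms_after < rms_before)   # correction reduces the RMS figure error
```

Convex optimization, Kalman filtering, or quadratic programming, as mentioned in the abstract, generalize this step with constraints and measurement noise models.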

  13. Study of solid/liquid and solid/gas interfaces in Cu–isoleucine complex by surface X-ray diffraction

    International Nuclear Information System (INIS)

    Ferrer, Pilar; Rubio-Zuazo, Juan; Castro, German R.

    2013-01-01

    Enzymes can be understood as structures formed by amino acids bonded to metals, which act as active sites. Research on the coordination of metal–amino acid complexes sheds light on the behavior of metalloenzymes, given the close relation between atomic structure and functionality. The Cu–isoleucine bond is considered a good model system for gaining insight into the characteristics of naturally occurring copper metalloproteins. The surface structure of a metal–amino acid complex can be considered a more realistic model of real systems under biological working conditions, since the molecular packing is decreased: at the surface, structural constraints are reduced, and the surface complex retains the capability to change its structure as a function of the surrounding environment. In this work, we present a surface X-ray diffraction study of a Cu–isoleucine complex under different ambient conditions. Cu(Ile)2 crystals of about 5 mm × 5 mm × 1 mm, presenting a surface of high quality, were grown by the seeding method in a supersaturated solution. The sample for the surface diffraction study was mounted in a cell specially designed for solid/liquid or solid/gas interface analysis. The Cu–isoleucine crystal was measured under a protective dry N2 gas flow and in contact with a saturated metal–amino acid solution. The bulk and surface signals were compared, showing different atomic structures. In both cases, the surface diffraction data show that the atomic structure of the top layer undergoes a clear structural deformation: a non-uniform surface relaxation produces an inhomogeneous displacement of the surface atoms along the surface normal.

  14. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  15. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  16. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system, but subsequent versions were written for Microsoft Windows. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan, which consisted of identifying: the reason(s) why a posteriori verification is to be performed; the scope and objectives for the level of verification selected; the development products to be used for the review; the availability and use of user experience; and the actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  17. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 networked computers. At such a system size, software and hardware failures are frequent, so to minimize system downtime the Trigger-DAQ control system must include advanced verification and diagnostics facilities. The operator should be able to use tests and the expertise of the TDAQ and detector developers to diagnose errors and, where possible, recover from them automatically. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language that allows easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system, at different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  18. Early bone anchorage to micro- and nano-topographically complex implant surfaces in hyperglycemia.

    Science.gov (United States)

    Ajami, Elnaz; Bell, Spencer; Liddell, Robert S; Davies, John E

    2016-07-15

    The aim of this work was to investigate the effect of implant surface design on early bone anchorage in the presence of hyperglycemia. 108 Wistar rats were separated into euglycemic (EG) controls and STZ-treated hyperglycemic (HG) groups, and received bilateral femoral custom rectangular implants of two surface topographies: grit blasted (GB) and grit-blasted with a superimposed calcium phosphate nanotopography (GB-DCD). The peri-implant bone was subjected to a tensile disruption test 5, 7, and 9 days post-operatively (n = 28/time point); the force was measured, and the residual peri-implant bone was observed by scanning electron microscopy (SEM). Disruption forces at 5 days were not significantly different from zero for the GB implants (p=0.24) in either metabolic group, but were for GB+DCD implants in both metabolic groups (pmicro-surfaced implants showed significantly different disruption forces at all time points (e.g. >15N and implants, as all values were very low (implant bone showed compromised intra-fibrillar collagen mineralization in hyperglycemia, while inter-fibrillar and cement line mineralization remained unaffected. Enhanced bone anchorage to the implant surface was observed on the nanotopographically complex surface, independent of metabolic group. The compromised intra-fibrillar mineralization observed provides a mechanism by which early bone mineralization is affected in hyperglycemia. It is generally accepted that the hyperglycemia associated with diabetes mellitus compromises bone quality, although the mechanism by which this occurs is unknown. Uncontrolled hyperglycemia is therefore a contra-indication for bone implant placement. It is also known that nano-topographically complex implant surfaces accelerate early peri-implant healing. In this report we show that, in our experimental model, nano-topographically complex surfaces can mitigate the compromised bone healing seen in hyperglycemia. Importantly, we also provide a mechanistic explanation for

  19. Enhancing the magnetic anisotropy of maghemite nanoparticles via the surface coordination of molecular complexes

    Science.gov (United States)

    Prado, Yoann; Daffé, Niéli; Michel, Aude; Georgelin, Thomas; Yaacoub, Nader; Grenèche, Jean-Marc; Choueikani, Fadi; Otero, Edwige; Ohresser, Philippe; Arrio, Marie-Anne; Cartier-dit-Moulin, Christophe; Sainctavit, Philippe; Fleury, Benoit; Dupuis, Vincent; Lisnard, Laurent; Fresnais, Jérôme

    2015-01-01

    Superparamagnetic nanoparticles are promising objects for data storage or medical applications. In the smallest—and more attractive—systems, the properties are governed by the magnetic anisotropy. Here we report a molecule-based synthetic strategy to enhance this anisotropy in sub-10-nm nanoparticles. It consists of the fabrication of composite materials where anisotropic molecular complexes are coordinated to the surface of the nanoparticles. Reacting 5 nm γ-Fe2O3 nanoparticles with the [CoII(TPMA)Cl2] complex (TPMA: tris(2-pyridylmethyl)amine) leads to the desired composite materials and the characterization of the functionalized nanoparticles evidences the successful coordination—without nanoparticle aggregation and without complex dissociation—of the molecular complexes to the nanoparticles surface. Magnetic measurements indicate the significant enhancement of the anisotropy in the final objects. Indeed, the functionalized nanoparticles show a threefold increase of the blocking temperature and a coercive field increased by one order of magnitude. PMID:26634987

  20. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there is a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  1. Benchmark testing and independent verification of the VS2DT computer code

    International Nuclear Information System (INIS)

    McCord, J.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation
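Verification against an analytical solution, as in the VS2DT study, usually reports a norm of the discrepancy between the two solutions on a shared grid. A generic sketch with an invented reference profile and a synthetic "numerical" error (not the VS2DT test problems):

```python
import math

def relative_l2_error(numerical, analytical):
    """Relative L2 norm of the difference between a numerical solution
    and the analytical reference evaluated on the same grid."""
    num = math.sqrt(sum((n - a) ** 2 for n, a in zip(numerical, analytical)))
    den = math.sqrt(sum(a * a for a in analytical))
    return num / den

# Analytical reference: u(x) = sin(pi * x); the "numerical" values are
# perturbed smoothly to mimic a small discretization error.
xs = [i / 10 for i in range(11)]
exact = [math.sin(math.pi * x) for x in xs]
approx = [u + 0.01 * (1 - u) * u for u in exact]
err = relative_l2_error(approx, exact)
print(err < 0.01)   # passes a 1% verification tolerance
```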

  2. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
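The phase-only correlation underlying such verification systems is straightforward to sketch: correlate two images using only the spectral phase, so a matching key yields a sharp auto-correlation peak while a mismatched key does not. This is a generic illustration, not the authors' dual-correlation algorithm (the random "keys" and array sizes are arbitrary):

```python
import numpy as np

def phase_only_correlation(f, g):
    """Correlate two 2-D arrays using only the spectral phase; the peak
    is sharp when (and only when) the inputs match."""
    cross = np.fft.fft2(f) * np.conj(np.fft.fft2(g))
    cross /= np.abs(cross) + 1e-12   # discard magnitude, keep phase
    return np.real(np.fft.ifft2(cross))

rng = np.random.default_rng(0)
key = rng.random((64, 64))        # stand-in for an authorized identity key
impostor = rng.random((64, 64))   # an unrelated key

auto = phase_only_correlation(key, key).max()
cross = phase_only_correlation(key, impostor).max()
print(round(auto, 3), round(cross, 3))
```

A verification threshold set between the two peak values then accepts the authorized key and rejects the impostor.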

  3. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  4. Comparability of the performance of in-line computer vision for geometrical verification of parts, produced by Additive Manufacturing

    DEFF Research Database (Denmark)

    Pedersen, David B.; Hansen, Hans N.

    2014-01-01

    The field of Additive Manufacturing is growing at an accelerated rate, as prototyping is left in favor of direct manufacturing of components for industry and consumers. A consequence of mass-customization and component complexity is an adverse geometrical verification challenge. Mass...

  5. Evaluating polymer degradation with complex mixtures using a simplified surface area method.

    Science.gov (United States)

    Steele, Kandace M; Pelham, Todd; Phalen, Robert N

    2017-09-01

    Chemical-resistant gloves, designed to protect workers from chemical hazards, are made from a variety of polymer materials such as plastics, rubber, and synthetic rubber. No single material provides protection against all chemicals, so proper polymer selection is critical. Standardized tests, such as chemical degradation tests, are used to aid in the selection process. The current rating methods, based on changes in weight or tensile properties, can be expensive, and data often do not exist for complex chemical mixtures: hundreds of thousands of chemical products on the market have no chemical-resistance data to guide polymer selection. The method described in this study provides an inexpensive alternative to gravimetric analysis, using surface area change to evaluate degradation of a polymer material. Degradation tests for 5 polymer types against 50 complex mixtures were conducted using both the gravimetric and surface area methods, and the percent change data were compared between the two. The resulting regression line was y = 0.48x + 0.019, in units of percent, and the Pearson correlation coefficient was r = 0.9537 (p ≤ 0.05), indicating a strong correlation between percent weight change and percent surface area change. On average, the percent change in surface area was about half that in weight. Using this information, an equivalent rating system was developed for determining the chemical degradation of polymer gloves from surface area.
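The reported regression can be turned into a simple conversion between the two rating scales. The coefficients (0.48, 0.019) and r = 0.9537 come from the abstract; the coupon measurements below are invented for illustration:

```python
def predicted_area_change(weight_change_pct):
    """Regression from the study: percent surface-area change (y) as a
    function of percent weight change (x): y = 0.48x + 0.019, r = 0.9537."""
    return 0.48 * weight_change_pct + 0.019

def equivalent_weight_change(area_change_pct):
    """Invert the regression to estimate the equivalent percent weight
    change from a measured percent surface-area change (valid only
    within the fitted range; an assumption for illustration)."""
    return (area_change_pct - 0.019) / 0.48

# Hypothetical glove coupon: surface area before and after exposure.
area_before = 25.0    # cm^2
area_after = 29.5     # cm^2 (swelling indicates degradation)
area_change = 100.0 * (area_after - area_before) / area_before
print(round(equivalent_weight_change(area_change), 1))
```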

  6. Spectroscopic identification of binary and ternary surface complexes of Np(V) on gibbsite.

    Science.gov (United States)

    Gückel, Katharina; Rossberg, André; Müller, Katharina; Brendler, Vinzenz; Bernhard, Gert; Foerstendorf, Harald

    2013-12-17

    For the first time, detailed molecular information on the Np(V) sorption species on amorphous Al(OH)3 and crystalline gibbsite was obtained by in situ time-resolved Attenuated Total Reflection Fourier-Transform Infrared (ATR FT-IR) and Extended X-ray Absorption Fine Structure (EXAFS) spectroscopy. The results consistently demonstrate the formation of mononuclear inner-sphere complexes of the NpO2(+) ion irrespective of the prevailing atmospheric condition. The impact of carbonate, added at a concentration equivalent to atmospheric equilibrium, on the speciation in solution and at the surface is evident from the vibrational data. While the 1:1 aqueous carbonato species (NpO2CO3(-)) was found to become predominant in the circumneutral pH range, it is most likely that this species is sorbed onto the gibbsite surface as a ternary inner-sphere surface complex in which the NpO2(+) moiety is directly coordinated to the functional groups of the gibbsite surface. These findings are corroborated by EXAFS results providing further evidence for a bidentate coordination of the Np(V) ion on amorphous Al(OH)3. The identification of the Np(V) surface species on gibbsite constitutes a basic finding for a comprehensive description of the dissemination of neptunium in groundwater systems.

  7. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  8. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  9. Reactive solute transport in streams: A surface complexation approach for trace metal sorption

    Science.gov (United States)

    Runkel, Robert L.; Kimball, Briant A.; McKnight, Diane M.; Bencala, Kenneth E.

    1999-01-01

    A model for trace metals that considers in-stream transport, metal oxide precipitation-dissolution, and pH-dependent sorption is presented. Linkage between a surface complexation submodel and the stream transport equations provides a framework for modeling sorption onto static and/or dynamic surfaces. A static surface (e.g., an iron- oxide-coated streambed) is defined as a surface with a temporally constant solid concentration. Limited contact between solutes in the water column and the static surface is considered using a pseudokinetic approach. A dynamic surface (e.g., freshly precipitated metal oxides) has a temporally variable solid concentration and is in equilibrium with the water column. Transport and deposition of solute mass sorbed to the dynamic surface is represented in the stream transport equations that include precipitate settling. The model is applied to a pH-modification experiment in an acid mine drainage stream. Dissolved copper concentrations were depressed for a 3 hour period in response to the experimentally elevated pH. After passage of the pH front, copper was desorbed, and dissolved concentrations returned to ambient levels. Copper sorption is modeled by considering sorption to aged hydrous ferric oxide (HFO) on the streambed (static surface) and freshly precipitated HFO in the water column (dynamic surface). Comparison of parameter estimates with reported values suggests that naturally formed iron oxides may be more effective in removing trace metals than synthetic oxides used in laboratory studies. The model's ability to simulate pH, metal oxide precipitation-dissolution, and pH-dependent sorption provides a means of evaluating the complex interactions between trace metal chemistry and hydrologic transport at the field scale.
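The pH-dependent sorption behaviour described above can be illustrated with a minimal single-reaction surface complexation sketch (the reaction, constant, and site concentration below are hypothetical illustrations, not the paper's calibrated model):

```python
# Hypothetical single-reaction sketch (not the paper's calibrated model):
#   >SOH + Cu2+  <=>  >SOCu+ + H+      (mass-action constant K)
# At equilibrium, [>SOCu+]/[Cu2+] = K * [>SOH] / [H+], so the sorbed
# fraction rises with pH, mimicking the pH-dependent copper sorption
# described in the abstract. logK and site_conc are made-up values.

def sorbed_fraction(pH: float, logK: float = -4.0, site_conc: float = 1e-3) -> float:
    """Equilibrium fraction of dissolved Cu bound to surface sites."""
    h = 10.0 ** (-pH)                         # proton activity
    ratio = (10.0 ** logK) * site_conc / h    # [>SOCu+] / [Cu2+]
    return ratio / (1.0 + ratio)

# Raising the pH from 4 to 7 sharply increases the sorbed fraction,
# qualitatively matching the copper response to the pH modification.
f_low, f_high = sorbed_fraction(4.0), sorbed_fraction(7.0)
```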

  10. Towards Verification of Operational Procedures Using Auto-Generated Diagnostic Trees

    Science.gov (United States)

    Kurtoglu, Tolga; Lutz, Robyn; Patterson-Hine, Ann

    2009-01-01

    The design, development, and operation of complex space, lunar and planetary exploration systems require the development of general procedures that describe a detailed set of instructions capturing how mission tasks are performed. For both crewed and uncrewed NASA systems, mission safety and the accomplishment of the scientific mission objectives are highly dependent on the correctness of procedures. In this paper, we describe how to use the auto-generated diagnostic trees from existing diagnostic models to improve the verification of standard operating procedures. Specifically, we introduce a systematic method, namely the Diagnostic Tree for Verification (DTV), developed with the goal of leveraging the information contained within auto-generated diagnostic trees in order to check the correctness of procedures, to streamline the procedures in terms of reducing the number of steps or use of resources in them, and to propose alternative procedural steps adaptive to changing operational conditions. The application of the DTV method to a spacecraft electrical power system shows the feasibility of the approach and its range of capabilities.

  11. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  12. Verification of Treatment Planning System (TPS) on Beam Axis of Co-60 Teletherapy

    International Nuclear Information System (INIS)

    Nunung-Nuraeni; Budhy-Kurniawan; Purwanto; Sugiyantari; Heru-Prasetio; Nasukha

    2001-01-01

    Cancer can currently be treated using surgery, chemotherapy, and radiotherapy. A high level of precision and accuracy in the delivered radiation dose is therefore a very important requirement. One such task is the verification of the Treatment Planning System (TPS) used in the treatment of patients. This research verified the TPS on the beam axis of a Co-60 teletherapy unit. Results show that the differences between TPS calculations and measurements are about -2.682% to 1.918% for simple geometry and homogeneous material, 5.278% to 4.990% for complex geometry, and -3.202% to -2.090% for more complex geometry. (author)
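The quoted verification ranges are signed percent deviations of the TPS dose from measurement, which can be sketched as follows (the dose values in the example are illustrative, not taken from the study):

```python
# Sketch of the verification quantity: signed percent deviation of the
# TPS-calculated dose from the measured dose. The 1.96/2.00 Gy example
# values are illustrative, not taken from the study.

def percent_deviation(d_tps: float, d_measured: float) -> float:
    """100 * (TPS - measured) / measured."""
    return 100.0 * (d_tps - d_measured) / d_measured

# A TPS prediction of 1.96 Gy where 2.00 Gy was measured gives a -2%
# deviation, comparable in magnitude to the quoted simple-geometry range.
dev = percent_deviation(1.96, 2.00)
```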

  13. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities, including scans, sampling, and the collection of smears, on the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as-left radiological conditions.

  14. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  15. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  16. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate, using independent software, the performance of specific functions in the code. Here, we describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  17. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  18. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender, participant age, and the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach to feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  19. Energetic Surface Smoothing of Complex Metal-Oxide Thin Films

    International Nuclear Information System (INIS)

    Willmott, P.R.; Herger, R.; Schlepuetz, C.M.; Martoccia, D.; Patterson, B.D.

    2006-01-01

    A novel energetic smoothing mechanism in the growth of complex metal-oxide thin films is reported from in situ kinetic studies of pulsed laser deposition of La1-xSrxMnO3 on SrTiO3, using x-ray reflectivity. Below 50% monolayer coverage, prompt insertion of energetic impinging species into small-diameter islands causes them to break up and form daughter islands. This smoothing mechanism therefore inhibits the formation of large-diameter 2D islands and the seeding of 3D growth. Above 50% coverage, islands begin to coalesce and their breakup is thereby suppressed. The energy of the incident flux is instead rechanneled into enhanced surface diffusion, which leads to an increase in the effective surface temperature of ΔT ≅ 500 K. These results have important implications for the optimal conditions for nanoscale device fabrication using these materials.

  20. Mechanisms available for cooling plants’ surfaces

    Directory of Open Access Journals (Sweden)

    Prokhorov Alexey Anatolievich

    2016-12-01

    Full Text Available The essay briefly touches upon the main mechanisms for cooling down plants' surfaces that lead to condensation of atmospheric moisture; methods for experimental verification of these mechanisms are presented therein.

  1. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    The increased level of complexity in almost every discipline and operation today raises the demand for knowledge needed to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems, rich in data structures and complex algorithms, continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is, ever more, a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify as the technology grows. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. The meta-model conceptualizes the relation between practices of process management, knowledge management, and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. 
Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  2. Verification Test Report for CFAST 3.1.6; TOPICAL

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2002-01-01

    Fire is a significant hazard in most facilities that handle radioactive materials. The severity of fire varies with room arrangement, combustible loading, ventilation and protective system response. The complexity of even simple situations can be unwieldy to solve by hand calculations. Thus, computer simulation of the fire severity has become an important tool in characterizing fire risk. The Savannah River Site (SRS), a Department of Energy facility, has been using the Consolidated Model of Fire Growth and Smoke Transport (CFAST) software to complete such deterministic evaluations to better characterize the nuclear facility fire severity. To fully utilize CFAST at SRS it is necessary to demonstrate that CFAST produces valid analytic solutions over its range of use. This report describes the primary verification exercise that is required to establish that CFAST, and its user interface program FAST, produce valid analytic solutions. This verification exercise may be used to check the functionality of FAST and as a training tool to familiarize users with the software. In addition, the report consolidates the lessons learned by the SRS staff in using FAST and CFAST as fire modeling tools.

  3. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  4. Results of the radiological verification survey of the partial remediation at 90 Avenue C, Lodi, New Jersey (LJ079V)

    International Nuclear Information System (INIS)

    Foley, R.D.; Johnson, C.A.

    1994-02-01

    The property at 90 Avenue C, Lodi, New Jersey is one of the vicinity properties of the former Maywood Chemical Works, Maywood, New Jersey designated for remedial action by the US Department of Energy (DOE). In July 1991, Bechtel National, Inc. performed a partial remedial action on this property. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this site in July 1991. The purpose of the verification survey was to ensure the effectiveness of remedial actions performed within FUSRAP and to confirm the site's compliance with DOE guidelines. The radiological survey included surface gamma scans indoors and outdoors, ground-level beta-gamma measurements, and systematic and biased soil and material sampling. Results of the verification survey demonstrated that all radiological measurements on the portions of the property that had been remediated were within DOE guidelines. However, there still remains a portion of the property to be remediated that is not covered by this verification survey.

  5. Energy transfer between surface-immobilized light-harvesting chlorophyll a/b complex (LHCII) studied by surface plasmon field-enhanced fluorescence spectroscopy (SPFS).

    Science.gov (United States)

    Lauterbach, Rolf; Liu, Jing; Knoll, Wolfgang; Paulsen, Harald

    2010-11-16

    The major light-harvesting chlorophyll a/b complex (LHCII) of the photosynthetic apparatus in green plants can be viewed as a protein scaffold binding and positioning a large number of pigment molecules that combines rapid and efficient excitation energy transfer with effective protection of its pigments from photobleaching. These properties make LHCII potentially interesting as a light harvester (or a model thereof) in photoelectronic applications. Most of such applications would require the LHCII to be immobilized on a solid surface. In a previous study we showed the immobilization of recombinant LHCII on functionalized gold surfaces via a 6-histidine tag (His tag) in the protein moiety. In this work the occurrence and efficiency of Förster energy transfer between immobilized LHCII on a functionalized surface have been analyzed by surface plasmon field-enhanced fluorescence spectroscopy (SPFS). A near-infrared dye was attached to some but not all of the LHC complexes, serving as an energy acceptor to chlorophylls. Analysis of the energy transfer from chlorophylls to this acceptor dye yielded information about the extent of intercomplex energy transfer between immobilized LHCII.
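The distance dependence underlying such intercomplex transfer is the standard Förster relation; a minimal sketch follows (the Förster radius below is an assumed illustrative value, not one measured in the study):

```python
# Sketch of the standard Foerster relation governing such energy transfer:
#   E = 1 / (1 + (r / R0)**6)
# where r is the donor-acceptor distance and R0 the Foerster radius.
# The R0 = 5 nm default is an assumed illustrative value, not a quantity
# measured in the study.

def forster_efficiency(r_nm: float, r0_nm: float = 5.0) -> float:
    """Energy-transfer efficiency as a function of distance."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# At r = R0 the efficiency is exactly 50%, and it falls off steeply
# beyond that distance (sixth-power dependence).
e_at_r0 = forster_efficiency(5.0)
```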

  6. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  7. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process for the RADTRAN computer code, which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  8. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  9. Engineering the cell surface display of cohesins for assembly of cellulosome-inspired enzyme complexes on Lactococcus lactis

    Directory of Open Access Journals (Sweden)

    Wieczorek Andrew S

    2010-09-01

    Full Text Available Abstract Background The assembly and spatial organization of enzymes in naturally occurring multi-protein complexes is of paramount importance for the efficient degradation of complex polymers and biosynthesis of valuable products. The degradation of cellulose into fermentable sugars by Clostridium thermocellum is achieved by means of a multi-protein "cellulosome" complex. Assembled via dockerin-cohesin interactions, the cellulosome is associated with the cell surface during cellulose hydrolysis, forming ternary cellulose-enzyme-microbe complexes for enhanced activity and synergy. The assembly of recombinant cell surface displayed cellulosome-inspired complexes in surrogate microbes is highly desirable. The model organism Lactococcus lactis is of particular interest as it has been metabolically engineered to produce a variety of commodity chemicals including lactic acid and bioactive compounds, and can efficiently secrete an array of recombinant proteins and enzymes of varying sizes. Results Fragments of the scaffoldin protein CipA were functionally displayed on the cell surface of Lactococcus lactis. Scaffolds were engineered to contain a single cohesin module, two cohesin modules, one cohesin and a cellulose-binding module, or only a cellulose-binding module. Cell toxicity from over-expression of the proteins was circumvented by use of the nisA inducible promoter, and incorporation of the C-terminal anchor motif of the streptococcal M6 protein resulted in the successful surface-display of the scaffolds. The facilitated detection of successfully secreted scaffolds was achieved by fusion with the export-specific reporter staphylococcal nuclease (NucA. Scaffolds retained their ability to associate in vivo with an engineered hybrid reporter enzyme, E. coli β-glucuronidase fused to the type 1 dockerin motif of the cellulosomal enzyme CelS. Surface-anchored complexes exhibited dual enzyme activities (nuclease and β-glucuronidase), and were

  10. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  11. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock, and the porous medium is the soil cover on top of the rock; groundwater flow is hence the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases).

  12. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, including: precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
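Carryover, one of the listed verification items, is commonly estimated from a sequence of three high-concentration runs followed by three low-concentration runs; a sketch under that assumption (the count values below are illustrative):

```python
# Sketch of a common carryover estimate (assumed protocol: run a high
# sample three times, then a low sample three times). Carryover is the
# excess of the first low result over the third, normalized to the
# high-low difference. All values below are illustrative.

def carryover_pct(high3: float, low1: float, low3: float) -> float:
    """Percent carryover = (low1 - low3) / (high3 - low3) * 100."""
    return 100.0 * (low1 - low3) / (high3 - low3)

# Example: counts of 500 (third high run), 5.6 (first low run) and
# 5.0 (third low run) give roughly 0.12% carryover.
co = carryover_pct(500.0, 5.6, 5.0)
```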

  13. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock, and the porous medium is the soil cover on top of the rock; groundwater flow is hence the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases).

  14. Monte Carlo investigation of collapsed versus rotated IMRT plan verification.

    Science.gov (United States)

    Conneely, Elaine; Alexander, Andrew; Ruo, Russell; Chung, Eunah; Seuntjens, Jan; Foley, Mark J

    2014-05-08

    IMRT QA requires, among other tests, a time-consuming process of measuring the absorbed dose, at least at a point, in a high-dose, low-dose-gradient region. Some clinics use a technique of measuring this dose with all beams delivered at a single gantry angle (collapsed delivery), as opposed to the beams delivered at the planned gantry angles (rotated delivery). We examined, established, and optimized Monte Carlo simulations of the dosimetry for IMRT verification of treatment plans for these two delivery modes (collapsed versus rotated). The results of the simulations were compared to the treatment planning system dose calculations for the two delivery modes, as well as to measurements taken, in order to investigate the validity of the collapsed delivery technique for IMRT QA. The BEAMnrc, DOSXYZnrc, and egs_chamber codes were utilized for the Monte Carlo simulations, along with the MMCTP system. A number of plan complexity metrics were also used in the analysis of the dose distributions, in a bid to qualify why verification in a collapsed delivery may or may not be optimal for IMRT QA. Following the Alfonso et al. formalism, the k^{fclin,fref}_{Qclin,Q} correction factor was calculated to correct for the deviation of small fields from the reference conditions used for beam calibration. We report on the results obtained for a cohort of 20 patients. The plan complexity was investigated for each plan using the metrics of homogeneity index, conformity index, modulation complexity score, and the fraction of beams from a particular plan that intersect the chamber when performing the QA. Rotated QA gives more consistent results than the collapsed QA technique. The k^{fclin,fref}_{Qclin,Q} factor deviates less from unity for rotated QA than for collapsed QA. If the homogeneity index is less than 0.05, then the k^{fclin,fref}_{Qclin,Q} factor does not deviate from unity by more than 1%.
    A value this low for the homogeneity index can only be obtained
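
For reference, one common definition of the homogeneity index used as a plan complexity metric (the paper may define it differently; the dose distribution below is synthetic):

```python
import numpy as np

def homogeneity_index(target_doses):
    """HI = (D2% - D98%) / D50%, where Dx% is the dose received by at
    least x% of the target volume; HI -> 0 means a homogeneous dose.
    This is one widely used convention, assumed here for illustration."""
    d2, d50, d98 = np.percentile(target_doses, [98, 50, 2])
    return (d2 - d98) / d50

# Synthetic sampled target doses (Gy) for a fairly homogeneous plan
rng = np.random.default_rng(0)
doses = rng.normal(60.0, 0.5, 10000)
hi = homogeneity_index(doses)
print(f"HI = {hi:.3f}")
```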

  15. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals, using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm alone. Simulation resul...
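
The verification mechanism that the authors graft onto interval passing can be sketched for the simplest case (binary measurement matrix, noiseless measurements; the example matrix and signal are invented):

```python
import numpy as np

def verification_decode(H, y, max_iter=50):
    """Basic verification decoding of a nonnegative sparse signal x
    measured as y = H @ x with a binary (e.g. LDPC parity-check)
    measurement matrix H. Two rules, iterated to a fixed point:
      (a) zero check:     y_j == 0  =>  every x_i in check j is 0;
      (b) degree-1 check: one unverified neighbour left => solve it.
    A sketch only; the paper's algorithm adds interval passing on top."""
    m, n = H.shape
    x = np.full(n, np.nan)                  # NaN = not yet verified
    for _ in range(max_iter):
        changed = False
        for j in range(m):
            idx = np.flatnonzero(H[j])
            unknown = [i for i in idx if np.isnan(x[i])]
            if not unknown:
                continue
            if y[j] == 0:                   # rule (a)
                x[unknown] = 0.0
                changed = True
            elif len(unknown) == 1:         # rule (b)
                x[unknown[0]] = y[j] - np.nansum(x[idx])
                changed = True
        if not changed:
            break
    return x

# Tiny invented example
H = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 1]])
x_true = np.array([0.0, 0.0, 3.0, 2.0])
y = H @ x_true
print(verification_decode(H, y))
```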

  16. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): This software must be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by comparing each new output file against the corresponding old output file. Any difference between the files causes a verification error. Because of the manner in which the verification is performed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error
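
The install-time check described here (run the sample problems, then compare new outputs with shipped reference outputs) can be sketched as follows; the directory layout and file pattern are assumptions, not taken from the MCNP package:

```python
import difflib
from pathlib import Path

def verify_installation(new_dir, reference_dir, pattern="*.out"):
    """Compare each freshly generated sample-problem output with the
    reference output shipped with the code. Any textual difference is
    flagged for manual review: as the record notes, a difference is not
    necessarily an error (e.g. timestamps, last-digit round-off)."""
    failures = {}
    for ref in sorted(Path(reference_dir).glob(pattern)):
        new = Path(new_dir) / ref.name
        diff = list(difflib.unified_diff(
            ref.read_text().splitlines(),
            new.read_text().splitlines(),
            fromfile=str(ref), tofile=str(new), lineterm=""))
        if diff:
            failures[ref.name] = diff
    return failures
```

A caller would inspect the returned diffs rather than treating any difference as a hard failure.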

  17. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling systems. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around a simulation lifecycle management system. •Verification and validation roadmap for the digital modelling phase. -- Abstract: The work behind this paper takes place in EFDA's European Goal Oriented Training programme on Remote Handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design, within a Systems Engineering (SE) framework. For complex engineering systems such as the ITER facilities, manufacturing a full-scale prototype entails a substantial cost. In the V and V process for ITER RH equipment, physical tests are required to ensure the compliance of the system with the required operation. It is therefore essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of current trends in the use of digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V and V processes, in order to increase their cost efficiency and reliability

  18. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  20. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  1. Remedial activities effectiveness verification in tailing areas

    International Nuclear Information System (INIS)

    Kluson, J.; Thinova, L.; Svoboda, T.; Neznal, M.

    2015-01-01

    A complex radiological study of a basin of sludge from uranium ore mining and pre-processing was carried out. Air kerma rates (including their spectral analysis) at the reference height of 1 m above ground were measured over the whole area, and the radiation fields were mapped during two measuring campaigns (in 2009 and 2014). K, U and Th concentrations in the sludge, and concentrations in depth profiles at selected points (including radon concentrations and radon exhalation rates), were determined by gamma spectrometry of both in situ and laboratory sample measurements. The results were used for the analysis, design evaluation, and verification of the efficiency of the remediation measures. The efficiency of covering the sludge basin with inert material was modelled using the MicroShield code. (authors)

  2. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new workflow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. Based on the experience from the above projects, this paper briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
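
The CDV loop (random legal stimuli, a self-checking scoreboard, coverage bins) can be illustrated outside SystemVerilog/UVM with a toy example; the 4-bit adder "DUT" below is invented:

```python
import random

def dut_add(a, b):
    return (a + b) & 0xF            # toy 4-bit adder "DUT" with wrap-around

def ref_add(a, b):
    return (a + b) % 16             # golden reference model

random.seed(0)
coverage = {"overflow": 0, "no_overflow": 0}   # functional coverage bins
for _ in range(200):
    a, b = random.randrange(16), random.randrange(16)   # legal random stimuli
    assert dut_add(a, b) == ref_add(a, b)               # scoreboard check
    coverage["overflow" if a + b > 15 else "no_overflow"] += 1

print("coverage bins:", coverage)
assert all(v > 0 for v in coverage.values()), "coverage hole!"
```

In a real UVM environment the same roles are played by sequencers/drivers (stimuli), scoreboards (checking), and covergroups (progress measurement).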

  3. Dosimetric verification of the dynamic intensity modulated radiotherapy (IMR) of 21 patients

    International Nuclear Information System (INIS)

    Tsai, J.-S.; Engler, Mark J.; Ling, Marilyn N.; Wu, Julian; Kramer, Bradley; Fagundes, Marcio; Dipetrillo, Thomas; Wazer, David E.

    1996-01-01

    exposed to known doses, and a high-speed 300 dots-per-inch scanner driven by Photoshop software. Film data were analyzed with NIH Image software. Absolute dose verification was achieved with TLDs in the anthropomorphic phantom, and with diodes and ion chambers in calibration phantom slabs. The phantom setup closely simulated the patient's CT and treatment setups. Interslab spaces for films and the phantom position were chosen to best sample the conformity of the tumor prescription dose, and the compliance of the maximum measured dose in normal tissues with the doses entered as constraints. The verifications applied to commission the system consisted of annealing the cost function for simulated targets in the anthropomorphic phantom, and then comparing planned with measured doses. Subsequently, a 'hybrid' verification was performed in which the beam set obtained from the patient geometry was detached from the patient anatomic files and applied to calculate doses in the phantoms, followed by a comparison of measured with planned doses. Phantom slabs and positions were carefully selected to obtain an average TMR to the gantry isocenter in the phantom within 2% of the average within the patient. In vivo dosimetry was performed with TLD under 1 cm of bolus at the location of the maximum skin surface dose. Plans were reoptimized including the contour of the added bolus to improve the accuracy of the measurement. The average leakage dose was assumed to be 0.4% of the total monitor units of the treatment. Results: Verification of planned dose distributions simulated in phantom indicates agreement of planned with measured doses within ±5% throughout numerous transverse-plane films for 18 of 21 patients. In three patients with unusually large and complex-shaped tumors, planned monitor units were altered to compensate for verifications indicating up to 10% differences between planned and measured doses. TLD in the phantom indicated improved absolute-dose agreement of ±5%.
However, the accuracy of initial 'hybrid' verifications of patient

  4. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical property inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and from the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR at high resolution (1 nm, 0.05°, 15 min) for use in spectrally integrated irradiance maps, databases, and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) over temporal ranges from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15 % for the NN that produces spectral irradiances (NNS), a 5-6 % underestimation for the integrated NN, and close-to-zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 W m-2 for the 15 min averages and from -20 to 20 W m-2 for the monthly mean global horizontal irradiance (GHI) averages, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI, was of the order of 10 % compared to the ground-based measurements. The proposed system is intended for studies and real-time applications related to solar energy production planning and use.
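
The kind of summary statistics used in such a verification against ground stations can be sketched as follows (the GHI values below are invented; the record's actual error ranges come from BSRN comparisons):

```python
import numpy as np

def validation_stats(model_ghi, ground_ghi):
    """Bias and spread of modelled vs measured global horizontal
    irradiance (W m-2); an illustrative verification summary."""
    model = np.asarray(model_ghi, dtype=float)
    ground = np.asarray(ground_ghi, dtype=float)
    diff = model - ground
    return {
        "mean_bias": float(np.mean(diff)),
        "rmse": float(np.sqrt(np.mean(diff ** 2))),
        "median_pct_err": float(np.median(100.0 * diff / ground)),
    }

# Invented 15-min GHI samples: model output vs station measurement
model = [480.0, 510.0, 620.0, 300.0]
station = [500.0, 500.0, 600.0, 310.0]
print(validation_stats(model, station))
```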

  5. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  6. Comparative study of the swabbing properties of seven commercially available swab materials for cleaning verification.

    Science.gov (United States)

    Corrigan, Damion K; Piletsky, Sergey; McCrossen, Sean

    2009-01-01

    This article compares the technical performance of several commercially available swabbing materials for the purpose of cleaning verification. A steel surface was soiled with solutions of acetaminophen, nicotinic acid, diclofenac, and benzamidine, and wiped with each swabbing material. The compounds were extracted with water or ethanol (depending on the polarity of the analyte) and their concentrations in the extracts were quantified spectrophotometrically. The study also investigated swab debris left on the wiped surface. The swab performances were compared and the best swab material was identified.
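
Recovery from a swabbed surface is typically expressed as a percentage of the spiked amount via a spectrophotometric calibration line; a sketch (the slope, volume, and amounts are invented, and the study's exact protocol may differ):

```python
def percent_recovery(absorbance, slope, intercept, extract_volume_ml, spiked_amount_ug):
    """Recovery (%) from an absorbance reading and a linear calibration
    A = slope * C + intercept (Beer-Lambert regime); illustrative only."""
    conc_ug_per_ml = (absorbance - intercept) / slope
    recovered_ug = conc_ug_per_ml * extract_volume_ml
    return 100.0 * recovered_ug / spiked_amount_ug

# Example: 1000 ug of analyte spiked; 10 mL extract read at 0.42 AU
print(f"{percent_recovery(0.42, 0.005, 0.0, 10.0, 1000.0):.1f}%")
```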

  7. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the St. Elisabeth Oncological Institute in Bratislava. It contains a basic description of IMRT technology and of the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (a multileaf intensity-modulating collimator), together with the overall process of verifying the created plan. The aim of verification is in particular good control of the MIMiC functions and evaluation of the overall reliability of IMRT planning. (author)

  8. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate the verification and validation of on-board satellite software, and has been applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verifications are defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
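
A much-simplified stand-in for checking on-target timing evidence against constraints (the task names, timestamps, and limits are invented; the actual system expresses its constraints as finite timed automata):

```python
def check_deadlines(trace, constraints):
    """Verify instrumented start/finish events against timing constraints.
    trace: list of (task, start_us, finish_us) tuples from on-target runs;
    constraints: {task: worst_case_allowed_us}. Returns the violations."""
    violations = []
    for task, start, finish in trace:
        elapsed = finish - start
        if task in constraints and elapsed > constraints[task]:
            violations.append((task, elapsed, constraints[task]))
    return violations

# Invented evidence: "proc" overruns its 800 us budget
trace = [("acq", 0, 180), ("proc", 200, 1050), ("tx", 1200, 1650)]
limits = {"acq": 200, "proc": 800, "tx": 500}
print(check_deadlines(trace, limits))  # -> [('proc', 850, 800)]
```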

  9. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear material (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used, via transport theory, to determine the expected reaction rates in the detectors when placed at varying distances from the canister. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  10. Identification of Uranyl Surface Complexes an Ferrihydrite: Advanced EXAFS Data Analysis and CD-MUSIC Modeling

    NARCIS (Netherlands)

    Rossberg, A.; Ulrich, K.U.; Weiss, S.; Tsushima, S.; Hiemstra, T.; Scheinost, A.C.

    2009-01-01

    Previous spectroscopic research suggested that uranium(VI) adsorption to iron oxides is dominated by ternary uranyl-carbonato surface complexes across an unexpectedly wide pH range. Formation of such complexes would have a significant impact on the sorption behavior and mobility of uranium in

  11. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  12. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Verification is the task of checking whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that independently and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among the copies in any artificial way. As applications, we consider the verification of quantum computational supremacy demonstrations with IQP models, and verifiable blind quantum computing.

  13. Calibration of Ge gamma-ray spectrometers for complex sample geometries and matrices

    Energy Technology Data Exchange (ETDEWEB)

    Semkow, T.M., E-mail: thomas.semkow@health.ny.gov [Wadsworth Center, New York State Department of Health, Empire State Plaza, Albany, NY 12201 (United States); Department of Environmental Health Sciences, School of Public Health, University at Albany, State University of New York, Rensselaer, NY 12144 (United States); Bradt, C.J.; Beach, S.E.; Haines, D.K.; Khan, A.J.; Bari, A.; Torres, M.A.; Marrantino, J.C.; Syed, U.-F. [Wadsworth Center, New York State Department of Health, Empire State Plaza, Albany, NY 12201 (United States); Kitto, M.E. [Wadsworth Center, New York State Department of Health, Empire State Plaza, Albany, NY 12201 (United States); Department of Environmental Health Sciences, School of Public Health, University at Albany, State University of New York, Rensselaer, NY 12144 (United States); Hoffman, T.J. [Wadsworth Center, New York State Department of Health, Empire State Plaza, Albany, NY 12201 (United States); Curtis, P. [Kiltel Systems, Inc., Clyde Hill, WA 98004 (United States)

    2015-11-01

    A comprehensive study of the efficiency calibration and calibration verification of Ge gamma-ray spectrometers was performed using semi-empirical, computational Monte-Carlo (MC), and transfer methods. The aim of this study was to evaluate the accuracy of the quantification of gamma-emitting radionuclides in complex matrices normally encountered in environmental and food samples. A wide range of gamma energies from 59.5 to 1836.0 keV and geometries from a 10-mL jar to 1.4-L Marinelli beaker were studied on four Ge spectrometers with the relative efficiencies between 102% and 140%. Density and coincidence summing corrections were applied. Innovative techniques were developed for the preparation of artificial complex matrices from materials such as acidified water, polystyrene, ethanol, sugar, and sand, resulting in the densities ranging from 0.3655 to 2.164 g cm⁻³. They were spiked with gamma activity traceable to international standards and used for calibration verifications. A quantitative method of tuning MC calculations to experiment was developed based on a multidimensional chi-square paraboloid. - Highlights: • Preparation and spiking of traceable complex matrices in extended geometries. • Calibration of Ge gamma spectrometers for complex matrices. • Verification of gamma calibrations. • Comparison of semi-empirical, computational Monte Carlo, and transfer methods of Ge calibration. • Tuning of Monte Carlo calculations using a multidimensional paraboloid.
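
The tuning step can be illustrated in one dimension: sample the chi-square at a few trial values of a single MC parameter, fit a parabola, and take its vertex as the tuned value (the numbers below are invented; the paper works with a multidimensional paraboloid):

```python
import numpy as np

# Chi-square between MC prediction and experiment, sampled at trial
# values of one tuning parameter (all values made up for illustration).
p = np.array([0.8, 0.9, 1.0, 1.1, 1.2])        # trial parameter values
chi2 = np.array([9.1, 4.2, 2.0, 3.9, 8.8])     # chi-square at each trial

a, b, c = np.polyfit(p, chi2, 2)               # chi2 ~ a p^2 + b p + c
p_best = -b / (2.0 * a)                        # vertex = minimum of the fit
print(f"tuned parameter ~ {p_best:.3f}")
```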

  15. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  16. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. On the one hand, we wish to verify the presence of correct amounts of nuclear materials that are in storage or in process; on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system helps protect personnel. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help define the target area of an inventory when change has been shown to occur
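
A minimal stand-in for the change-detection idea (threshold the per-pixel difference between a reference image and the current frame, then threshold the count of changed pixels; the thresholds and images below are synthetic):

```python
import numpy as np

def changed(reference, current, pixel_thresh=25, area_thresh=50):
    """Flag change between a reference vault image and the current frame:
    count pixels whose absolute grey-level difference exceeds pixel_thresh,
    and report change when that count exceeds area_thresh."""
    diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
    return int(np.count_nonzero(diff > pixel_thresh)) > area_thresh

# Synthetic scene: static background, then a bright object appears
rng = np.random.default_rng(1)
ref = rng.integers(0, 40, size=(120, 160), dtype=np.uint8)
cur = ref.copy()
cur[40:70, 60:100] += 80
print(changed(ref, cur))
```

A deployed system would add lighting compensation and morphological filtering, but the verification logic reduces to this comparison.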

  17. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and carried out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking, and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off from the decontamination object, and the sludge was separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less of that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  18. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
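The fine-grained "unit" tests discussed above pin down small physical kernels directly instead of exercising the full model. A minimal sketch, assuming a hypothetical saturation-vapor-pressure kernel (Tetens form) as the unit under test; the function and constants are illustrative, not taken from any climate model:

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Tetens approximation (hPa) over liquid water -- a stand-in for the
    kind of small physical kernel a climate model is built from."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# Fine-grained unit tests: pin a tabulated reference value and a
# qualitative physical property, instead of re-running a full simulation.
def test_reference_point():
    # ~6.11 hPa at 0 degC is a standard tabulated value
    assert abs(saturation_vapor_pressure(0.0) - 6.112) < 1e-9

def test_monotonic_in_temperature():
    temps = [-40.0, -20.0, 0.0, 20.0, 40.0]
    pressures = [saturation_vapor_pressure(t) for t in temps]
    assert pressures == sorted(pressures)

test_reference_point()
test_monotonic_in_temperature()
print("ok")  # -> ok
```

Tests like these run in milliseconds, so a defect in the kernel is isolated immediately rather than surfacing as a subtle bias in a month-long simulation.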

  19. Development of a Novel Bone Conduction Verification Tool Using a Surface Microphone: Validation With Percutaneous Bone Conduction Users.

    Science.gov (United States)

    Hodgetts, William; Scott, Dylan; Maas, Patrick; Westover, Lindsey

    2018-03-23

    There were 90 planned comparisons of interest, three at each frequency (3 × 10) for the three input levels (30 × 3). Therefore, to minimize a type 1 error associated with multiple comparisons, we adjusted alpha using the Holm-Bonferroni method. There were five comparisons that yielded significant differences between the skull simulator and surface microphone (test and retest) in the estimation of audibility. However, the mean difference in these effects was small at 3.3 dB. Both sensors yielded equivalent results for the majority of comparisons. Models of bone conduction devices that have intact skin cannot be measured with the skull simulator. This study is the first to present and evaluate a new tool for bone conduction verification. The surface microphone is capable of yielding equivalent audibility measurements as the skull simulator for percutaneous bone conduction users at multiple input levels. This device holds potential for measuring other bone conduction devices (Sentio, BoneBridge, Attract, Soft headband devices) that do not have a percutaneous implant.
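The Holm-Bonferroni adjustment used above to control the type 1 error across the 90 comparisons is a step-down procedure: p-values are sorted ascending, the i-th smallest is compared to alpha/(m − i), and testing stops at the first failure. A minimal sketch:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Return a list of booleans: which hypotheses are rejected.

    The smallest p-value is compared to alpha/m, the next to alpha/(m-1),
    and so on; at the first comparison that fails, all remaining
    hypotheses are retained.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break
    return reject

print(holm_bonferroni([0.01, 0.04, 0.03, 0.005]))  # -> [True, False, False, True]
```

The method is uniformly more powerful than a plain Bonferroni correction (which would compare every p-value to alpha/m) while still controlling the family-wise error rate.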

  20. Cleanup Verification Package for the 118-F-3, Minor Construction Burial Ground

    International Nuclear Information System (INIS)

    Appel, M.J.

    2007-01-01

    This cleanup verification package documents completion of remedial action for the 118-F-3, Minor Construction Burial Ground waste site. This site was an open field covered with cobbles, with no vegetation growing on the surface. The site received irradiated reactor parts that were removed during conversion of the 105-F Reactor from the Liquid 3X to the Ball 3X Project safety systems and received mostly vertical safety rod thimbles and step plugs

  1. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  2. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, defines the degree of visual observation to be performed, and documents the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  3. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  4. Composition and microstructure alteration of triticale grain surface after processing by enzymes of cellulase complex

    Directory of Open Access Journals (Sweden)

    Elena Kuznetsova

    2016-01-01

    It is found that the pericarp tissues of the grain have considerable strength and stiffness, which has an adverse effect on the quality of whole-grain bread. There is therefore a need for preliminary chemical and biochemical processing of the durable cell walls before industrial use. Triticale, an artificial hybrid of the traditional grain crops wheat and rye whose grain has high nutritional value, is increasingly used in bread production. The purpose of this research was to evaluate the influence of cellulase complex enzymes (from Penicillium canescens) on the composition and microstructure of the triticale grain surface, for grain used in baking. Triticale grain was processed by cellulolytic enzyme preparations of different composition (produced by Penicillium canescens). The experiments showed that processing triticale grain with enzymes of the cellulase complex increases the content of water-soluble pentosans by 36.3-39.2%. The total amount of low-molecular sugars increased by 3.8-10.5%. Studies show that under the influence of the enzymes the microstructure of the triticale grain surface changes. Microphotographs characterizing the alteration of the grain surface structure over time (every 2 hours during 10 hours of substrate hydrolysis) are shown. It is found that the depth and direction of the destruction process for the non-starch polysaccharides of the grain integument are determined by the composition of the enzyme complex preparation and the duration of exposure. Xylanase is involved in the modification of hemicellulose fibers having both longitudinal and radial orientation. Hydrolysis of non-starch polysaccharides from the grain shells led to an increase in antioxidant activity. Ferulic acid was identified in an alcoholic extract of triticale grain after enzymatic hydrolysis under the influence of a complex preparation containing cellulase, xylanase and β-glucanase. Grain processing by independent enzymes containing in complex

  5. Chromate Adsorption on Selected Soil Minerals: Surface Complexation Modeling Coupled with Spectroscopic Investigation.

    Czech Academy of Sciences Publication Activity Database

    Veselská, V.; Fajgar, Radek; Číhalová, S.; Bolanz, R.M.; Göttlicher, J.; Steininger, R.; Siddique, J.A.; Komárek, M.

    2016-01-01

    Roč. 318, NOV 15 (2016), s. 433-442 ISSN 0304-3894 Institutional support: RVO:67985858 Keywords : surface complexation modeling * chromate * soil minerals Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 6.065, year: 2016

  6. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  7. Surface complexation modeling of Cu(II) adsorption on mixtures of hydrous ferric oxide and kaolinite

    Directory of Open Access Journals (Sweden)

    Schaller Melinda S

    2008-09-01

    Background: The application of surface complexation models (SCMs) to natural sediments and soils is hindered by a lack of consistent models and data for large suites of metals and minerals of interest. Furthermore, the surface complexation approach has mostly been developed and tested for single-solid systems. Few studies have extended the SCM approach to systems containing multiple solids. Results: Cu adsorption was measured on pure hydrous ferric oxide (HFO), pure kaolinite (from two sources), and in systems containing mixtures of HFO and kaolinite over a wide range of pH, ionic strength, and sorbate/sorbent ratios and, for the mixed-solid systems, using a range of kaolinite/HFO ratios. The Cu adsorption data measured for the HFO and kaolinite systems were used to derive diffuse layer surface complexation models (DLMs) describing Cu adsorption. Cu adsorption on HFO is reasonably well described using a 1-site or 2-site DLM. Adsorption of Cu on kaolinite could be described using a simple 1-site DLM with formation of a monodentate Cu complex on a variable-charge surface site. However, for consistency with models derived for weaker-sorbing cations, a 2-site DLM with a variable-charge and a permanent-charge site was also developed. Conclusion: Component additivity predictions of speciation in mixed mineral systems based on DLM parameters derived for the pure mineral systems were in good agreement with measured data. Discrepancies between the model predictions and measured data were similar to those observed for the calibrated pure mineral systems. The results suggest that quantifying specific interactions between HFO and kaolinite in speciation models may not be necessary. However, before the component additivity approach can be applied to natural sediments and soils, the effects of aging must be further studied and methods must be developed to estimate reactive surface areas of solid constituents in natural samples.
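The component additivity idea evaluated above can be illustrated with a deliberately simplified linear-partitioning stand-in for the full diffuse layer model; a real DLM would also solve surface charge and pH dependence. The Kd values below are hypothetical, not from the study:

```python
def fraction_sorbed(kd_per_solid, solid_mass, volume=1.0):
    """Component-additivity estimate of the fraction of dissolved Cu
    sorbed in a mixed-mineral suspension.

    kd_per_solid : dict mineral -> distribution coefficient (L/g),
                   calibrated on the pure-mineral systems
    solid_mass   : dict mineral -> mass of that solid (g)
    volume       : solution volume (L)

    Each mineral contributes its pure-system partitioning independently;
    no HFO-kaolinite interaction term is included, mirroring the paper's
    finding that such terms may be unnecessary.
    """
    kd_total = sum(kd_per_solid[m] * solid_mass[m] for m in solid_mass)
    return kd_total / (kd_total + volume)

# Hypothetical Kd values for HFO and kaolinite at one fixed pH
mix = fraction_sorbed({"HFO": 2.0, "kaolinite": 0.1},
                      {"HFO": 0.5, "kaolinite": 5.0})
print(round(mix, 3))  # -> 0.6
```

The additive structure is the point: the mixed-system prediction is assembled purely from parameters fitted on the single-solid experiments.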

  8. Analysis of gold(I/III)-complexes by HPLC-ICP-MS demonstrates gold(III) stability in surface waters.

    Science.gov (United States)

    Ta, Christine; Reith, Frank; Brugger, Joël; Pring, Allan; Lenehan, Claire E

    2014-05-20

    Understanding the form in which gold is transported in surface- and groundwaters underpins our understanding of gold dispersion and (bio)geochemical cycling. Yet, to date, there are no direct techniques capable of identifying the oxidation state and complexation of gold in natural waters. We present a reversed phase ion-pairing HPLC-ICP-MS method for the separation and determination of aqueous gold(III)-chloro-hydroxyl, gold(III)-bromo-hydroxyl, gold(I)-thiosulfate, and gold(I)-cyanide complexes. Detection limits for the gold species range from 0.05 to 0.30 μg L(-1). The [Au(CN)2](-) gold cyanide complex was detected in five of six waters from tailings and adjacent monitoring bores of working gold mines. Contrary to thermodynamic predictions, evidence was obtained for the existence of Au(III)-complexes in circumneutral, hypersaline waters of a natural lake overlying a gold deposit in Western Australia. This first direct evidence for the existence and stability of Au(III)-complexes in natural surface waters suggests that Au(III)-complexes may be important for the transport and biogeochemical cycling of gold in surface environments. Overall, these results show that near-μg L(-1) enrichments of Au in environmental waters result from metastable ligands (e.g., CN(-)) as well as kinetically controlled redox processes leading to the stability of highly soluble Au(III)-complexes.

  9. Computational Redox Potential Predictions: Applications to Inorganic and Organic Aqueous Complexes, and Complexes Adsorbed to Mineral Surfaces

    Directory of Open Access Journals (Sweden)

    Krishnamoorthy Arumugam

    2014-04-01

    Applications of redox processes range over a number of scientific fields. This review article summarizes the theory behind the calculation of redox potentials in solution for species such as organic compounds, inorganic complexes, actinides, battery materials, and mineral surface-bound species. Different computational approaches to predict and determine redox potentials of electron transitions are discussed along with their respective pros and cons for the prediction of redox potentials. Subsequently, recommendations are made for certain necessary computational settings required for accurate calculation of redox potentials. This article reviews the importance of computational parameters, such as basis sets, density functional theory (DFT) functionals, and relativistic approaches, and the role that physicochemical processes, such as hydration or spin-orbit coupling, play on the shift of redox potentials, and will aid in finding suitable combinations of approaches for different chemical and geochemical applications. Identifying cost-effective and credible computational approaches is essential to benchmark redox potential calculations against experiments. Once a good theoretical approach is found to model the chemistry and thermodynamics of the redox and electron transfer process, this knowledge can be incorporated into models of more complex reaction mechanisms that include diffusion in the solute, surface diffusion, and dehydration, to name a few. This knowledge is important to fully understand the nature of redox processes, be it a geochemical process that dictates natural redox reactions or one that is being used for the optimization of a chemical process in industry. In addition, it will help identify materials that will be useful to design catalytic redox agents, to come up with materials to be used for batteries and photovoltaic processes, and to identify new and improved remediation strategies in environmental engineering, for example the
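The link between a computed reaction free energy and the redox potential it predicts is the thermodynamic relation E = −ΔG/(nF), which underlies all of the computational approaches reviewed above. A minimal sketch (the ΔG value is illustrative, and a comparison against a reference electrode would further subtract its absolute potential):

```python
FARADAY = 96485.332  # C/mol, Faraday constant

def redox_potential(delta_g_kj_per_mol, n_electrons):
    """Standard potential (V) from the reduction free energy (kJ/mol),
    via E = -dG/(nF)."""
    return -delta_g_kj_per_mol * 1000.0 / (n_electrons * FARADAY)

# Illustrative one-electron reduction with dG = -75 kJ/mol
print(round(redox_potential(-75.0, 1), 3))  # -> 0.777
```

Because 1 V of error corresponds to roughly 96 kJ/mol per electron, even a few kJ/mol of error in the computed ΔG shifts the predicted potential by tens of millivolts, which is why the choice of functional, basis set, and solvation model matters so much.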

  10. Modelling for Near-Surface Transport Dynamics of Hydrogen of Plasma Facing Materials by use of Cellular Automaton

    International Nuclear Information System (INIS)

    Shimura, K.; Terai, T.; Yamawaki, M.

    2003-01-01

    In this study, the kinetics of desorption of adsorbed hydrogen from an ideal metallic surface is modelled with a Cellular Automaton (CA). The modelling is achieved by reducing the surface to one dimension. The model consists of two parts: surface migration and desorption. The former is attained by randomly sorting the particles at each time step; the latter is realised by modelling the thermally-activated process. To verify this model, thermal desorption is simulated and then compared with chemical kinetics. Excellent agreement is observed, and the results show that the model reasonably expresses the recombinative desorption of two chemisorbed adatoms. The application of this model is limited to the second-order reaction case, but it establishes the groundwork for modelling the transport dynamics of hydrogen through the surface under complex conditions.
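The one-dimensional CA described above (random migration plus thermally-activated recombinative desorption) can be sketched as follows; the lattice size, desorption probability, and update rule details are illustrative, not the paper's parameters:

```python
import random

def desorption_step(lattice, p_desorb, rng):
    """One CA step on a 1-D lattice of 0/1 site occupations (periodic
    boundaries).  Each adatom, in random order, picks a random neighbour:
    if the site is empty it migrates there; if it is occupied, the pair
    recombines and desorbs with probability p_desorb (second order)."""
    n = len(lattice)
    sites = [i for i, occ in enumerate(lattice) if occ]
    rng.shuffle(sites)  # random update order stands in for random sorting
    for i in sites:
        if not lattice[i]:
            continue  # already desorbed or moved this step
        j = (i + rng.choice((-1, 1))) % n
        if lattice[j]:
            if rng.random() < p_desorb:  # recombinative desorption
                lattice[i] = lattice[j] = 0
        else:                            # surface migration
            lattice[i], lattice[j] = 0, 1
    return lattice

rng = random.Random(1)
lattice = [1] * 50 + [0] * 50
rng.shuffle(lattice)
for _ in range(200):
    desorption_step(lattice, 0.2, rng)
print(sum(lattice) <= 50)  # coverage can only decrease -> True
```

Recording the number of desorbed pairs per step as a function of a temperature-dependent p_desorb would reproduce the thermal desorption spectra the paper compares against second-order chemical kinetics.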

  11. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  12. Oriented coupling of major histocompatibility complex (MHC) to sensor surfaces using light assisted immobilisation technology

    DEFF Research Database (Denmark)

    Snabe, Torben; Røder, Gustav Andreas; Neves-Petersen, Maria Teresa

    2005-01-01

    Controlled and oriented immobilisation of proteins for biosensor purposes is of extreme interest since this provides more efficient sensors with a larger density of active binding sites per area compared to sensors produced by conventional immobilisation. In this paper oriented coupling of a major histocompatibility complex (MHC class I) to a sensor surface is presented. The coupling was performed using light assisted immobilisation - a novel immobilisation technology which allows specific opening of particular disulphide bridges in proteins, which are then used for covalent bonding to thiol-derivatised surfaces via a new disulphide bond. Light assisted immobilisation specifically targets the disulphide bridge in the MHC-I molecule alpha(3)-domain, which ensures oriented linking of the complex with the peptide binding site exposed away from the sensor surface. Structural analysis reveals that a similar

  13. First principles studies of complex oxide surfaces and interfaces

    International Nuclear Information System (INIS)

    Noguera, Claudine; Finocchi, Fabio; Goniakowski, Jacek

    2004-01-01

    Oxides enter our everyday life and exhibit an impressive variety of physical and chemical properties. The understanding of their behaviour, which is often determined by the electronic and atomic structures of their surfaces and interfaces, is a key question in many fields, such as geology, environmental chemistry, catalysis, thermal coatings, microelectronics, and bioengineering. In the last decade, first principles methods, mainly those based on the density functional theory, have been frequently applied to study complex oxide surfaces and interfaces, complementing the experimental observations. In this work, we discuss some of these contributions, with emphasis on several issues that are especially important when dealing with oxides: the local electronic structure at interfaces, and its connection with chemical reactivity; the charge redistribution and the bonding variations, in relation to screening properties; and the possibility of bridging the gap between model and real systems by taking into account the chemical environments and the effect of finite temperatures, and by performing simulations on systems of an adequate (large) size

  14. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  15. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  16. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  17. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
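The verification criteria surveyed above (reference limits, delta check, and so on) combine into rule-based autoverification. A minimal sketch with illustrative limits; the rules and thresholds are hypothetical, not a published standard:

```python
def autoverify(result, verification_limits, previous=None, delta_limit=None):
    """Minimal autoverification rule: release a result automatically only
    if it falls inside the verification limits and, when a prior value
    exists, passes a delta check; otherwise hold it for manual review."""
    low, high = verification_limits
    if not (low <= result <= high):
        return "hold: outside verification limits"
    if previous is not None and delta_limit is not None:
        if abs(result - previous) > delta_limit:
            return "hold: delta check failed"
    return "release"

# Illustrative glucose-like result (mmol/L) with hypothetical limits
print(autoverify(5.1, (3.9, 6.1)))                                 # -> release
print(autoverify(5.1, (3.9, 6.1), previous=8.0, delta_limit=2.0))  # -> hold: delta check failed
```

A production system would layer on the other surveyed criteria (internal QC status, instrument flags, sample-integrity checks, cross-parameter concordance) as further hold conditions before releasing a result.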

  18. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    This report presents a description of the Onboard Computer Complex (CC) that was developed during the period 1994-1998 for the Russian Segment of the ISS. The system was developed in cooperation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on software simulators and on verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.

  19. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Some of the uses of subsurface barriers include surrounding and/or containing buried waste, as secondary confinement of underground storage tanks, to direct or contain subsurface contaminant plumes and to restrict remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and depending on use, have few or no breaches. A breach may be formed through numerous pathways including: discontinuous grout application, from joints between panels and from cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers makes detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification

  20. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of ²⁴⁰Pu to ²³⁹Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime
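The low-resolution template method described above can be sketched as a comparison of region-integrated count fractions against the template stored at initial verification. The region boundaries, counts, and tolerance below are illustrative, not the published procedure:

```python
def template_match(template_regions, measured_regions, tolerance=0.05):
    """Compare region-integrated counts, normalised to total counts,
    between the stored template and a later re-measurement of the item.
    Normalising removes dependence on live time and source decay to
    first order; the 5% tolerance is an illustrative choice."""
    t_total = sum(template_regions)
    m_total = sum(measured_regions)
    for t, m in zip(template_regions, measured_regions):
        if abs(t / t_total - m / m_total) > tolerance:
            return False
    return True

stored = [1200, 5400, 900, 2500]   # counts per region at initial verification
revisit = [1180, 5310, 930, 2480]  # later re-verification of the same item
print(template_match(stored, revisit))  # -> True
```

A substituted or altered item would redistribute counts between regions and fail the match, which is what gives the template method its continuity-of-knowledge value without revealing classified design information.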

  1. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure involves many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps are identified, and possible improvements are proposed.

  2. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  3. Design and Developmental Research on the VV&A of Complex Simulation System

    Directory of Open Access Journals (Sweden)

    Liu Li

    2016-01-01

    Full Text Available The Verification, Validation and Accreditation (VV&A) of a complex simulation system is a complex systems engineering task. After a brief introduction to the concept of VV&A, this paper puts forward design principles, approaches and basic contents for VV&A, expounds the typical development process, and predicts technology development trends for the VV&A of complex simulation systems.

  4. Verification of alternative dew point hygrometer for CV-LRT in MONJU. Short- and long-term verification of capacitance-type dew point hygrometer (Translated document)

    International Nuclear Information System (INIS)

    Ichikawa, Shoichi; Chiba, Yusuke; Ono, Fumiyasu; Hatori, Masakazu; Kobayashi, Takanori; Uekura, Ryoichi; Hashiri, Nobuo; Inuzuka, Taisuke; Kitano, Hiroshi; Abe, Hisashi

    2017-03-01

    To reduce the influence of dew point hygrometer maintenance on the plant schedule at the prototype fast-breeder reactor MONJU, the Japan Atomic Energy Agency examined a capacitance-type dew point hygrometer as an alternative to the lithium-chloride dew point hygrometer used in the containment vessel leak rate test. As verification, the capacitance-type dew point hygrometer was compared with a lithium-chloride dew point hygrometer under containment vessel leak rate test conditions, and, in an unprecedented trial, with a high-precision mirror-surface dew point hygrometer over a long term (2 years) in the containment vessel. The comparison with the lithium-chloride dew point hygrometer in a containment vessel leak rate test (atmosphere: nitrogen; testing time: 24 h) revealed no significant difference between the two instruments. The long-term comparison with the high-precision mirror-surface dew point hygrometer (atmosphere: air; testing time: 24 months) revealed that the capacitance-type dew point hygrometer satisfied the instrumental specification (synthesized precision of detector and converter: ±2.04°C) specified in the Leak Rate Test Regulations for Nuclear Reactor Containment Vessels. It was confirmed that the capacitance-type dew point hygrometer can be used as a long-term alternative to the lithium-chloride dew point hygrometer without affecting the dew point hygrometer maintenance schedule of the MONJU plant. (author)
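
    A minimal sketch of the acceptance check implied by the long-term comparison, assuming paired dew point readings from the candidate and reference hygrometers; the sample readings below are invented, and only the ±2.04°C limit is the specification quoted in the abstract.

```python
SPEC_LIMIT_C = 2.04  # synthesized precision of detector and converter, °C

def within_spec(candidate, reference, limit=SPEC_LIMIT_C):
    """Return (passed, worst): passed is True iff every paired deviation
    between candidate and reference readings stays inside the spec limit;
    worst is the largest absolute deviation observed."""
    deviations = [abs(c - r) for c, r in zip(candidate, reference)]
    worst = max(deviations)
    return worst <= limit, worst

# Invented paired dew point readings (°C) over the monitoring period.
capacitance = [-12.1, -11.8, -12.6, -13.0, -12.3]
mirror      = [-12.0, -12.2, -12.4, -12.7, -12.9]
ok, worst = within_spec(capacitance, mirror)
```

    In this toy data set the largest deviation is 0.6°C, comfortably inside the ±2.04°C specification.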

  5. Experimental verification of the new RISOe-A1 airfoil family for wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Dahl, K S; Fuglsang, P; Antoniou, I [Risoe National Lab., Roskilde (Denmark)

    1999-03-01

    This paper concerns the experimental verification of a new airfoil family for wind turbines. The family consists of airfoils in the relative thickness range from 15% to 30%. Three airfoils, Risoe-A1-18, Risoe-A1-21, and Risoe-A1-24, were tested in a wind tunnel. The verification consisted of both static and dynamic measurements. Here, the static results are presented for a Reynolds number of 1.6x10{sup 6} for the following airfoil configurations: smooth surface (all three airfoils) and Risoe-A1-24 mounted with leading edge roughness, vortex generators, and Gurney flaps, respectively. All three airfoils have a constant lift curve slope and almost constant drag coefficient until the maximum lift coefficient of about 1.4 is reached. The experimental results are compared with corresponding computational results from the general purpose flow solver, EllipSys2D, showing good agreement. (au)

  6. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
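
    The winning Red Balloon strategy referenced above was a recursive split-the-reward scheme: the finder of a balloon receives a base reward, the person who recruited the finder receives half of that, and so on up the referral chain, so the total payout is bounded by twice the base reward. A sketch, using the widely reported $2000 base for illustration:

```python
def referral_payouts(chain_length, base_reward=2000.0):
    """Payouts from the finder (index 0) up the referral chain,
    halving at each step: base, base/2, base/4, ..."""
    return [base_reward / (2 ** k) for k in range(chain_length)]

payouts = referral_payouts(4)  # finder plus three recruiters up the chain
total_cost = sum(payouts)      # geometric series, bounded by 2 * base_reward
```

    For a chain of four, the payouts are 2000, 1000, 500, and 250, totaling 3750, below the 4000 bound regardless of chain length.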

  7. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  8. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  9. Formation mechanism of a silane-PVA/PVAc complex film on a glass fiber surface.

    Science.gov (United States)

    Repovsky, Daniel; Jane, Eduard; Palszegi, Tibor; Slobodnik, Marek; Velic, Dusan

    2013-10-21

    Mechanical properties of glass fiber reinforced composite materials are affected by fiber sizing. A complex film formation, based on a silane film and PVA/PVAc (polyvinyl alcohol/polyvinyl acetate) microspheres on a glass fiber surface, is determined at 1) the nanoscale by using atomic force microscopy (AFM), and 2) the macroscale by using the zeta potential. Silane groups strongly bind through the Si-O-Si bond to the glass surface, which provides the attachment mechanism as a coupling agent. The silane groups form islands, a homogeneous film, as well as empty sites. The average roughness of the silanized surface is 6.5 nm, whereas it is only 0.6 nm for the non-silanized surface. The silane film vertically penetrates in a honeycomb fashion from the glass surface through the deposited PVA/PVAc microspheres to form a hexagonal close pack structure. The silane film not only penetrates, but also deforms the PVA/PVAc microspheres from the spherical shape in a dispersion to an ellipsoidal shape on the surface with average dimensions of 300/600 nm. The surface area value Sa represents an area of PVA/PVAc microspheres that are not affected by the silane penetration. The areas are found to be 0.2, 0.08, and 0.03 μm(2) if the ellipsoid sizes are 320/570, 300/610, and 270/620 nm for silane concentrations of 0, 3.8, and 7.2 μg mL(-1), respectively. The silane film also moves PVA/PVAc microspheres in the process of complex film formation, from the low silane concentration areas to the complex film area, providing enough silane groups to stabilize the structure. The values for the residual silane honeycomb structure heights (Ha) are 6.5, 7, and 12 nm for silane concentrations of 3.8, 7.2, and 14.3 μg mL(-1), respectively. The pH-dependent zeta-potential results suggest a specific role of the silane groups with effects on the glass fiber surface and also on the PVA/PVAc microspheres. The non-silanized glass fiber surface and the silane film have similar zeta potentials ranging

  10. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  11. Surface Structures Formed by a Copper(II) Complex of Alkyl-Derivatized Indigo

    Directory of Open Access Journals (Sweden)

    Akinori Honda

    2016-10-01

    Full Text Available Assembled structures of dyes have a great influence on their coloring function. For example, metal ions added in the dyeing process are known to prevent fading of color. Thus, we have investigated the influence of the addition of copper(II) ion on the surface structure of alkyl-derivatized indigo. Scanning tunneling microscope (STM) analysis revealed that the copper(II) complexes of indigo formed orderly lamellar structures on a HOPG substrate. These lamellar structures of the complexes are found to be more stable than those of the alkyl-derivatized indigos alone. Furthermore, 2D chirality was observed.

  12. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    Science.gov (United States)

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, selecting the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (ponline age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, ponline e-cigarette sales are needed, including strict age and identity verification requirements. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevent model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  14. Piecewise quadratic Lyapunov functions for stability verification of approximate explicit MPC

    Directory of Open Access Journals (Sweden)

    Morten Hovd

    2010-04-01

    Full Text Available Explicit MPC of constrained linear systems is known to result in a piecewise affine controller and therefore also piecewise affine closed-loop dynamics. The complexity of such analytic formulations of the control law can grow exponentially with the prediction horizon. Suboptimal solutions offer a trade-off in terms of complexity, and several approaches can be found in the literature for the construction of approximate MPC laws. In the present paper a piecewise quadratic (PWQ) Lyapunov function is used for the stability verification of an approximate explicit Model Predictive Control (MPC) law. A novel relaxation method is proposed for the LMI criteria on the Lyapunov function design. This relaxation is applicable to the design of PWQ Lyapunov functions for discrete-time piecewise affine systems in general.
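
    As a numerical illustration of what a PWQ Lyapunov certificate asserts (not a substitute for the LMI-based design): for a discrete-time piecewise affine system, a candidate function V(x) = x'P_i x on each region must decrease along every trajectory. The system matrices, regions, and P_i below are invented toy data; a sampling check like this can falsify a bad candidate but only the LMI conditions prove stability.

```python
import numpy as np

# Toy piecewise affine system: two regions split by the sign of x1,
# each with a stable linear map (spectral norm < 1). Invented for illustration.
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])   # dynamics for x1 >= 0
A2 = np.array([[0.4, 0.0], [0.1, 0.5]])   # dynamics for x1 < 0
P1 = np.eye(2)                            # candidate PWQ pieces V_i(x) = x' P_i x
P2 = np.eye(2)

def region(x):
    """Return the (A_i, P_i) pair active at state x."""
    return (A1, P1) if x[0] >= 0 else (A2, P2)

def decreases_everywhere(n_samples=1000, seed=0):
    """Sample states and check V(x_next) < V(x) across region boundaries."""
    rng = np.random.default_rng(seed)
    for _ in range(n_samples):
        x = rng.uniform(-1.0, 1.0, size=2)
        if np.linalg.norm(x) < 1e-8:
            continue  # decrease is only required away from the origin
        A, P = region(x)
        x_next = A @ x
        _, P_next = region(x_next)  # V may switch pieces at the boundary
        if x_next @ P_next @ x_next >= x @ P @ x:
            return False
    return True
```

    For this toy system the common quadratic candidate passes the sampled decrease condition; the point of the PWQ machinery in the paper is precisely the cases where no single quadratic works and region-dependent P_i are needed.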

  15. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed by two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed by the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, that quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve Vlasov equation in the investigation of a number of plasma physics phenomena.

  16. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  17. The Challenge for Arms Control Verification in the Post-New START World

    Energy Technology Data Exchange (ETDEWEB)

    Wuest, C R

    2012-05-24

    Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems - sometimes resorting to complex and arcane procedures for counting forces - in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that

  18. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features...

  19. Direct observation of surface reconstruction and termination on a complex metal oxide catalyst by electron microscopy

    KAUST Repository

    Zhu, Yihan

    2012-03-19

    On the surface: The surface reconstruction of an MoVTeO complex metal oxide catalyst was observed directly by various electron microscopic techniques and the results explain the puzzling catalytic behavior. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Surface complexation modeling of the effects of phosphate on uranium(VI) adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Romero-Gonzalez, M.R.; Cheng, T.; Barnett, M.O. [Auburn Univ., AL (United States). Dept. of Civil Engeneering; Roden, E.E. [Wisconsin Univ., Madison, WI (United States). Dept. of Geology and Geophysics

    2007-07-01

    Previously published data for the adsorption of U(VI) and/or phosphate onto amorphous Fe(III) oxides (hydrous ferric oxide, HFO) and crystalline Fe(III) oxides (goethite) were examined. These data were then used to test the ability of a commonly-used surface complexation model (SCM) to describe the adsorption of U(VI) and phosphate onto pure amorphous and crystalline Fe(III) oxides and synthetic goethite-coated sand, a surrogate for a natural Fe(III)-coated material, using the component additivity (CA) approach. Our modeling results show that this model was able to describe U(VI) adsorption onto both amorphous and crystalline Fe(III) oxides and also goethite-coated sand quite well in the absence of phosphate. However, because phosphate adsorption exhibits a stronger dependence on Fe(III) oxide type than U(VI) adsorption, we could not use this model to consistently describe phosphate adsorption onto both amorphous and crystalline Fe(III) oxides and goethite-coated sand. However, the effects of phosphate on U(VI) adsorption could be incorporated into the model to describe U(VI) adsorption to both amorphous and crystalline Fe(III) oxides and goethite-coated sand, at least for an initial approximation. These results illustrate both the potential and limitations of using surface complexation models developed from pure systems to describe metal/radionuclide adsorption under more complex conditions. (orig.)

  1. Surface complexation modeling of the effects of phosphate on uranium(VI) adsorption

    International Nuclear Information System (INIS)

    Romero-Gonzalez, M.R.; Cheng, T.; Barnett, M.O.; Roden, E.E.

    2007-01-01

    Previously published data for the adsorption of U(VI) and/or phosphate onto amorphous Fe(III) oxides (hydrous ferric oxide, HFO) and crystalline Fe(III) oxides (goethite) were examined. These data were then used to test the ability of a commonly-used surface complexation model (SCM) to describe the adsorption of U(VI) and phosphate onto pure amorphous and crystalline Fe(III) oxides and synthetic goethite-coated sand, a surrogate for a natural Fe(III)-coated material, using the component additivity (CA) approach. Our modeling results show that this model was able to describe U(VI) adsorption onto both amorphous and crystalline Fe(III) oxides and also goethite-coated sand quite well in the absence of phosphate. However, because phosphate adsorption exhibits a stronger dependence on Fe(III) oxide type than U(VI) adsorption, we could not use this model to consistently describe phosphate adsorption onto both amorphous and crystalline Fe(III) oxides and goethite-coated sand. However, the effects of phosphate on U(VI) adsorption could be incorporated into the model to describe U(VI) adsorption to both amorphous and crystalline Fe(III) oxides and goethite-coated sand, at least for an initial approximation. These results illustrate both the potential and limitations of using surface complexation models developed from pure systems to describe metal/radionuclide adsorption under more complex conditions. (orig.)

  2. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  3. Surface rheological properties of liquid-liquid interfaces stabilized by protein fibrillar aggregates and protein-polysaccharide complexes

    NARCIS (Netherlands)

    Humblet-Hua, K.N.P.; Linden, van der E.; Sagis, L.M.C.

    2013-01-01

    In this study we have investigated the surface rheological properties of oil-water interfaces stabilized by fibrils from lysozyme (long and semi-flexible and short and rigid ones), fibrils from ovalbumin (short and semi-flexible), lysozyme-pectin complexes, or ovalbumin-pectin complexes. We have

  4. Do the TTBT and JVE provide a framework for 'effective' verification?

    International Nuclear Information System (INIS)

    Vergino, E.S.

    1998-01-01

    The Threshold Test Ban Treaty (TTBT) was signed in 1974 by Richard Nixon and Leonid Brezhnev, with both the US and USSR agreeing to adhere to the 150 kt limit of the treaty as of March 31, 1976. Yet the treaty remained unratified for more than twelve years, and during this time, at the height of the Cold War, the US and USSR continued to accuse one another of violating the treaty. In late 1987, during the Nuclear Testing Talks in Geneva, the Joint Verification Experiment (JVE) was discussed; it was then formally announced at the Shultz/Shevardnadze meeting in December 1987. In the course of arranging the JVE, information and data for five Soviet and five US nuclear tests were exchanged. JVE activity culminated with Kearsarge, detonated on August 17, 1988, and Shagan, detonated on September 14, 1988. The JVE provided a unique opportunity for US and USSR technical experts to work together to demonstrate that effective verification of the TTBT could be achieved. The TTBT was the first treaty in which the US pursued a series of complex protocols involving additional, intrusive verification measures. These required extensive collaboration between the scientific and political communities, a collaboration necessary to balance technical capabilities and requirements against political drivers and needs. In this talk the author discusses this balance, how it changed with time, the drivers for change, the lessons learned, and whether there are lessons applicable to the development of other, future arms control agreements

  5. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  6. Essential issues in SOC design designing complex systems-on-chip

    CERN Document Server

    Lin, Youn-long Steve

    2007-01-01

    Covers issues related to system-on-chip (SoC) design. This book covers IP development, verification, integration, chip implementation, testing and software. It contains valuable academic and industrial examples for those involved with the design of complex SOCs.

  7. Cell surface clustering of Cadherin adhesion complex induced by antibody coated beads

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Cadherin receptors mediate cell-cell adhesion, signal transduction and assembly of cytoskeletons. How a single transmembrane molecule, Cadherin, can be involved in multiple functions through modulating its binding activities with many membrane adhesion molecules and cytoskeletal components is an unanswered question that can be elucidated by clues from bead experiments. Human lung cells expressing N-Cadherin were examined. After co-incubation with anti-N-Cadherin monoclonal antibody-coated beads, cell surface clustering of N-Cadherin was induced. Immunofluorescent detection demonstrated that in addition to Cadherin, β-Catenin, α-Catenin, α-Actinin and Actin fluorescence also aggregated at the membrane site of bead attachment. Myosin heavy chain (MHC), another major component of the Actin cytoskeleton, did not aggregate at the membrane site of bead attachment. Adhesion-unrelated Con A- and polylysine-conjugated beads did not induce the clustering of adhesion molecules. These results indicate that the Cadherin/Catenins/α-Actinin/Actin complex is formed at the Cadherin-mediated cell adherens junction; occupancy and cell surface clustering of Cadherin are crucial for the formation of Cadherin adhesion protein complexes.

  8. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  9. Runtime verification of embedded real-time systems.

    Science.gov (United States)

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability, thus, facilitate applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
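
    As an illustration of the kind of observer described above, a straightforward software monitor for the time-bounded Since operator of ptMTL can be sketched as follows. The class and its list-based algorithm are a simple assumption of ours, not the paper's hardware design (which achieves the doubly-logarithmic bounds): at step n, "phi1 Since_[a,b] phi2" holds iff phi2 held at some step m with n - m in [a, b] and phi1 has held at every step after m up to n.

```python
from collections import deque

class SinceObserver:
    """On-line monitor for the ptMTL formula 'phi1 Since_[a,b] phi2'
    over a discrete-time trace, fed one sample per step."""

    def __init__(self, a, b):
        self.a, self.b = a, b
        self.n = -1
        # steps where phi2 held and phi1 has held continuously since then
        self.phi2_times = deque()

    def step(self, phi1, phi2):
        """Feed one sample (phi1, phi2); return the verdict at this step."""
        self.n += 1
        if not phi1:
            # phi1 broken: only phi2 at the current step can start a witness
            self.phi2_times.clear()
        if phi2:
            self.phi2_times.append(self.n)
        # drop witnesses that have aged out of the [a, b] window
        while self.phi2_times and self.n - self.phi2_times[0] > self.b:
            self.phi2_times.popleft()
        return any(self.a <= self.n - m <= self.b for m in self.phi2_times)

# Example trace: phi2 fires once at step 0 and phi1 holds throughout,
# so the formula with window [0, 3] holds for four steps, then expires.
obs = SinceObserver(a=0, b=3)
trace = [(True, True), (True, False), (True, False), (True, False), (True, False)]
verdicts = [obs.step(p1, p2) for p1, p2 in trace]
# verdicts == [True, True, True, True, False]
```

    This naive monitor needs memory proportional to the window width; the point of the hardware observer algorithms in the paper is to do the same job with far tighter, formally proven time and space bounds.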

  10. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). The plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, Revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  11. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  12. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  13. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    Science.gov (United States)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book release. The Damage Detection and Verification System (DDVS) expands the damage detection capabilities of the Flat Surface Damage Detection System (FSDDS) sensory panels and adds an autonomous inspection capability that uses cameras and dynamic computer-vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection system to be demonstrated as a proof-of-concept, ground-based damage detection and inspection system.

  14. Verification of fluid-structure-interaction algorithms through the method of manufactured solutions for actuator-line applications

    Science.gov (United States)

    Vijayakumar, Ganesh; Sprague, Michael

    2017-11-01

    Demonstrating expected convergence rates with spatial- and temporal-grid refinement is the ``gold standard'' of code and algorithm verification. However, the lack of analytical solutions, and the difficulty of generating manufactured solutions, presents challenges for verifying codes for complex systems. The application of the method of manufactured solutions (MMS) to the verification of coupled multi-physics phenomena like fluid-structure interaction (FSI) has only recently been investigated. While many FSI algorithms for aeroelastic phenomena have focused on boundary-resolved CFD simulations, the actuator-line representation of the structure is widely used for FSI simulations in wind-energy research. In this work, we demonstrate the verification of an FSI algorithm using MMS for actuator-line CFD simulations with a simplified spring-mass-damper (SMD) structural model. We use a manufactured solution for the fluid velocity field and the displacement of the SMD system, and demonstrate the convergence of both the fluid and structural solvers to second-order accuracy under grid and time-step refinement. This work was funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Wind Energy Technologies Office, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.

  15. A New Approach to Nuclear Warhead Verification Using a Zero-Knowledge Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, Alexander [Princeton University]

    2012-05-16

    Warhead verification systems proposed to date fundamentally rely on the use of information barriers to prevent the release of classified design information. Measurements with information barriers significantly increase the complexity of inspection systems, make their certification and authentication difficult, and may reduce the overall confidence in the verifiability of future arms-control agreements. This talk presents a proof-of-concept of a new approach to nuclear warhead verification that minimizes the role of information barriers from the outset and envisions instead an inspection system that a priori avoids leakage of sensitive information using a so-called zero-knowledge protocol. The proposed inspection system is based on the template-matching approach and relies on active interrogation of a test object with 14-MeV neutrons. The viability of the method is examined with MCNP Monte Carlo neutron transport calculations modeling the experimental setup, an investigation of different diversion scenarios, and an analysis of the simulated data showing that it does not contain information about the properties of the inspected object.
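The preloading idea behind such a zero-knowledge measurement can be illustrated with a deliberately idealized sketch (no counting statistics; all function names and numbers are invented for illustration): the host preloads each non-electronic detector with the complement of the template's expected counts, so a genuine item yields a flat, information-free readout, while a diverted item still shows up as a deviation.

```python
def preload(template_counts, total):
    """Host preloads each detector with the complement of the
    (classified) expected counts for the declared template item."""
    return [total - c for c in template_counts]

def inspect(item_counts, preloads):
    """The inspector only ever sees the final detector totals."""
    return [c + p for c, p in zip(item_counts, preloads)]

template = [40, 55, 30]           # classified counts, known only to the host
detectors = preload(template, 100)

# A genuine item reproduces the template: every total is flat (100),
# so the readout carries no information about the template itself.
print(inspect(template, detectors))        # [100, 100, 100]
# A diverted item breaks the flatness and is detected.
print(inspect([40, 20, 30], detectors))    # [100, 65, 100]
```

In the actual proposal the "detectors" are superheated-emulsion (bubble) detectors and the statistics of the preloads matter; this sketch only conveys the complementarity trick.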

  16. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  17. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project 'Verification of the thermal design of electronic equipment' studied the methodology to be followed in the verification of the thermal design of electronic equipment. The project forms part of the 'Cool Electronics' research programme funded by TEKES, the Finnish Technology Development Centre, and was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment, from the system level down to the electronic component level, and can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process; some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish as VTT Julkaisuja 824. 22 refs.
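One elementary building block of such a verification is checking a computed junction temperature against a component rating. The sketch below uses a simple series thermal-resistance network; the numbers and the derating margin are invented for illustration and are not taken from the report.

```python
def junction_temp(t_ambient, power, r_jc, r_ca):
    """Steady-state junction temperature [degC] from a series thermal
    resistance network: Tj = Ta + P * (R_jc + R_ca),
    with dissipated power P in W and thermal resistances in K/W
    (junction-to-case and case-to-ambient)."""
    return t_ambient + power * (r_jc + r_ca)

# Hypothetical design-verification check: a 2 W device at 40 degC
# ambient, R_jc = 5 K/W, R_ca = 25 K/W, against an assumed 125 degC
# maximum junction temperature with a 20 degC derating margin.
tj = junction_temp(40.0, 2.0, 5.0, 25.0)
assert tj == 100.0          # computed junction temperature
assert tj <= 125.0 - 20.0   # passes with margin at this operating point
```

A full verification per the report's method would repeat such checks across environments and packaging levels, supported by thermal simulation and qualification measurements.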

  18. Radiological verification survey results at 14 Peck Ave., Pequannock, New Jersey (PJ001V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The U.S. Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W. R. Grace facility. The property at 14 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 14 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  19. Radiological verification survey results at 3 Peck Ave., Pequannock, New Jersey (PJ002V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 3 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 3 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  20. Radiological verification survey results at 15 Peck Ave., Pequannock, New Jersey (PJ005V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 15 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 15 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  1. Radiological verification survey results at 17 Peck Ave., Pequannock, New Jersey (PJ006V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 17 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 17 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  2. Radiological verification survey results at 7 Peck Ave., Pequannock, New Jersey (PJ003V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 7 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 7 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  3. Radiological verification survey results at 13 Peck Ave., Pequannock, New Jersey (PJ004V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 13 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 13 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  4. A coupled mass transfer and surface complexation model for uranium (VI) removal from wastewaters

    International Nuclear Information System (INIS)

    Lenhart, J.; Figueroa, L.A.; Honeyman, B.D.

    1994-01-01

    A remediation technique has been developed for removing uranium (VI) from complex contaminated groundwater using flake chitin as a biosorbent in batch and continuous-flow configurations. With this system, U(VI) removal efficiency can be predicted using a model that integrates surface complexation, mass-transport limitations and sorption kinetics. This integration allows the reactor model to predict removal efficiencies for complex groundwaters with variable U(VI) concentrations and other constituents. The system has been validated using laboratory-derived kinetic data in batch and CSTR systems to verify the model predictions of U(VI) uptake from simulated contaminated groundwater.
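A minimal sketch of the kind of reactor model described, assuming (as a gross simplification, not the authors' coupled surface-complexation model) that sorptive removal can be lumped into a single first-order rate constant in a continuously stirred tank reactor:

```python
def cstr_effluent(c_in, q_flow, volume, k_sorb, dt=0.01, t_end=400.0):
    """Forward-Euler integration of a CSTR with lumped first-order
    sorptive removal of U(VI):

        dC/dt = (Q/V) * (C_in - C) - k_sorb * C

    Steady-state solution: C = C_in * (Q/V) / (Q/V + k_sorb).
    Units are arbitrary but must be mutually consistent."""
    c, t = 0.0, 0.0
    tau_inv = q_flow / volume          # inverse hydraulic residence time
    while t < t_end:
        c += dt * (tau_inv * (c_in - c) - k_sorb * c)
        t += dt
    return c

# Example: with Q/V = 0.1 and k_sorb = 0.1 (same time units), half of
# the influent U(VI) is removed at steady state (50% removal efficiency).
```

The published model replaces the single `k_sorb` with surface-complexation equilibria and film mass-transfer resistance, which is what lets it extrapolate to variable groundwater compositions.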

  5. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability and compatibility. The methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects in code verified with static methods: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, together with some of the tools that can be applied when using them. Based on this work, a conclusion is drawn that describes the most relevant problems of these analysis techniques and methods for their solution.

  6. The structure of surface texture knowledge

    International Nuclear Information System (INIS)

    Yan Wang; Scott, Paul J; Jiang Xiangqian

    2005-01-01

    This research aims to create an intelligent knowledge-based system for engineering and bio-medical engineering surface texture, which will provide expert knowledge of surface texture linking surface function, the specification of micro- and nano-geometry through manufacture, and verification. The intelligent knowledge base should be capable of incorporating knowledge from multiple sources (standards, books, experts, etc.), adding new knowledge from these sources and still remaining a coherent, reliable system. A new data model based on category theory will be adopted to construct this system.

  7. Synthesis in situ of gold nanoparticles by a dialkynyl Fischer carbene complex anchored to glass surfaces

    International Nuclear Information System (INIS)

    Bertolino, María Candelaria; Granados, Alejandro Manuel

    2016-01-01

    Highlights: • Fischer carbene 1-W reacts via cycloaddition, without Cu(I), with an azide-terminated surface. • This surface reaction is regioselective toward the internal triple bond of 1-W. • 1-W bound to a glass surface produces AuNps in situ, fixed to the surface. • This ability is independent of how 1-W is bonded to the surface. • The hybrid surface can be valuable as a SERS substrate or in heterogeneous catalysis. - Abstract: In this work we present a detailed study of classic reactions, such as the “click reaction” and nucleophilic substitution, carried out on a glass solid surface (slides). We used different reactive centers of a dialkynylalkoxy Fischer carbene complex of tungsten(0) to anchor it to glass surfaces modified with amine (to obtain an aminocarbene) and azide terminal groups. The cycloaddition was regioselective toward the internal triple bond of the dialkynyl Fischer carbene complex without Cu(I) as catalyst. The anchored carbene was nevertheless able to act as a reducing agent, producing in situ very stable gold nanoparticles fixed on the surface. The modified glasses were characterized by contact angle measurements and XPS; the synthesized nanoparticles were characterized by SEM, XPS, EDS and UV–vis. The modified glasses showed an important Raman-SERS enhancement. This simple, fast and robust method of creating polyfunctional hybrid surfaces can be valuable in a wide range of applications, such as Raman-SERS substrates and other optical fields.

  8. Synthesis in situ of gold nanoparticles by a dialkynyl Fischer carbene complex anchored to glass surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Bertolino, María Candelaria, E-mail: cbertolino@fcq.unc.edu.ar; Granados, Alejandro Manuel, E-mail: ale@fcq.unc.edu.ar

    2016-10-15

    Highlights: • Fischer carbene 1-W reacts via cycloaddition, without Cu(I), with an azide-terminated surface. • This surface reaction is regioselective toward the internal triple bond of 1-W. • 1-W bound to a glass surface produces AuNps in situ, fixed to the surface. • This ability is independent of how 1-W is bonded to the surface. • The hybrid surface can be valuable as a SERS substrate or in heterogeneous catalysis. - Abstract: In this work we present a detailed study of classic reactions, such as the “click reaction” and nucleophilic substitution, carried out on a glass solid surface (slides). We used different reactive centers of a dialkynylalkoxy Fischer carbene complex of tungsten(0) to anchor it to glass surfaces modified with amine (to obtain an aminocarbene) and azide terminal groups. The cycloaddition was regioselective toward the internal triple bond of the dialkynyl Fischer carbene complex without Cu(I) as catalyst. The anchored carbene was nevertheless able to act as a reducing agent, producing in situ very stable gold nanoparticles fixed on the surface. The modified glasses were characterized by contact angle measurements and XPS; the synthesized nanoparticles were characterized by SEM, XPS, EDS and UV–vis. The modified glasses showed an important Raman-SERS enhancement. This simple, fast and robust method of creating polyfunctional hybrid surfaces can be valuable in a wide range of applications, such as Raman-SERS substrates and other optical fields.

  9. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  10. Verification Failures: What to Do When Things Go Wrong

    Science.gov (United States)

    Bertacco, Valeria

    Every integrated circuit is released with latent bugs. The damage and risk implied by an escaped bug range from almost imperceptible to potentially tragic; unfortunately, it is impossible to discern where in this range a bug lies before it has been exposed and analyzed. While the past few decades have witnessed significant efforts to improve verification methodology for hardware systems, these efforts have been far outstripped by the massive complexity of modern digital designs, leading to product releases in which an ever-smaller fraction of a system's states has been verified. News of escaped bugs in large-market designs and/or safety-critical domains is alarming because of the safety and cost implications (replacements, lawsuits, etc.).

  11. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  12. Design verification for reactor head replacement

    International Nuclear Information System (INIS)

    Dwivedy, K.K.; Whitt, M.S.; Lee, R.

    2005-01-01

    This paper outlines the challenges of design verification for reactor head replacement in PWR plants and the qualification program, from the perspective of the utility design engineering group. It is based on experience with the design confirmation of four reactor head replacements at two plants, including their interfacing components, parts, appurtenances, and support structures. Reactor head replacement falls under the jurisdiction of the applicable edition of the ASME Section XI Code, with particular reference to repair/replacement activities. Under any repair/replacement activity, demands may be encountered in developing the replacement program and plan because of the vintage of the original design/construction Code and of the design reports governing the component qualifications. Because of the obvious importance of the reactor vessel, these challenges take on added significance. Additional complexities are introduced when the replacement components are fabricated by vendors other than the original vendor, and specific attention is needed regarding compatibility with the original design and construction of the part and interfacing components. The program for reactor head replacement requires evaluation of welding procedures and of the applicable examination, test, and acceptance criteria for materials, welds, and components. The design also needs to take into consideration the life of the replacement components with respect to the extended period of plant operation after license renewal and other plant improvements. Thus, verifying the acceptability of reactor head replacement poses challenges for the development and maintenance of a program and plan, design specification, design report, manufacturer's data report and material certification, and a report of reconciliation. The technical need may also be compounded by other challenges, such as widely scattered global activities and organizational barriers.

  13. Using AFM to probe the complexation of DNA with anionic lipids mediated by Ca(2+): the role of surface pressure.

    Science.gov (United States)

    Luque-Caballero, Germán; Martín-Molina, Alberto; Sánchez-Treviño, Alda Yadira; Rodríguez-Valverde, Miguel A; Cabrerizo-Vílchez, Miguel A; Maldonado-Valderrama, Julia

    2014-04-28

    Complexation of DNA with lipids is currently being developed as an alternative to classical vectors based on viruses. Most of the research to date focuses on cationic lipids owing to their spontaneous complexation with DNA. Nonetheless, recent investigations have revealed that cationic lipids induce a large number of adverse effects on DNA delivery, and it is precisely the lower cytotoxicity of anionic lipids that makes them a promising alternative. However, the complexation of DNA with anionic lipids (mediated by cations) is still in its early stages and is not yet well understood. In order to explore the molecular mechanisms underlying the complexation of anionic lipids and DNA, we proposed a combined methodology based on surface pressure-area isotherms, Gibbs elasticity and Atomic Force Microscopy (AFM). These techniques allow the role of surface pressure in the complexation to be elucidated, and the interfacial aggregates to be visualized, for the first time. We demonstrate that DNA complexes with negatively charged model monolayers (DPPC/DPPS 4 : 1) only in the presence of Ca(2+), but is expelled at very high surface pressures. Also, according to the Gibbs elasticity plot, the complexation of lipids and DNA implies a complete fluidisation of the monolayer and a completely different phase-transition map in the presence of DNA and Ca(2+). AFM imaging allows identification, for the first time, of specific morphologies associated with different packing densities: at low surface coverage a branched, net-like structure is observed, whereas at high surface pressure fibers formed of interfacial aggregates appear. In summary, Ca(2+) mediates the interaction between DNA and negatively charged lipids, and the conformation of the ternary system depends on the surface pressure. These observations are important new generic features of the interaction between DNA and anionic lipids.
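The Gibbs elasticity analysis mentioned can be reproduced numerically from an isotherm via E = -A (dΠ/dA). A small sketch using central differences (function and variable names are ours):

```python
def gibbs_elasticity(area, pressure):
    """Compressional (Gibbs) elasticity E = -A * dPi/dA evaluated at
    the interior points of a surface pressure-area isotherm, using
    central differences. `area` and `pressure` are parallel lists of
    area per molecule and surface pressure; `area` must be monotonic."""
    elasticity = []
    for i in range(1, len(area) - 1):
        dpi_da = (pressure[i + 1] - pressure[i - 1]) / (area[i + 1] - area[i - 1])
        elasticity.append(-area[i] * dpi_da)
    return elasticity

# Sanity check: for a linear isotherm Pi = 10 - A, dPi/dA = -1, so E = A:
# gibbs_elasticity([1, 2, 3, 4], [9, 8, 7, 6]) -> [2.0, 3.0]
```

On experimental isotherms, a shift of such an elasticity curve in the presence of DNA and Ca(2+) is what the abstract describes as fluidisation of the monolayer.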

  14. International exchange on nuclear safety related expert systems: The role of software verification and validation

    International Nuclear Information System (INIS)

    Sun, B.K.H.

    1996-01-01

    An important lesson learned from the Three Mile Island accident is that human errors can be significant contributors to risk. Recent advances in computer hardware and software technology have helped make expert system techniques potentially viable tools for improving nuclear power plant safety and reliability. As part of general man-machine interface technology, expert systems have recently become increasingly prominent as a potential solution to a number of previously intractable problems in many phases of human activity, including operation, maintenance, and engineering functions. Traditional methods for testing and analyzing analog systems are no longer adequate to handle the increased complexity of software systems. The role of Verification and Validation (V and V) is to add rigor to the software development and maintenance cycle to guarantee the high level of confidence needed for applications. Verification comprises the processes and techniques for confirming that all the software requirements of one stage of development are met before proceeding to the next stage. Validation involves testing the integrated software and hardware system to ensure that it reliably fulfills its intended functions. Only through a comprehensive V and V program can a high level of confidence be achieved. Many different standards and techniques for software verification and validation exist, yet they lack uniform approaches that provide adequate levels of practical guidance for nuclear power plant applications. There is a need to unify the different approaches to software verification and validation and to develop practical, cost-effective guidelines for user and regulatory acceptance. (author). 8 refs

  15. Remedial activities effectiveness verification in tailing areas.

    Science.gov (United States)

    Kluson, J; Thinova, L; Neznal, M; Svoboda, T

    2015-06-01

    A comprehensive radiological study was performed of a basin of sludge from uranium ore mining and preprocessing. Air kerma rates (including their spectral analysis) at the reference height of 1 m above ground over the whole area were measured and radiation fields mapped during two measuring campaigns (years 2009 and 2014). K, U and Th concentrations in the sludge, and concentration depth profiles (including radon concentration and radon exhalation rates) at selected points, were determined using gamma spectrometry, both in situ and on laboratory samples. The results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. The efficiency of covering the sludge basin with inert material was modelled using the MicroShield code. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
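
    The cover-efficiency modelling mentioned above (done in the study with the MicroShield code) can be approximated, to first order, by narrow-beam exponential attenuation. The kerma rate, attenuation coefficient, and cover thickness below are illustrative values, not from the study:

```python
import math

def attenuation(i0, mu, thickness):
    # Narrow-beam exponential attenuation I = I0 * exp(-mu * t); a crude
    # stand-in for a MicroShield-style cover-efficiency estimate
    # (scatter build-up in the cover is ignored).
    return i0 * math.exp(-mu * thickness)

# Hypothetical numbers: 500 nGy/h above bare sludge, a cover with linear
# attenuation coefficient 0.12 cm^-1, 30 cm thick.
covered = attenuation(500.0, 0.12, 30.0)
print(f"kerma rate under cover ~ {covered:.1f} nGy/h")
```

    A real assessment would account for build-up factors and the full photon spectrum, which is exactly what a point-kernel code such as MicroShield does.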

  16. Constraining the Surface Energy Balance of Snow in Complex Terrain

    Science.gov (United States)

    Lapo, Karl E.

    Physically-based snow models form the basis of our understanding of current and future water and energy cycles, especially in mountainous terrain. These models are poorly constrained and widely diverge from each other, demonstrating a poor understanding of the surface energy balance. This research aims to improve our understanding of the surface energy balance in regions of complex terrain by improving our confidence in existing observations and improving our knowledge of remotely sensed irradiances (Chapter 1), critically analyzing the representation of boundary layer physics within land models (Chapter 2), and utilizing relatively novel observations in the diagnosis of model performance (Chapter 3). This research has improved the understanding of the literal and metaphorical boundary between the atmosphere and land surface. Solar irradiances are difficult to observe in regions of complex terrain, as observations are subject to harsh conditions not found in other environments. Quality control methods were developed to handle these unique conditions. These quality control methods facilitated an analysis of estimated solar irradiances over mountainous environments. Errors in the estimated solar irradiance are caused by misrepresenting the effect of clouds over regions of topography and regularly exceed the range of observational uncertainty (up to 80 W m-2) in all regions examined. Uncertainty in the solar irradiance estimates was especially pronounced when averaging over high-elevation basins, with monthly differences between estimates of up to 80 W m-2. These findings can inform the selection of a method for estimating the solar irradiance and suggest several avenues of future research for improving existing methods. Further research probed the relationship between the land surface and atmosphere as it pertains to the stable boundary layers that commonly form over snow-covered surfaces. Stable conditions are difficult to represent, especially for low wind speed

  17. Inner-sphere, outer-sphere and ternary surface complexes: a TRLFS study of the sorption process of europium(III) onto smectite

    International Nuclear Information System (INIS)

    Stumpf, Th.; Fanghaenel, Th.; Bauer, A.; Kim, J.I.

    2002-01-01

    The surface sorption process of Eu(III) onto smectite was investigated by TRLFS in the trace concentration range. With increasing pH, the formation of an inner-sphere Eu(III) surface complex was observed. The differences in the spectra and the fluorescence emission lifetimes of the surface-sorbed Eu(III) in the presence and absence of carbonate indicate the formation of ternary clay/Eu(III)/carbonate complexes /1/. (orig.)

  18. Synthesis of mixed ligand europium complexes: Verification of predicted luminescence intensification

    International Nuclear Information System (INIS)

    Lima, Nathalia B.D.; Silva, Anderson I.S.; Gonçalves, Simone M.C.; Simas, Alfredo M.

    2016-01-01

    Mixed ligand europium complexes are predicted to be more luminescent than what would be expected from their corresponding repeating ligand compounds according to a conjecture recently advanced by our research group; a conjecture that has already been validated for strongly luminescent europium complexes. In this article, we seek to further verify the validity of this conjecture for complexes which are much more symmetric, and which thus display lower levels of luminescence. Accordingly, we synthesized complexes Eu(DBM)3(L)2, and all novel mixed ligand combinations Eu(DBM)3(L,L') with L and L' equal to DBSO, PTSO, and TPPO. The syntheses were carried out via displacement reactions from the starting complex Eu(DBM)3(H2O)2, passing through the intermediates Eu(DBM)3(L)2 and finally, by displacement of L by L', arriving at Eu(DBM)3(L,L'). The ligands L obey the following order of displacement TPPO > PTSO > DBSO > H2O, which had been previously described by our group. In the present article, we further show that this displacement order could have been predicted by Sparkle/RM1 thermochemical calculations. Subsequently, we determined the radiative decay rates, Arad, for all six compounds by photophysical measurements. As expected, results show that the measured Arad values for all novel mixed ligand complexes are larger than the average of the Arad values for the corresponding repeating ligand coordination compounds. In conclusion, the present article does broaden the scope of our conjecture, which enunciates that an increase in the diversity of ligands around the europium ion tends to intensify the luminescence. - Highlights: • Mixed ligand europium complexes are predicted to be more luminescent than repeating ligand ones. • Radiative decay rates increase with structural coordination asymmetry. • The non-ionic ligands displacement order in substitution reactions is TPPO > PTSO > DBSO > H2O. • Sparkle/RM1 correctly predicts the

  19. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
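
    The regular monitoring at the heart of continued process verification is commonly implemented as statistical process control charting. A minimal Shewhart-style sketch, with batch values and limits that are hypothetical rather than taken from the BPOG case study:

```python
import statistics

def shewhart_limits(baseline):
    # 3-sigma limits for an individuals chart, derived from a baseline campaign
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - 3 * sigma, mu + 3 * sigma

def flag_excursions(batches, limits):
    # indices of new batches falling outside the control limits
    lo, hi = limits
    return [i for i, x in enumerate(batches) if not lo <= x <= hi]

baseline = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.7, 100.3]  # hypothetical
limits = shewhart_limits(baseline)
new_batches = [100.1, 99.9, 101.5]
print(flag_excursions(new_batches, limits))
```

    In practice the monitored attributes, sampling plans, and rules for reacting to excursions are defined in the continued process verification plan itself.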

  20. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  1. Guest-Host Complex Formed between Ascorbic Acid and β-Cyclodextrin Immobilized on the Surface of an Electrode

    Directory of Open Access Journals (Sweden)

    María Teresa Ramírez-Silva

    2014-05-01

    This work deals with the formation of supramolecular complexes between ascorbic acid (AA), the guest, and β-cyclodextrin (β-CD), the host, which was first potentiodynamically immobilized on the surface of a carbon paste electrode (CPE) through the formation of a β-CD-based conducting polymer (poly-β-CD). With the bare CPE and the β-CD-modified CPE, an electrochemical study was performed to understand the effect of such surface modification on the electrochemical response of the AA. This study showed that on the modified CPE, the AA was surface-immobilized through formation of an inclusion complex with β-CD, which provoked the adsorption of AA in such a way that this stage became the limiting step for the electrochemical oxidation of AA. Moreover, from the analysis of the experimental voltammetric plots recorded during AA oxidation on the CPE/poly-β-CD electrode surfaces, the Gibbs standard free energy of the inclusion complex formed by the oxidation product of AA and β-CD has been determined for the first time: ΔG°inclus = −36.4 kJ/mol.
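
    The reported ΔG°inclus translates directly into an inclusion (stability) constant via ΔG° = −RT ln K. A quick check, assuming a temperature of 298.15 K (the abstract does not state one):

```python
import math

R = 8.314     # gas constant, J mol^-1 K^-1
T = 298.15    # assumed temperature, K
dG = -36.4e3  # Gibbs free energy of inclusion from the abstract, J mol^-1

# dG = -R*T*ln(K)  =>  K = exp(-dG / (R*T))
K = math.exp(-dG / (R * T))
print(f"inclusion constant K ~ {K:.2e}")
```

    A constant of this magnitude (on the order of 10^6) indicates a strongly favoured inclusion complex.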

  2. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Science.gov (United States)

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  3. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  4. Surface complexation modeling of uranyl adsorption on corrensite from the Waste Isolation Pilot Plant Site

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang-Won; Leckie, J.O. [Stanford Univ., CA (United States); Siegel, M.D. [Sandia National Labs., Albuquerque, NM (United States)

    1995-09-01

    Corrensite is the dominant clay mineral in the Culebra Dolomite at the Waste Isolation Pilot Plant. The surface characteristics of corrensite, a mixed chlorite/smectite clay mineral, have been studied. Zeta potential measurements and titration experiments suggest that the corrensite surface contains a mixture of permanent charge sites on the basal plane and SiOH and AlOH sites with a net pH-dependent charge at the edge of the clay platelets. Triple-layer model parameters were determined by the double extrapolation technique for use in chemical speciation calculations of adsorption reactions using the computer program HYDRAQL. Batch adsorption studies showed that corrensite is an effective adsorbent for uranyl. The pH-dependent adsorption behavior indicates that adsorption occurs at the edge sites. Adsorption studies were also conducted in the presence of competing cations and complexing ligands. The cations did not affect uranyl adsorption in the range studied. This observation lends support to the hypothesis that uranyl adsorption occurs at the edge sites. Uranyl adsorption was significantly hindered by carbonate. It is proposed that the formation of carbonate uranyl complexes inhibits uranyl adsorption and that only the carbonate-free species adsorb to the corrensite surface. The presence of the organic complexing agents EDTA and oxine also inhibits uranyl sorption.

  5. Surface complexation modeling of uranyl adsorption on corrensite from the Waste Isolation Pilot Plant Site

    International Nuclear Information System (INIS)

    Park, Sang-Won; Leckie, J.O.; Siegel, M.D.

    1995-09-01

    Corrensite is the dominant clay mineral in the Culebra Dolomite at the Waste Isolation Pilot Plant. The surface characteristics of corrensite, a mixed chlorite/smectite clay mineral, have been studied. Zeta potential measurements and titration experiments suggest that the corrensite surface contains a mixture of permanent charge sites on the basal plane and SiOH and AlOH sites with a net pH-dependent charge at the edge of the clay platelets. Triple-layer model parameters were determined by the double extrapolation technique for use in chemical speciation calculations of adsorption reactions using the computer program HYDRAQL. Batch adsorption studies showed that corrensite is an effective adsorbent for uranyl. The pH-dependent adsorption behavior indicates that adsorption occurs at the edge sites. Adsorption studies were also conducted in the presence of competing cations and complexing ligands. The cations did not affect uranyl adsorption in the range studied. This observation lends support to the hypothesis that uranyl adsorption occurs at the edge sites. Uranyl adsorption was significantly hindered by carbonate. It is proposed that the formation of carbonate uranyl complexes inhibits uranyl adsorption and that only the carbonate-free species adsorb to the corrensite surface. The presence of the organic complexing agents EDTA and oxine also inhibits uranyl sorption

  6. Survey and assessment of conventional software verification and validation techniques

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-02-01

    Reliable software is required for nuclear power plant applications. Verification and validation (V&V) techniques may be applied during software development to help eliminate errors that can inhibit the proper operation of digital systems and that may cause safety problems. EPRI and the NRC are cosponsoring this investigation to determine the best strategies for V&V of expert system software. The strategy used for a particular system will depend on the complexity of the software and the level of integrity required. This report covers the first task in the investigation of reviewing methods for V&V of conventional software systems and evaluating them for use with expert systems

  7. Structure and reactivity of oxalate surface complexes on lepidocrocite derived from infrared spectroscopy, DFT-calculations, adsorption, dissolution and photochemical experiments

    Science.gov (United States)

    Borowski, Susan C.; Biswakarma, Jagannath; Kang, Kyounglim; Schenkeveld, Walter D. C.; Hering, Janet G.; Kubicki, James D.; Kraemer, Stephan M.; Hug, Stephan J.

    2018-04-01

    Oxalate, together with other ligands, plays an important role in the dissolution of iron(hydr)oxides and the bio-availability of iron. The formation and properties of oxalate surface complexes on lepidocrocite were studied with a combination of infrared spectroscopy (IR), density functional theory (DFT) calculations, dissolution, and photochemical experiments. IR spectra measured as a function of time, concentration, and pH (50-200 μM oxalate, pH 3-7) showed that several surface complexes are formed at different rates and in different proportions. Measured spectra could be separated into three contributions described by Gaussian line shapes, with frequencies that agreed well with the theoretical frequencies of three different surface complexes: an outer-sphere complex (OS), an inner-sphere monodentate mononuclear complex (MM), and a bidentate mononuclear complex (BM) involving one O atom from each carboxylate group. At pH 6, OS was formed at the highest rate. The contribution of BM increased with decreasing pH. In dissolution experiments, lepidocrocite was dissolved at rates proportional to the surface concentration of BM, rather than to the total adsorbed concentration. Under UV-light (365 nm), BM was photolyzed at a higher rate than MM and OS. Although the comparison of measured spectra with calculated frequencies cannot exclude additional possible structures, the combined results allowed the assignment of three main structures with different reactivities consistent with experiments. The results illustrate the importance of the surface speciation of adsorbed ligands in dissolution and photochemical reactions.
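
    The separation of a measured spectrum into Gaussian contributions, as described above, reduces to a linear least-squares problem once the band centres and widths are fixed (here, by the DFT frequencies). A self-contained sketch with hypothetical band parameters standing in for the OS, MM and BM complexes:

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def solve3(A, b):
    # Gauss-Jordan elimination with partial pivoting for the 3x3 normal equations
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

# Hypothetical band centres/widths (cm^-1); in the paper these would come
# from the DFT-calculated frequencies of the three surface complexes.
bands = [(1280.0, 12.0), (1310.0, 10.0), (1405.0, 15.0)]
xs = [1250.0 + i for i in range(200)]
true_amps = [0.6, 0.3, 0.9]
spectrum = [sum(a * gauss(x, m, s) for a, (m, s) in zip(true_amps, bands))
            for x in xs]

# With centres and widths fixed, the amplitudes are a linear least-squares fit.
basis = [[gauss(x, m, s) for x in xs] for (m, s) in bands]
A = [[sum(u * v for u, v in zip(basis[i], basis[j])) for j in range(3)]
     for i in range(3)]
b = [sum(u * y for u, y in zip(basis[i], spectrum)) for i in range(3)]
amps = solve3(A, b)
print([round(a, 3) for a in amps])
```

    With a noiseless synthetic spectrum the fit recovers the amplitudes exactly; on measured data the residual indicates how well the three-band model explains the spectrum.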

  8. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOV) are widely used in many applications due to their fast dynamic responses, cost effectiveness, and lower sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs to design engineers who rely on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method for verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe that this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
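
    The safety factor on the design attraction force can be checked, to first approximation, with the classic air-gap-dominated solenoid force relationship. All design values below are hypothetical, not from the paper:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def attraction_force(turns, current, gap, area):
    # Air-gap-dominated approximation: flux density B = mu0*N*I/g across a
    # single working gap; force F = B^2 * A / (2 * mu0). Core reluctance,
    # saturation, and fringing are ignored.
    b = MU0 * turns * current / gap
    return b ** 2 * area / (2 * MU0)

# Hypothetical design point: 1000 turns, 0.5 A, 1 mm gap, 1 cm^2 pole face.
force = attraction_force(turns=1000, current=0.5, gap=1e-3, area=1e-4)
required = 10.0  # N, assumed spring preload plus friction
print(f"attraction force {force:.1f} N, safety factor {force / required:.2f}")
```

    A safety factor comfortably above 1 at the largest working gap is the kind of criterion such a verification step would check before any prototype is built.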

  9. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Biometric-based identification/verification systems provide a solution to the security concerns in the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image-based). The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important, while an image-based matching algorithm uses both the micro and macro features of a fingerprint and is used if fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database. Datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being a perfect match).
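
    The matching stage described above — pseudo-random sub-sampling plus a 0-1000 similarity score against a threshold of 700 — can be sketched as follows. The scoring function is an illustrative stand-in, not the paper's exact metric:

```python
import random

def subsample_indices(n_pixels, k, seed=42):
    # Pseudo-random sub-sampling from the learning phase: only k pixel
    # positions are compared instead of the whole image.
    rng = random.Random(seed)
    return sorted(rng.sample(range(n_pixels), k))

def match_score(template, candidate, indices):
    # Similarity scaled to 0..1000 (1000 = perfect match); an illustrative
    # grey-level difference score, not the paper's algorithm.
    diffs = sum(abs(template[i] - candidate[i]) for i in indices)
    worst = 255 * len(indices)
    return round(1000 * (1 - diffs / worst))

template = [100] * (100 * 100)   # stand-in for a 100 x 100 grey-level image
candidate = [104] * (100 * 100)  # slightly different probe image
idx = subsample_indices(len(template), 500)
score = match_score(template, candidate, idx)
print(score, "match" if score >= 700 else "no match")
```

    The point of the sub-sampling is visible in the arithmetic: 500 comparisons replace 10,000 per candidate, at the cost of a noisier score.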

  10. Efficient and Secure Fingerprint Verification for Embedded Devices

    Directory of Open Access Journals (Sweden)

    Sakiyama Kazuo

    2006-01-01

    This paper describes a secure and memory-efficient embedded fingerprint verification system. It shows how a fingerprint verification module originally developed to run on a workstation can be transformed and optimized in a systematic way to run in real time on an embedded device with limited memory and computation power. A complete fingerprint recognition module is a complex application that requires on the order of 1000 M unoptimized floating-point instruction cycles. The goal is to run both the minutiae extraction and the matching engines on a small embedded processor, in our case a 50 MHz LEON-2 softcore. This requires optimization and acceleration techniques at each design step. In order to speed up the fingerprint signal processing phase, we propose acceleration techniques at the algorithm level, at the software level to reduce the execution cycle count, and at the hardware level to distribute the system workload. Third, a memory-trace-map-based memory reduction strategy is used to lower the system memory requirement. Lastly, at the hardware level, this requires the development of specialized coprocessors. As a result of these optimizations, we achieve a 65% reduction in execution time and a 67% reduction in the memory storage requirement for the minutiae extraction process, compared against the reference implementation. The complete operation, that is, fingerprint capture, feature extraction, and matching, can be done in real time, in less than 4 seconds.

  11. Spectroscopic evidence for ternary surface complexes in the lead(II)-malonic acid-hematite system

    Science.gov (United States)

    Lenhart, J.J.; Bargar, J.R.; Davis, J.A.

    2001-01-01

    Using extended X-ray absorption fine structure (EXAFS) and attenuated total reflectance Fourier-transform infrared (ATR-FTIR) measurements, we examined the sorption of Pb(II) to hematite in the presence of malonic acid. Pb LIII-edge EXAFS measurements performed in the presence of malonate indicate the presence of both Fe and C neighbors, suggesting that a major fraction of surface-bound malonate is bonded to adsorbed Pb(II). In the absence of Pb(II), ATR-FTIR measurements of sorbed malonate suggest the formation of more than one malonate surface complex. The dissimilarity of the IR spectrum of malonate sorbed on hematite to those of aqueous malonate suggests that at least one of the sorbed malonate species is directly coordinated to surface Fe atoms in an inner-sphere mode. In the presence of Pb, little change is seen in the IR spectrum of sorbed malonate, indicating that the geometry of malonate as it coordinates to sorbed Pb(II) adions is similar to its geometry as it coordinates to Fe in the hematite surface. Fits of the raw EXAFS spectra collected from pH 4 to pH 8 result in average Pb-C distances of 2.98 to 3.14 Å, suggesting the presence of both four- and six-membered Pb-malonate rings. The IR results are consistent with this interpretation. Thus, our results suggest that malonate binds to sorbed Pb(II) adions, forming ternary metal-bridging surface complexes. © 2001 Academic Press.

  12. Complexation of lysozyme with adsorbed PtBS-b-SCPI block polyelectrolyte micelles on silver surface.

    Science.gov (United States)

    Papagiannopoulos, Aristeidis; Christoulaki, Anastasia; Spiliopoulos, Nikolaos; Vradis, Alexandros; Toprakcioglu, Chris; Pispas, Stergios

    2015-01-20

    We present a study of the interaction of the positively charged model protein lysozyme with the negatively charged amphiphilic diblock polyelectrolyte micelles of poly(tert-butylstyrene-b-sodium (sulfamate/carboxylate)isoprene) (PtBS-b-SCPI) at the silver/water interface. The adsorption kinetics are monitored by surface plasmon resonance, and the surface morphology is probed by atomic force microscopy. The micellar adsorption is described by stretched-exponential kinetics, and the micellar layer morphology shows that the micelles do not lose their integrity upon adsorption. The complexation of lysozyme with the adsorbed micellar layers depends on the arrangement and density of the micelles in the underlying layer, and lysozyme follows the local morphology of the underlying roughness. When the micellar adsorbed amount is small, the layers show low capacity for protein complexation and low resistance to loading; when the micellar adsorbed amount is high, the situation is reversed. The adsorbed layers, with or without added protein, are found to be irreversibly adsorbed on the Ag surface.

  13. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performances, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  14. The surface chemistry of divalent metal carbonate minerals; a critical assessment of surface charge and potential data using the charge distribution multi-site ion complexation model

    NARCIS (Netherlands)

    Wolthers, M.; Charlet, L.; Van Cappellen, P.

    2008-01-01

    The Charge Distribution MUltiSite Ion Complexation or CD–MUSIC modeling approach is used to describe the chemical structure of carbonate mineral-aqueous solution interfaces. The new model extends existing surface complexation models of carbonate minerals by including atomic scale information on

  15. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  16. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    Science.gov (United States)

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified with a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV since the complexities in guiding-center orbits of particles and their collisions cannot be fully investigated by any means of analytic theories alone. Results yielded the details of the complex NTV dependency on particle precessions and collisions, which were predicted roughly in a combined analytic theory. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  17. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure including the order-of-accuracy test on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code, and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure at which each error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time), and under certain circumstances, can fail. (authors)
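
    The order-of-accuracy test at the core of this verification regime compares errors on successively refined grids against the formal order of the scheme. A minimal sketch on a manufactured solution, using a second-order central difference rather than the AHOTN discretization of the paper:

```python
import math

def u(x):
    # manufactured solution u(x) = sin(pi*x), so u'' = -pi^2 * u is known exactly
    return math.sin(math.pi * x)

def discretization_error(n):
    # max error of the second-order central difference for u'' on n intervals
    h = 1.0 / n
    err = 0.0
    for i in range(1, n):
        x = i * h
        approx = (u(x - h) - 2 * u(x) + u(x + h)) / h ** 2
        exact = -math.pi ** 2 * u(x)
        err = max(err, abs(approx - exact))
    return err

def observed_order(e_coarse, e_fine, refinement=2.0):
    # e ~ C * h^p  =>  p = log(e_coarse / e_fine) / log(refinement)
    return math.log(e_coarse / e_fine) / math.log(refinement)

p = observed_order(discretization_error(32), discretization_error(64))
print(f"observed order ~ {p:.2f}")
```

    An injected coding mistake that degrades the scheme would show up here as an observed order below the formal order of 2, which is exactly the signal the verification procedure looks for.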

  18. The step complexity measure for emergency operating procedures: measure verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Ha, Jaejoo; Park, Changkue

    2002-01-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. Therefore, to prevent the occurrence of accidents or to ensure system safety, extensive effort has been made to identify significant factors that can cause human errors. According to related studies, written manuals or operating procedures are one of the most important factors, and understandability is pointed out as one of the major reasons for procedure-related human errors. Many qualitative checklists have been suggested to evaluate emergency operating procedures (EOPs) of NPPs. However, since qualitative evaluations using checklists have some drawbacks, a quantitative measure that can quantify the complexity of EOPs is needed to compensate for them. In order to quantify the complexity of steps included in EOPs, Park et al. suggested the step complexity (SC) measure. In addition, to ascertain the appropriateness of the SC measure, averaged step performance time data obtained from emergency training records for the loss of coolant accident and the excess steam dump event were compared with estimated SC scores. Although the averaged step performance time data show good correlation with the estimated SC scores, conclusions on some important issues that have to be clarified to ensure the appropriateness of the SC measure could not be properly drawn because of a lack of backup data. In this paper, to clarify the remaining issues, additional activities to verify the appropriateness of the SC measure are performed using averaged step performance time data obtained from emergency training records. The total number of available records is 36, and the training scenarios are the steam generator tube rupture and the loss of all feedwater, with 18 records for each. From these emergency training records, averaged step performance time data for 30 steps are retrieved. As a result, the SC measure shows statistically meaningful
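
    Comparing averaged step performance times against SC scores is, at its simplest, a correlation computation. A sketch with hypothetical data, since the actual training-record values are not reproduced in the abstract:

```python
import math

def pearson(xs, ys):
    # Pearson product-moment correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical SC scores vs. averaged step performance times (seconds).
sc_scores = [2.1, 3.4, 3.9, 4.6, 5.2, 5.8]
times = [14.0, 21.0, 24.5, 30.0, 33.5, 40.0]
r = pearson(sc_scores, times)
print(f"r = {r:.3f}")
```

    A statistically meaningful positive correlation of this kind is what supports using the SC measure as a proxy for step difficulty.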

  19. Aliasing in the Complex Cepstrum of Linear-Phase Signals

    DEFF Research Database (Denmark)

    Bysted, Tommy Kristensen

    1997-01-01

    Assuming linear phase of the associated time signal, this paper presents an approximated analytical description of the unavoidable aliasing in practical use of complex cepstrums. The linear-phase assumption covers two major applications of complex cepstrums, which are linear- to minimum-phase FIR-filter transformation and minimum-phase estimation from amplitude specifications. The description is made in the cepstrum domain, the Fourier transform of the complex cepstrum and in the frequency domain. Two examples are given, one for verification of the derived equations and one using the description to reduce aliasing in minimum-phase estimation.
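
    Minimum-phase estimation from amplitude specifications — one of the two applications named above — works by folding the real cepstrum of the log magnitude onto the causal side. A pure-Python sketch using a small DFT, so that the cepstral aliasing the paper analyzes appears as a small but non-zero reconstruction error:

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * math.pi * k * m / n) for m in range(n))
            for k in range(n)]

def idft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(2j * math.pi * k * m / n) for k in range(n)) / n
            for m in range(n)]

def min_phase_from_magnitude(mag):
    # Real cepstrum of the log magnitude, folded onto the causal side,
    # gives the complex cepstrum of the minimum-phase equivalent.
    n = len(mag)
    c = idft([math.log(v) for v in mag])
    folded = ([c[0]] + [2 * c[m] for m in range(1, n // 2)]
              + [c[n // 2]] + [0j] * (n // 2 - 1))
    return [cmath.exp(v) for v in dft(folded)]

# Check on a known minimum-phase FIR filter h = (1, 0.5); with only N = 16
# spectral samples the cepstrum is aliased, so the reconstruction error is
# small but non-zero -- the effect the paper describes analytically.
N = 16
h = [1.0, 0.5] + [0.0] * (N - 2)
H = dft(h)
Hmin = min_phase_from_magnitude([abs(v) for v in H])
err = max(abs(a - b) for a, b in zip(H, Hmin))
print(f"max reconstruction error ~ {err:.1e}")
```

    Increasing N pushes the aliased cepstral tail further out and shrinks the error, which is the practical remedy the aliasing description motivates.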

  20. Adaptive Surface Modeling of Soil Properties in Complex Landforms

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2017-06-01

    Full Text Available Abstract: Spatial discontinuity often causes poor accuracy when a single model is used for the surface modeling of soil properties in complex geomorphic areas. Here we present a method for adaptive surface modeling of combined secondary variables to improve prediction accuracy during the interpolation of soil properties (ASM-SP). Using various secondary variables and multiple base interpolation models, ASM-SP was used to interpolate soil K+ in a typical complex geomorphic area (Qinghai Lake Basin, China). Five methods, including inverse distance weighting (IDW), ordinary kriging (OK), and OK combined with different secondary variables (e.g., OK-Landuse, OK-Geology, and OK-Soil), were used to validate the proposed method. The mean error (ME), mean absolute error (MAE), root mean square error (RMSE), mean relative error (MRE), and accuracy (AC) were used as evaluation indicators. Results showed that: (1) The OK interpolation result is spatially smooth with a weak bull's-eye effect, while IDW shows a relatively stronger bull's-eye effect; both have obvious deficiencies in depicting the spatial variability of soil K+. (2) The methods incorporating combinations of different secondary variables (e.g., ASM-SP, OK-Landuse, OK-Geology, and OK-Soil) were associated with lower estimation bias. Compared with IDW, OK, OK-Landuse, OK-Geology, and OK-Soil, the accuracy of ASM-SP increased by 13.63%, 10.85%, 9.98%, 8.32%, and 7.66%, respectively. Furthermore, ASM-SP was more stable, with lower MEs, MAEs, RMSEs, and MREs. (3) ASM-SP presents more detail than the other methods at abrupt boundaries, which renders the result consistent with the true secondary variables. In conclusion, ASM-SP can not only consider the nonlinear relationship between secondary variables and soil properties, but can also adaptively combine the advantages of multiple models, which contributes to making the spatial interpolation of soil K+ more reasonable.
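    As a rough illustration of the IDW baseline and the error indicators used in the comparison above, here is a minimal pure-Python sketch; the power parameter p = 2 and the toy sample data are assumptions.

```python
import math

def idw(x, y, samples, p=2.0):
    """Inverse distance weighting at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                      # exact hit: return the sample itself
        w = 1.0 / d2 ** (p / 2.0)         # weight = 1 / distance**p
        num += w * v
        den += w
    return num / den

def errors(observed, predicted):
    """Mean error, mean absolute error and root mean square error."""
    resid = [pr - ob for ob, pr in zip(observed, predicted)]
    n = len(resid)
    me = sum(resid) / n
    mae = sum(abs(r) for r in resid) / n
    rmse = math.sqrt(sum(r * r for r in resid) / n)
    return me, mae, rmse

samples = [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0)]
pred = idw(0.5, 0.0, samples)             # midpoint: equal weights -> 2.0
me, mae, rmse = errors([2.2], [pred])
print(pred, me, mae, rmse)
```

    A negative ME indicates underestimation on average, while MAE and RMSE measure the magnitude of the residuals, which is how the five interpolators are ranked in the abstract.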

  1. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  2. Pure Surface Texture Mapping Technology and its Application for Mirror Image

    Directory of Open Access Journals (Sweden)

    Wei Feng Wang

    2013-02-01

    Full Text Available Based on the study of pure surface texture mapping technology, a pure texture surface rendering method is proposed. The method combines pure surface texture rendering with mirror views, rendering refraction, reflection, and flowing water-ripple effects in real time. Experiments verify the validity of the algorithm.

  3. Surface Complexation Modeling in Variable Charge Soils: Prediction of Cadmium Adsorption

    Directory of Open Access Journals (Sweden)

    Giuliano Marchi

    2015-10-01

    Full Text Available ABSTRACT Intrinsic equilibrium constants for 22 representative Brazilian Oxisols were estimated from a cadmium adsorption experiment. Equilibrium constants were fitted to two surface complexation models: diffuse layer and constant capacitance. Intrinsic equilibrium constants were optimized by FITEQL and by hand calculation using Visual MINTEQ in sweep mode, and Excel spreadsheets. Data from both models were incorporated into Visual MINTEQ. Constants estimated by FITEQL and incorporated in Visual MINTEQ software failed to predict observed data accurately. However, FITEQL raw output data rendered good results when predicted values were directly compared with observed values, instead of incorporating the estimated constants into Visual MINTEQ. Intrinsic equilibrium constants optimized by hand calculation and incorporated in Visual MINTEQ reliably predicted Cd adsorption reactions on soil surfaces under changing environmental conditions.

  4. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  5. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics offers a solution to problems that arise from password-based data access, such as forgetting a password or having to recall many different passwords. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user has the authority to access the data or not. Facial biometrics is chosen for its low implementation cost and its reasonably accurate user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce dimensionality as well as encrypt the facial test image by representing it as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm. Two sparse coding algorithms, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signal is then compared, via the Euclidean norm, with the sparse signal of the user previously stored in the system to determine the validity of the facial test image. With a non-optimized sensing matrix, the system accuracy obtained in this research is 99% for IRLS with a verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds; with an optimized sensing matrix, accuracy is 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds.
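    The OMP reconstruction step can be sketched as follows. For a deterministic check the dictionary here is orthonormal, whereas the paper's sensing matrix is rectangular and possibly optimized, so the setup is an illustrative assumption.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily pick k columns of A to explain y."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(k):
        # Column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit y on all chosen columns (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((40, 40)))   # orthonormal columns
x_true = np.zeros(40)
x_true[[3, 17]] = [2.0, -1.5]                        # 2-sparse ground truth
y = A @ x_true
x_hat = omp(A, y, k=2)
print(np.max(np.abs(x_hat - x_true)))
```

    With orthonormal columns the correlations A.T @ y equal the true coefficients, so the two support indices are recovered exactly; a verification decision would then threshold the Euclidean norm between x_hat and the enrolled user's sparse signal.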

  6. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  7. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  8. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  9. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification

    Science.gov (United States)

    Hanson, John M.; Beard, Bernard B.

    2010-01-01

    This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
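    The order-statistics sample-count argument above can be sketched: if at most a given number of failing samples is observed, the binomial tail gives the smallest n that demonstrates the requirement at the stated confidence, and the zero-failure special case reduces to the familiar n = 22 for 90% of cases at 90% confidence. The function below is a generic sketch, not the paper's exact procedure.

```python
import math

def samples_needed(p, confidence, failures=0):
    """Smallest n such that observing at most `failures` failing samples
    demonstrates P(requirement met) >= p at the given confidence.
    Finds the first n with  P(Binomial(n, 1-p) <= failures) <= 1 - confidence."""
    risk = 1.0 - confidence
    q = 1.0 - p
    n = failures + 1
    while True:
        tail = sum(math.comb(n, i) * q**i * p**(n - i)
                   for i in range(failures + 1))
        if tail <= risk:
            return n
        n += 1

# "Met for at least 90% of cases" with 10% consumer risk, zero failures.
print(samples_needed(0.90, 0.90))   # classic result: 22 samples
print(samples_needed(0.95, 0.90))   # tighter requirement needs more samples
```

    With zero failures the tail collapses to p**n, so the zero-failure count is just ceil(ln(risk)/ln(p)); allowing observed failures increases the required n.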

  10. Engineering yeast consortia for surface-display of complex cellulosome structures

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Wilfred [University of Delaware

    2014-03-31

    As our society marches toward a more technologically advanced future, energy and environmental sustainability are among the most challenging problems we face today. Biomass is one of the most abundant renewable feedstocks for sustainable production of biofuels. However, the main technological obstacle to more widespread use of this resource is the lack of low-cost technologies to overcome the recalcitrant nature of the cellulosic structure, especially the hydrolysis step on highly ordered celluloses. In this project, we successfully engineered several efficient and inexpensive whole-cell biocatalysts in an effort to produce economically compatible and sustainable biofuels, namely cellulosic ethanol. Our approach was to display a highly efficient cellulolytic enzyme complex, named the cellulosome, on the surface of a historical ethanol producer, Saccharomyces cerevisiae, for the simultaneous and synergistic saccharification and fermentation of cellulose to ethanol. We first demonstrated the feasibility of assembling a mini-cellulosome by incubating E. coli lysates expressing three different cellulases. Resting cells displaying mini-cellulosomes produced 4-fold more ethanol from phosphoric acid-swollen cellulose (PASC) than cultures with only added enzymes. The flexibility of assembling the mini-cellulosome structure was further demonstrated using a synthetic yeast consortium through intracellular complementation. Direct ethanol production from PASC was demonstrated with resting cell cultures. To create a microorganism suitable for a more cost-effective process, called consolidated bioprocessing (CBP), a synthetic consortium capable of displaying mini-cellulosomes on the cell surface via intercellular complementation was created. To further improve the efficiency, a new adaptive strategy of employing anchoring and adaptor scaffoldins to amplify the number of enzymatic subunits was developed, resulting in the creation of an artificial tetravalent cellulosome on the

  11. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
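    The lexical-matching ingredient can be illustrated with a simple token-based similarity. Jaccard similarity and the greedy pairing below are stand-ins, not ASMOV's actual combined lexical/structural/extensional measure with semantic verification.

```python
def lexical_similarity(label_a, label_b):
    """Token-set Jaccard similarity between two ontology entity labels."""
    ta = set(label_a.lower().replace('_', ' ').split())
    tb = set(label_b.lower().replace('_', ' ').split())
    if not ta and not tb:
        return 1.0
    return len(ta & tb) / len(ta | tb)

def greedy_alignment(entities_a, entities_b, threshold=0.5):
    """Greedily pair entities whose label similarity exceeds the threshold."""
    pairs, used = [], set()
    for a in entities_a:
        best = max(((lexical_similarity(a, b), b) for b in entities_b
                    if b not in used), default=(0.0, None))
        if best[1] is not None and best[0] >= threshold:
            pairs.append((a, best[1], best[0]))
            used.add(best[1])
    return pairs

align = greedy_alignment(["blood_pressure", "heart_rate"],
                         ["heart rate", "blood pressure measurement"])
print(align)
```

    A real matcher would iterate this with structural evidence and then verify the resulting alignment for semantic inconsistencies, which is the step ASMOV adds.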

  12. Interpolant tree automata and their application in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2016-01-01

    This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper. Evaluation on Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.

  13. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty, whether the NPT or the CWC, one must make a technical comparison of existing information-gathering capabilities against the constraints in the agreement, and then decide whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture that verification approaches have traditionally diverged: nuclear safeguards requirements have taken one path while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced, to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures. Together these developments translate into a technological overlap of certain institutional verification measures, such as the NPT's safeguards requirements implemented by the IAEA and the CWC's verification provisions implemented by the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account existing inter-linkages between nuclear, chemical, and biological weapons issues.
Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments

  14. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to that of nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  15. Riemann surfaces of complex classical trajectories and tunnelling splitting in one-dimensional systems

    Science.gov (United States)

    Harada, Hiromitsu; Mouchet, Amaury; Shudo, Akira

    2017-10-01

    The topology of complex classical paths is investigated to discuss quantum tunnelling splittings in one-dimensional systems. Here the Hamiltonian is assumed to be given as polynomial functions, so the fundamental group for the Riemann surface provides complete information on the topology of complex paths, which allows us to enumerate all the possible candidates contributing to the semiclassical sum formula for tunnelling splittings. This naturally leads to action relations among classically disjoined regions, revealing entirely non-local nature in the quantization condition. The importance of the proper treatment of Stokes phenomena is also discussed in Hamiltonians in the normal form.

  16. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on the functional correctness of the HDL designs. Nuclear regulation authorities, however, require a more considered demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even if the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of third-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and implemented translator. It

  17. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on the functional correctness of the HDL designs. Nuclear regulation authorities, however, require a more considered demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even if the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of third-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and implemented translator. It

  18. Final Technical Report: Metal—Organic Surface Catalyst for Low-temperature Methane Oxidation: Bi-functional Union of Metal—Organic Complex and Chemically Complementary Surface

    Energy Technology Data Exchange (ETDEWEB)

    Tait, Steven L. [Indiana Univ., Bloomington, IN (United States)

    2016-10-01

    Stabilization and chemical control of transition metal centers is a critical problem in the advancement of heterogeneous catalysts to next-generation catalysts that exhibit high levels of selectivity, while maintaining strong activity and facile catalyst recycling. Supported metal nanoparticle catalysts typically suffer from having a wide range of metal sites with different coordination numbers and varying chemistry. This project is exploring new possibilities in catalysis by combining features of homogeneous catalysts with those of heterogeneous catalysts to develop new, bi-functional systems. The systems are more complex than traditional heterogeneous catalysts in that they utilize sequential active sites to accomplish the desired overall reaction. The interaction of metal—organic catalysts with surface supports and their interactions with reactants to enable the catalysis of critical reactions at lower temperatures are at the focus of this study. Our work targets key fundamental chemistry problems. How do the metal—organic complexes interact with the surface? Can those metal center sites be tuned for selectivity and activity as they are in the homogeneous system by ligand design? What steps are necessary to enable a cooperative chemistry to occur and open opportunities for bi-functional catalyst systems? Study of these systems will develop the concept of bringing together the advantages of heterogeneous catalysis with those of homogeneous catalysis, and take this a step further by pursuing the objective of a bi-functional system. The use of metal-organic complexes in surface catalysts is therefore of interest to create well-defined and highly regular single-site centers. While these are not likely to be stable in the high temperature environments (> 300 °C) typical of industrial heterogeneous catalysts, they could be applied in moderate temperature reactions (100-300 °C), made feasible by lowering reaction temperatures by better catalyst control. They also

  19. Formal specification and verification of interactive systems with plasticity: Applications to nuclear-plant supervision

    International Nuclear Information System (INIS)

    Oliveira, Raquel Araujo de

    2015-01-01

    The advent of ubiquitous computing and the increasing variety of platforms and devices change user expectations in terms of user interfaces. Systems should be able to adapt themselves to their context of use, i.e., the platform (e.g. a PC or a tablet), the users who interact with the system (e.g. administrators or regular users), and the environment in which the system executes (e.g. a dark room or outdoors). The capacity of a UI to withstand variations in its context of use while preserving usability is called plasticity. Plasticity provides users with different versions of a UI. Although it enhances UI capabilities, plasticity adds complexity to the development of user interfaces: the consistency between multiple versions of a given UI should be ensured. Given the large number of possible versions of a UI, it is time-consuming and error-prone to check these requirements by hand. Some automation must be provided to verify plasticity. This complexity is further increased when it comes to the UIs of safety-critical systems. Safety-critical systems are systems in which a failure has severe consequences. The complexity of such systems is reflected in their UIs, which are now expected not only to provide correct, intuitive, non-ambiguous and adaptable means for users to accomplish a goal, but also to cope with safety requirements aiming to make sure that systems are reasonably safe before they enter the market. Several techniques to ensure the quality of systems in general exist, and they can also be applied to safety-critical systems. Formal verification provides a rigorous way to perform verification, which is suitable for safety-critical systems. Our contribution is an approach to verifying safety-critical interactive systems provided with plastic UIs using formal methods. Using powerful tool support, our approach permits: the verification of sets of properties over a model of the system. Using model checking, our approach permits the verification of properties over the system formal

  20. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for the sequence diagrams of power plants has been developed. The system's main function is to verify the correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams), in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatch between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)
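    The symbolic comparison of two logic representations can be illustrated by exhaustive enumeration over a small combinational function; the interlock below is a hypothetical example, not taken from the paper, and real tools compare symbolically rather than by full enumeration.

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Exhaustively compare two combinational logic functions over all
    2**n_inputs assignments; return the first mismatching assignment,
    or None if the two logics agree everywhere."""
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return bits                       # witness of the mismatch
    return None

# Hypothetical interlock: trip if (high pressure AND NOT bypass) OR manual trip.
ibd_logic = lambda hp, byp, man: (hp and not byp) or man
# The same logic as realized by a (correct) sequence diagram, written differently.
seq_logic = lambda hp, byp, man: man or (not byp and hp)
# A faulty variant that drops the bypass inhibit.
bad_logic = lambda hp, byp, man: hp or man

print(equivalent(ibd_logic, seq_logic, 3))   # None: the logics match
print(equivalent(ibd_logic, bad_logic, 3))   # first mismatching assignment
```

    Returning the mismatching assignment mirrors how the described system points out the sub-circuit responsible for a mismatch between the IBD logic and the sequence diagram.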

  1. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
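    The two evaluation modes the library provides, value with gradient at a point and interval bounds on a box, can be sketched in Python with forward-mode dual numbers and naive interval arithmetic; the operator subset and the example function are assumptions, and the C++ library is certainly richer.

```python
class Dual:
    """Forward-mode value + derivative."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v * o.v, self.v * o.d + self.d * o.v)
    __rmul__ = __mul__

class Interval:
    """Naive interval arithmetic: [lo, hi] bounds."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        o = o if isinstance(o, Interval) else Interval(o, o)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Interval) else Interval(o, o)
        ps = [a * b for a in (self.lo, self.hi) for b in (o.lo, o.hi)]
        return Interval(min(ps), max(ps))
    __rmul__ = __mul__

def f(x):                    # a single description drives every evaluation mode
    return x * x + 3 * x

y = f(Dual(2.0, 1.0))        # value and gradient at x = 2
box = f(Interval(0.0, 1.0))  # enclosure of f over the box [0, 1]
print(y.v, y.d, box.lo, box.hi)
```

    The interval result is a guaranteed enclosure but not necessarily tight (here the true range of f on [0, 1] is [0, 4], which the naive arithmetic happens to match).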

  2. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  3. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  4. Noncontact Surface Roughness Estimation Using 2D Complex Wavelet Enhanced ResNet for Intelligent Evaluation of Milled Metal Surface Quality

    Directory of Open Access Journals (Sweden)

    Weifang Sun

    2018-03-01

    Full Text Available Machined surfaces are rough from a microscopic perspective no matter how finely they are finished. Surface roughness is an important factor to consider during production quality control. Using modern techniques, surface roughness measurements are beneficial for improving machining quality. With optical imaging of machined surfaces as input, a convolutional neural network (CNN) can be utilized as an effective way to characterize hierarchical features without prior knowledge. In this paper, a novel CNN-based method is proposed for intelligent surface roughness identification. The technical scheme incorporates three elements: texture skew correction, image filtering, and intelligent neural network learning. Firstly, a texture skew correction algorithm, based on an improved Sobel operator and the Hough transform, is applied so that surface texture directions can be adjusted. Secondly, the two-dimensional (2D) dual-tree complex wavelet transform (DTCWT) is employed to retrieve surface topology information, which is more effective for feature classification. In addition, a residual network (ResNet) is utilized to ensure automatic recognition of the filtered texture features. The feasibility and effectiveness of the proposed method were verified in actual surface roughness estimation experiments on spheroidal graphite cast iron 500-7 at an agricultural machinery manufacturing company. Testing results demonstrate that the proposed method achieves high-precision surface roughness estimation.

  5. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: The approach is based on automated analysis of the characteristics of the system and on controlling the choice of existing open-source model-verification engines, with the model verification producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  7. Incorporating classic adsorption isotherms into modern surface complexation models: implications for sorption of radionuclides

    International Nuclear Information System (INIS)

    Kulik, D.A.

    2005-01-01

    Full text of publication follows: Computer-aided surface complexation models (SCM) tend to replace the classic adsorption isotherm (AI) analysis in describing mineral-water interface reactions such as radionuclide sorption onto (hydr)oxides and clays. Any site-binding SCM based on the mole balance of surface sites in fact reproduces the (competitive) Langmuir isotherm, optionally amended with an electrostatic Coulombic non-ideal term. In most SCM implementations, it is difficult to incorporate real-surface phenomena (site heterogeneity, lateral interactions, surface condensation) described in classic AI approaches other than Langmuir's. Thermodynamic relations between SCMs and AIs that remained obscure in the past have recently been clarified using new definitions of standard and reference states of surface species [1,2]. On this basis, a method for separating the Langmuir AI into ideal (linear) and non-ideal parts [2] was applied to multi-dentate Langmuir, Frumkin, and BET isotherms. The aim of this work was to obtain the surface activity coefficient terms that make the SCM site mole balance constraints obsolete and, in this way, extend thermodynamic SCMs to cover sorption phenomena described by the respective AIs. The multi-dentate Langmuir term accounts for site saturation with n-dentate surface species, as illustrated by modeling bi-dentate U(VI) complexes on goethite or SiO2 surfaces. The Frumkin term corrects for the lateral interactions of mono-dentate surface species; in particular, it has the same form as the Coulombic term of the constant-capacitance EDL combined with the Langmuir term. The BET term (three parameters) accounts for more than monolayer adsorption up to surface condensation; it can potentially describe the surface precipitation of nickel and other cations on hydroxides and clay minerals. All three non-ideal terms (in the GEM SCM implementation [1,2]) are by now used for non-competing surface species only. Upon 'surface dilution
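    The Langmuir and Frumkin isotherms discussed in this record are easy to illustrate numerically. Below is a minimal Python sketch (not the GEM SCM implementation the record refers to); the constants K, c and the lateral-interaction parameter g are hypothetical.

```python
import math

def langmuir(K, c):
    # Ideal Langmuir coverage: theta = K*c / (1 + K*c)
    return K * c / (1.0 + K * c)

def frumkin(K, c, g, tol=1e-12):
    """Frumkin coverage theta satisfying theta/(1-theta) = K*c*exp(-g*theta),
    solved by fixed-point iteration; g > 0 models repulsive lateral
    interactions, and g = 0 recovers the Langmuir isotherm."""
    theta = langmuir(K, c)  # Langmuir value as the starting guess
    for _ in range(500):
        u = K * c * math.exp(-g * theta)
        new = u / (1.0 + u)
        if abs(new - theta) < tol:
            break
        theta = new
    return theta

# With no lateral interaction the two isotherms coincide
assert abs(frumkin(2.0, 1.0, 0.0) - langmuir(2.0, 1.0)) < 1e-9
# Repulsive interactions lower the equilibrium coverage
assert frumkin(2.0, 1.0, 1.5) < langmuir(2.0, 1.0)
```

    Fixed-point iteration converges here because the mapping is a contraction for moderate g; stiffer parameter sets would call for a bracketing root finder.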

  8. Results of Remediation and Verification Sampling for the 600-270 Horseshoe Landfill

    Energy Technology Data Exchange (ETDEWEB)

    W. S. Thompson

    2005-12-14

    This report presents the results of the 2005 remedial action and verification soil sampling conducted at the 600-270 waste site after removal of soil containing residual concentrations of dichlorodiphenyl trichloroethane and its breakdown products dichlorodiphenyl dichloroethylene and dichlorodiphenyl dichloroethane. The remediation was performed in response to post-closure surface soil sampling performed between 1998 and 2003 that indicated the presence of residual DDT contamination exceeding the Record of Decision for the 1100 Area National Priorities List site cleanup criteria of 1 mg/kg that was established for the original 1994 cleanup activities.

  9. Results of Remediation and Verification Sampling for the 600-270 Horseshoe Landfill

    International Nuclear Information System (INIS)

    Thompson, W.S.

    2005-01-01

    This report presents the results of the 2005 remedial action and verification soil sampling conducted at the 600-270 waste site after removal of soil containing residual concentrations of dichlorodiphenyl trichloroethane and its breakdown products dichlorodiphenyl dichloroethylene and dichlorodiphenyl dichloroethane. The remediation was performed in response to post-closure surface soil sampling performed between 1998 and 2003 that indicated the presence of residual DDT contamination exceeding the Record of Decision for the 1100 Area National Priorities List site cleanup criteria of 1 mg/kg that was established for the original 1994 cleanup activities.

  10. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  11. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  12. Computational Complexity of Combinatorial Surfaces

    NARCIS (Netherlands)

    Vegter, Gert; Yap, Chee K.

    1990-01-01

    We investigate the computational problems associated with combinatorial surfaces. Specifically, we present an algorithm (based on the Brahana-Dehn-Heegaard approach) for transforming the polygonal schema of a closed triangulated surface into its canonical form in O(n log n) time, where n is the
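    The record concerns canonical forms of polygonal schemata; a much simpler invariant of a closed triangulated surface, the Euler characteristic, already determines the genus in the orientable case. A minimal sketch (the tetrahedron and the standard 7-vertex torus triangulation supply the counts):

```python
def euler_characteristic(vertices, edges, faces):
    # chi = V - E + F for a closed triangulated surface
    return vertices - edges + faces

def genus_orientable(chi):
    # For a closed orientable surface, chi = 2 - 2g
    assert (2 - chi) % 2 == 0
    return (2 - chi) // 2

# Tetrahedron triangulates the sphere: V=4, E=6, F=4 -> chi=2, genus 0
assert genus_orientable(euler_characteristic(4, 6, 4)) == 0
# 7-vertex triangulation of the torus: V=7, E=21, F=14 -> chi=0, genus 1
assert genus_orientable(euler_characteristic(7, 21, 14)) == 1
```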

  13. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    Four verification games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify software. In Paradox, human players are never given small optimization problems (for example, toggling the values of 50...).

  14. Covalent attachment of pyridine-type molecules to glassy carbon surfaces by electrochemical reduction of in situ generated diazonium salts. Formation of ruthenium complexes on ligand-modified surfaces

    International Nuclear Information System (INIS)

    Yesildag, Ali; Ekinci, Duygu

    2010-01-01

    In this study, pyridine, quinoline and phenanthroline molecules were covalently bonded to glassy carbon (GC) electrode surfaces for the first time using the diazonium modification method. Then, the complexation ability of the modified films with ruthenium metal cations was investigated. The derivatization of GC surfaces with heteroaromatic molecules was achieved by electrochemical reduction of the corresponding in situ generated diazonium salts. X-ray photoelectron spectroscopy (XPS) was used to confirm the attachment of heteroaromatic molecules to the GC surfaces and to determine the surface concentration of the films. The barrier properties of the modified GC electrodes were studied in the presence of redox probes such as Fe(CN) 6 3- and Ru(NH 3 ) 6 3+ by cyclic voltammetry. Additionally, the presence of the resulting organometallic films on the surfaces was verified by XPS after the chemical transformation of the characterized ligand films to the ruthenium complex films. The electrochemical behavior of these films in acetonitrile solution was investigated using voltammetric methods, and the surface coverage of the organometallic films was determined from the reversible metal-based Ru(II)/Ru(III) oxidation waves.

  15. Structure and reactivity of heterogeneous surfaces and study of the geometry of surface complexes. Progress report, January 1, 1984-December 31, 1984

    International Nuclear Information System (INIS)

    Landman, U.

    1984-01-01

    Since the beginning of this project, our group has been involved in theoretical studies of surface phenomena and processes, aimed toward increasing our understanding of fundamental processes which govern the properties of material surfaces. Our studies cover a wide spectrum of surface phenomena: surface reactivity, surface crystallography, electronic and vibrational structure, dynamical processes, phase transformations and phase change, the properties of interfaces and investigations of material processing and novel materials preparation techniques. In these investigations we develop and employ analytical and novel numerical, simulation, methods for the study of complex surface phenomena. Our recent surface molecular dynamics studies and simulations of laser annealing phenomena opened new avenues for the investigation of the microscopic dynamics and evolution of equilibrium and non-equilibrium processes at surfaces and interfaces. Our current studies of metallic glasses using a new langrangian formulation which includes all components of the total energy (density dependent electron gas, single particle and pair interactions) of the system, represents a novel approach for theoretical studies of this important class of systems

  16. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun, Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  17. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.
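    The rule side of such a knowledge base can be pictured as predicates evaluated against a product description. The sketch below is a toy illustration, not the paper's ontology/Web-service system; the rule names, limits and fields are hypothetical.

```python
# Hypothetical regulation rules expressed as (name, predicate, message)
RULES = [
    ("max-voltage", lambda p: p["voltage_v"] <= 250,
     "operating voltage must not exceed 250 V"),
    ("min-clearance", lambda p: p["clearance_mm"] >= 3.0,
     "clearance distance must be at least 3 mm"),
]

def verify_design(product):
    """Return a list of (rule name, message) for every violated rule."""
    return [(name, msg) for name, check, msg in RULES if not check(product)]

design = {"voltage_v": 230, "clearance_mm": 2.5}
violations = verify_design(design)
assert violations == [("min-clearance",
                       "clearance distance must be at least 3 mm")]
```

    Keeping the rules as data rather than hard-coded checks mirrors the paper's point: the knowledge base can grow without changing the verification service itself.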

  18. Compromises produced by the dialectic between self-verification and self-enhancement.

    Science.gov (United States)

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: enhancement and verification were established by calibrating evaluative feedback against self-appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  19. Hydrous ferric oxide: evaluation of Cd-HFO surface complexation models combining Cd(K) EXAFS data, potentiometric titration results, and surface site structures identified from mineralogical knowledge.

    Science.gov (United States)

    Spadini, Lorenzo; Schindler, Paul W; Charlet, Laurent; Manceau, Alain; Vala Ragnarsdottir, K

    2003-10-01

    The surface properties of ferrihydrite were studied by combining wet chemical data, Cd(K) EXAFS data, and a surface structure and protonation model of the ferrihydrite surface. Acid-base titration experiments and Cd(II)-ferrihydrite sorption experiments were performed. The titration data could be adequately modeled by the reaction ≡Fe-OH2(+1/2) = ≡Fe-OH(-1/2) + H(+), log k(int) = -8.29, assuming the existence of a unique intrinsic microscopic constant, log k(int), and consequently the existence of a single significant type of acid-base reactive functional group. The surface structure model indicates that these groups are terminal water groups. The Cd(II) data were modeled assuming the existence of a single reactive site. The model fits the data set at low Cd(II) concentration and up to 50% surface coverage. At high coverage, more Cd(II) ions than predicted are adsorbed, which is indicative of the existence of a second type of site of lower affinity. This agrees with the surface structure and protonation model developed, which indicates comparable concentrations of high- and low-affinity sites. The model further shows that for each class of low- and high-affinity sites there exists a variety of corresponding Cd surface complex structures, depending on the model crystal faces on which the complexes develop. Generally, high-affinity surface structures have surface coordinations of 3 and 4, as compared to 1 and 2 for low-affinity surface structures.
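    The record's finding — a one-site model that fits up to roughly 50% coverage but under-predicts adsorption at high coverage — is the classic signature of a second, lower-affinity site. A toy two-site Langmuir sketch, with all constants hypothetical:

```python
def one_site(K, S, c):
    # Single-site Langmuir: adsorbed = S*K*c / (1 + K*c)
    return S * K * c / (1.0 + K * c)

def two_site(K_hi, S_hi, K_lo, S_lo, c):
    # High-affinity plus low-affinity sites, each Langmuir-like
    return one_site(K_hi, S_hi, c) + one_site(K_lo, S_lo, c)

# Hypothetical constants: the strong site saturates first; the weak site
# contributes mainly at high Cd(II) concentration (high coverage)
extra_low = two_site(1e6, 1.0, 1e3, 1.0, 1e-6) - one_site(1e6, 1.0, 1e-6)
extra_high = two_site(1e6, 1.0, 1e3, 1.0, 1e-3) - one_site(1e6, 1.0, 1e-3)
assert extra_high > extra_low
```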

  20. Design Process Control for Improved Surface Finish of Metal Additive Manufactured Parts of Complex Build Geometry

    Directory of Open Access Journals (Sweden)

    Mikdam Jamal

    2017-12-01

    Full Text Available Metal additive manufacturing (AM is increasingly used to create complex 3D components at near net shape. However, the surface finish (SF of the metal AM part is uneven, with surface roughness being variable over the facets of the design. Standard post-processing methods such as grinding and linishing often meet with major challenges in finishing parts of complex shape. This paper reports on research that demonstrated that mass finishing (MF processes are able to deliver high-quality surface finishes (Ra and Sa on AM-generated parts of a relatively complex geometry (both internal features and external facets under select conditions. Four processes were studied in this work: stream finishing, high-energy (HE centrifuge, drag finishing and disc finishing. Optimisation of the drag finishing process was then studied using a structured design of experiments (DOE. The effects of a range of finishing parameters were evaluated and optimal parameters and conditions were determined. The study established that the proposed method can be successfully applied in drag finishing to optimise the surface roughness in an industrial application and that it is an economical way of obtaining the maximum amount of information in a short period of time with a small number of tests. The study has also provided an important step in helping understand the requirements of MF to deliver AM-generated parts to a target quality finish and cycle time.

  1. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently

  2. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the verification of the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
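    The general pattern of such a thermal-model verification — comparing a code's temperature calculation against an independent solution — can be sketched on a textbook problem. This is not FRAPCON or ABAQUS; it checks a finite-difference solve of steady 1-D conduction with uniform heat generation against the analytic profile.

```python
def analytic_T(x, L, q, k):
    # Steady 1-D slab, uniform heat generation q, T=0 at both faces:
    # T(x) = q*x*(L-x)/(2k)
    return q * x * (L - x) / (2.0 * k)

def fd_temperatures(n, L, q, k):
    """Second-order finite differences for -k*T'' = q with T(0)=T(L)=0,
    solved with the Thomas (tridiagonal) algorithm on n interior nodes."""
    h = L / (n + 1)
    a = [-1.0] * n            # sub-diagonal
    b = [2.0] * n             # diagonal
    c = [-1.0] * n            # super-diagonal
    d = [q * h * h / k] * n   # right-hand side
    for i in range(1, n):     # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    T = [0.0] * n             # back substitution
    T[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        T[i] = (d[i] - c[i] * T[i + 1]) / b[i]
    return T

n, L, q, k = 9, 1.0, 1000.0, 10.0
T_num = fd_temperatures(n, L, q, k)
h = L / (n + 1)
for i, T in enumerate(T_num, start=1):
    # central differences are exact for a quadratic solution
    assert abs(T - analytic_T(i * h, L, q, k)) < 1e-9
```

    For this model problem the discretization error vanishes (the exact solution is quadratic), so any disagreement would point at a coding error — the essence of code verification as opposed to validation.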

  3. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  4. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  5. Investigation of Navier-Stokes Code Verification and Design Optimization

    Science.gov (United States)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification, a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. This dissertation focuses on the finite volume (FV) formulation. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single-element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite-rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design, whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization
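    The LSE idea of improving accuracy by combining solutions from grids of different resolution is related to classical Richardson extrapolation, which is easy to sketch (this illustrates the simpler classical method, not LSE itself):

```python
def richardson(f_coarse, f_fine, r=2.0, p=2.0):
    """Richardson extrapolation: given a grid functional computed on a
    coarse grid and on a grid refined by factor r, with a scheme of
    formal order p, estimate the grid-converged value."""
    return f_fine + (f_fine - f_coarse) / (r**p - 1.0)

# Model problem: approximate f'(1) for f(x) = x**3 with central
# differences, a second-order (p = 2) scheme
def dfdx(h):
    f = lambda x: x**3
    return (f(1 + h) - f(1 - h)) / (2 * h)

coarse, fine = dfdx(0.2), dfdx(0.1)
extrap = richardson(coarse, fine)
# the extrapolated value beats both grid values (exact answer is 3)
assert abs(extrap - 3.0) < abs(fine - 3.0) < abs(coarse - 3.0)
```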

  6. inner-sphere complexation of cations at the rutile-water interface: A concise surface structural interpretation with the CD and MUSIC model

    Energy Technology Data Exchange (ETDEWEB)

    Ridley, Mora K. [Texas Tech University, Lubbock; Hiemstra, T [Oak Ridge National Laboratory (ORNL); Van Riemsdijk, Willem H. [Wageningen University and Research Centre, The Netherlands; Machesky, Michael L. [Illinois State Water Survey, Champaign, IL

    2009-01-01

    Acid-base reactivity and ion-interaction between mineral surfaces and aqueous solutions is most frequently investigated at the macroscopic scale as a function of pH. Experimental data are then rationalized by a variety of surface complexation models. These models are thermodynamically based which in principle does not require a molecular picture. The models are typically calibrated to relatively simple solid-electrolyte solution pairs and may provide poor descriptions of complex multi-component mineral-aqueous solutions, including those found in natural environments. Surface complexation models may be improved by incorporating molecular-scale surface structural information to constrain the modeling efforts. Here, we apply a concise, molecularly-constrained surface complexation model to a diverse suite of surface titration data for rutile and thereby begin to address the complexity of multi-component systems. Primary surface charging curves in NaCl, KCl, and RbCl electrolyte media were fit simultaneously using a charge distribution (CD) and multisite complexation (MUSIC) model [Hiemstra T. and Van Riemsdijk W. H. (1996) A surface structural approach to ion adsorption: the charge distribution (CD) model. J. Colloid Interf. Sci. 179, 488-508], coupled with a Basic Stern layer description of the electric double layer. In addition, data for the specific interaction of Ca2+ and Sr2+ with rutile, in NaCl and RbCl media, were modeled. In recent developments, spectroscopy, quantum calculations, and molecular simulations have shown that electrolyte and divalent cations are principally adsorbed in various inner-sphere configurations on the rutile (110) surface [Zhang Z., Fenter P., Cheng L., Sturchio N. C., Bedzyk M. J., Předota M., Bandura A., Kubicki J., Lvov S. N., Cummings P. T., Chialvo A. A., Ridley M. K., Bénézeth P., Anovitz L., Palmer D. A., Machesky M. L. and Wesolowski D. J. (2004) Ion adsorption at the rutile-water interface: linking molecular and macroscopic

  7. Inner-sphere complexation of cations at the rutile-water interface: A concise surface structural interpretation with the CD and MUSIC model

    Science.gov (United States)

    Ridley, Moira K.; Hiemstra, Tjisse; van Riemsdijk, Willem H.; Machesky, Michael L.

    2009-04-01

    Acid-base reactivity and ion-interaction between mineral surfaces and aqueous solutions is most frequently investigated at the macroscopic scale as a function of pH. Experimental data are then rationalized by a variety of surface complexation models. These models are thermodynamically based which in principle does not require a molecular picture. The models are typically calibrated to relatively simple solid-electrolyte solution pairs and may provide poor descriptions of complex multi-component mineral-aqueous solutions, including those found in natural environments. Surface complexation models may be improved by incorporating molecular-scale surface structural information to constrain the modeling efforts. Here, we apply a concise, molecularly-constrained surface complexation model to a diverse suite of surface titration data for rutile and thereby begin to address the complexity of multi-component systems. Primary surface charging curves in NaCl, KCl, and RbCl electrolyte media were fit simultaneously using a charge distribution (CD) and multisite complexation (MUSIC) model [Hiemstra T. and Van Riemsdijk W. H. (1996) A surface structural approach to ion adsorption: the charge distribution (CD) model. J. Colloid Interf. Sci. 179, 488-508], coupled with a Basic Stern layer description of the electric double layer. In addition, data for the specific interaction of Ca2+ and Sr2+ with rutile, in NaCl and RbCl media, were modeled. In recent developments, spectroscopy, quantum calculations, and molecular simulations have shown that electrolyte and divalent cations are principally adsorbed in various inner-sphere configurations on the rutile (110) surface [Zhang Z., Fenter P., Cheng L., Sturchio N. C., Bedzyk M. J., Předota M., Bandura A., Kubicki J., Lvov S. N., Cummings P. T., Chialvo A. A., Ridley M. K., Bénézeth P., Anovitz L., Palmer D. A., Machesky M. L. and Wesolowski D. J. (2004) Ion adsorption at the rutile-water interface: linking molecular and macroscopic

  8. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literary sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
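    The "single description" idea — one expression usable both for point evaluation and for interval estimates on a box — can be sketched with operator overloading. This is a toy Python analogue, not the paper's C++ library.

```python
class Interval:
    """Minimal interval arithmetic: the same expression can be evaluated
    on numbers (point value) or on Interval objects (enclosure on a box)."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        o = o if isinstance(o, Interval) else Interval(o, o)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Interval) else Interval(o, o)
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))
    __rmul__ = __mul__

def f(x, y):
    # single description used for both point and interval evaluation
    return x * x + 2 * y + 1

assert f(2.0, 3.0) == 11.0          # point value
box = f(Interval(-1.0, 2.0), Interval(0.0, 1.0))
assert box.lo <= 1.0 and box.hi >= 7.0   # encloses the true range [1, 7]
```

    Note the enclosure is valid but not tight: naive x*x ignores the dependency between the two factors, one of the classic sources of interval overestimation.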

  9. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  10. Verification on spray simulation of a pintle injector for liquid rocket engine

    Science.gov (United States)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used for a liquid rocket engine is an injection system that has recently attracted renewed attention, known for its wide throttling ability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in a cold-flow experiment using water and air, a numerical simulation was adopted and a verification of the numerical model was subsequently conducted. For the verification process, three types of experimental data, including velocity distributions of gas flows, spray angles and liquid distribution, were compared with simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum gas velocities were within the acceptable range of agreement; however, the spray angles showed errors of up to 25% as the momentum ratio increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study specific flow characteristics of the pintle injector despite the limitations of two-dimensional and coarse grids.

  11. US monitoring and verification technology: on-site inspection experience and future challenges

    International Nuclear Information System (INIS)

    Gullickson, R.L.; Carlson, D.; Ingraham, J.; Laird, B.

    2013-01-01

    The United States has a long and successful history of cooperation with treaty partners in monitoring and verification. For strategic arms reduction treaties, our collaboration has resulted in the development and application of systems with limited complexity and intrusiveness. As we progress beyond New START (NST) along the 'road to zero', the reduced number of nuclear weapons is likely to require increased confidence in monitoring and verification techniques. This may place increased demands on the technology to verify the presence of a nuclear weapon and even confirm the presence of a certain type. Simultaneously, this technology must include the ability to protect each treaty partner's sensitive nuclear weapons information. Mutual development of this technology by treaty partners offers the best approach for acceptance in treaty negotiations. This same approach of mutual cooperation and development is essential for developing nuclear test monitoring technology in support of the Comprehensive Nuclear Test Ban Treaty (CTBT). Our ability to detect low yield and evasive testing will be enhanced through mutually developed techniques and experiments using laboratory laser experiments and high explosives tests in a variety of locations and geologies. (authors)

  12. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost-effective environmental data collection process should produce analytical data which meet regulatory and program-specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested; the verification process determines whether all the requirements were met. Validation is more complicated than verification: it attempts to assess the impacts on data use, especially when requirements are not met, and it becomes part of the decision-making process. Radiochemical data consist of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify the significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations.
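The statistical comparison of results with uncertainties can be sketched as below. The two-result significance test and the k = 1.96 coverage factor are illustrative assumptions, not a prescribed protocol from the paper:

```python
import math

def significantly_different(a, ua, b, ub, k=1.96):
    """Compare two radiochemical results a ± ua and b ± ub.

    Returns True when the difference exceeds k times the combined
    standard uncertainty (k = 1.96 approximates a 95% level)."""
    return abs(a - b) > k * math.sqrt(ua ** 2 + ub ** 2)

# Two results that agree within their uncertainties:
print(significantly_different(10.2, 1.0, 10.8, 1.0))   # False
# Two results whose difference is significant:
print(significantly_different(10.2, 0.2, 11.8, 0.2))   # True
```

Because each radiochemical result carries its own error term, a batch of data can be screened this way in a technically defensible manner, something not generally possible for hazardous chemical data.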

  13. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  14. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  15. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  16. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
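The 2x2 contingency-table assessment of yes/no forecasts such as CME arrival can be sketched as below. The counts and the choice of POD, FAR and the Heidke skill score as summary metrics are illustrative, not the abstract's specific results:

```python
def contingency_scores(hits, false_alarms, misses, correct_negatives):
    """Standard 2x2 contingency-table metrics for yes/no forecasts
    (e.g. 'CME arrives within the forecast window' vs. observed)."""
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    # Heidke skill score: improvement over random chance (1 = perfect,
    # 0 = no skill, negative = worse than chance).
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, hss

# Hypothetical verification counts over a forecast period:
pod, far, hss = contingency_scores(20, 10, 5, 65)
```

Comparing such scores against a benchmark forecast (climatology, persistence, or another centre's model) is what turns the raw counts into a statement about added value.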

  17. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. In addition, we automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
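A spreadsheet-to-IP-XACT translation step of this kind might look like the sketch below. The row fields and the pared-down element set are assumptions for illustration; real IP-XACT uses the IEEE 1685 schema with namespaces and many more elements:

```python
import xml.etree.ElementTree as ET

def row_to_register(row):
    """Translate one spreadsheet row (here a dict) into a minimal
    IP-XACT-style <register> element. Element names follow IP-XACT
    conventions but omit namespaces and most of the real schema."""
    reg = ET.Element("register")
    ET.SubElement(reg, "name").text = row["name"]
    ET.SubElement(reg, "addressOffset").text = row["offset"]
    ET.SubElement(reg, "size").text = str(row["size"])
    ET.SubElement(reg, "access").text = row["access"]
    return reg

# One hypothetical spreadsheet row describing a control register:
row = {"name": "CTRL", "offset": "0x04", "size": 32, "access": "read-write"}
xml_text = ET.tostring(row_to_register(row), encoding="unicode")
```

A commercial generator would then consume the assembled IP-XACT description to emit the UVM register model, so the designer never writes SystemVerilog by hand.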

  18. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary-General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification and possibilities for improvements in those activities, as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group.

  19. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms needed to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  20. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms needed to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  1. Stability of nano-metric colloidal dispersions of titanium: effect of surface complexation

    International Nuclear Information System (INIS)

    Peyre, Veronique

    1996-01-01

    This research thesis reports the study of the adsorption of small organic molecules at the surface of nano-particles of mineral oxides (zirconia), and of its effects on the stability of the colloidal dispersion. Adsorption has been quantified by adsorption isotherms and surface titrations, and the processes and mechanisms are discussed with respect to pH. The influence of various protecting molecules (acetylacetone, but also acetic acid, citric acid and diethanolamine) has been studied, notably highlighting the role of the outer face of the complexing agent in the interactions between particles which govern the compression and re-dispersability properties of protected dispersions. This study is performed by osmotic pressure measurements and by small-angle X-ray scattering, completed by statistical mechanics calculations. [fr]

  2. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; and the terrestrial environment, including food chains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others, depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer over long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. The majority of these models are not currently available

  3. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  4. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199. ... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  5. The RSV F and G glycoproteins interact to form a complex on the surface of infected cells

    International Nuclear Information System (INIS)

    Low, Kit-Wei; Tan, Timothy; Ng, Ken; Tan, Boon-Huan; Sugrue, Richard J.

    2008-01-01

    In this study, the interaction between the respiratory syncytial virus (RSV) fusion (F) protein, attachment (G) protein, and small hydrophobic (SH) protein was examined. Immunoprecipitation analysis suggested that the F and G proteins exist as a protein complex on the surface of RSV-infected cells, and this conclusion was supported by ultracentrifugation analysis that demonstrated co-migration of surface-expressed F and G proteins. Although our analysis provided evidence for an interaction between the G and SH proteins, no evidence was obtained for a single protein complex involving all three of the virus proteins. These data suggest the existence of multiple virus glycoprotein complexes within the RSV envelope. Although the stimulus that drives RSV-mediated membrane fusion is unknown, the association between the G and F proteins suggests an indirect role for the G protein in this process.

  6. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  7. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Science.gov (United States)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  8. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Knopf, A; Paganetti, H; Cascio, E; Bortfeld, T [Department of Radiation Oncology, MGH and Harvard Medical School, Boston, MA 02114 (United States); Parodi, K [Heidelberg Ion Therapy Center, Heidelberg (Germany); Bonab, A [Department of Radiology, MGH and Harvard Medical School, Boston, MA 02114 (United States)

    2008-08-07

    PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  9. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    Science.gov (United States)

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  10. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
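A standard way such verification problems are used is to compare the observed order of accuracy against the theoretical order of the discretization on a sequence of refined grids. The sketch below assumes a constant refinement ratio and synthetic second-order data; it is not drawn from the tri-laboratory suite itself:

```python
import math

def observed_order(f_coarse, f_mid, f_fine, r=2.0):
    """Richardson-style estimate of the observed order of accuracy p
    from a scalar result computed on three grids, each refined from
    the previous one by a constant ratio r."""
    return math.log((f_coarse - f_mid) / (f_mid - f_fine)) / math.log(r)

# Synthetic second-order behaviour: f(h) = 5 + h**2 at h = 0.4, 0.2, 0.1.
# The estimate should recover p = 2.
p = observed_order(5.0 + 0.4 ** 2, 5.0 + 0.2 ** 2, 5.0 + 0.1 ** 2)
```

When a code's observed order on a verification problem falls short of the scheme's theoretical order, that mismatch is precisely the kind of defect a well-designed test suite is meant to expose.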

  11. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the current open issues in the field of verification/validation of model transformations.

  12. New insights on the structure of the picloram-montmorillonite surface complexes.

    Science.gov (United States)

    Marco-Brown, Jose L; Trinelli, María Alcira; Gaigneaux, Eric M; Sánchez, Rosa M Torres; Afonso, María dos Santos

    2015-04-15

    The environmental mobility and bioavailability of Picloram (PCM) are determined by the interaction of its amine and carboxylate chemical groups with the soil mineral phases. Clay particles, such as montmorillonite (Mt), and the pH of the medium can play an important role in adsorption processes. Thus, the study of the role of soil components other than organic matter deserves further investigation for a more accurate assessment of the risk of groundwater contamination. Samples with PCM adsorbed on Mt dispersions were prepared at pH 3-9. Subsequently, the dispersions were separated, washed, centrifuged and stored at room temperature. The interaction of the PCM herbicide with surface groups of Mt was studied using XRD, DTA, FTIR and XPS techniques. The entrance of PCM into the Mt basal space in two different arrangements, perpendicular and planar, is proposed, with the final arrangement depending on the PCM concentration. The interaction of PCM with Mt surface sites through the nitrogen of the pyridine ring and the carboxylic group of PCM, forming bidentate and bridging inner-sphere complexes, was confirmed by FTIR and XPS analysis. The acidity constant of PCM adsorbed on the Mt surface was calculated. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Molecular and electronic structure of osmium complexes confined to Au(111) surfaces using a self-assembled molecular bridge

    Energy Technology Data Exchange (ETDEWEB)

    Llave, Ezequiel de la; Herrera, Santiago E.; Adam, Catherine; Méndez De Leo, Lucila P.; Calvo, Ernesto J.; Williams, Federico J., E-mail: fwilliams@qi.fcen.uba.ar [INQUIMAE-CONICET, Departamento de Química Inorgánica, Analítica y Química-Física, Facultad Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria, Pabellón 2, Buenos Aires C1428EHA (Argentina)

    2015-11-14

    The molecular and electronic structure of Os(II) complexes covalently bonded to self-assembled monolayers (SAMs) on Au(111) surfaces was studied by means of polarization modulation infrared reflection absorption spectroscopy, photoelectron spectroscopies, scanning tunneling microscopy, scanning tunneling spectroscopy, and density functional theory calculations. Attachment of the Os complex to the SAM proceeds via an amide covalent bond, with the SAM alkyl chains tilted 40° with respect to the surface normal and a total thickness of 26 Å. The highest occupied molecular orbital (HOMO) of the Os complex is mainly based on the Os(II) center, located 2.2 eV below the Fermi edge, and the lowest unoccupied molecular orbital (LUMO) is mainly based on the bipyridine ligands, located 1.5 eV above the Fermi edge.

  14. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine if protocols were followed and that the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed as are aspects for data management and analyses methods required for the large amount of data collected during these surveys. Recommendations were made for implementation of future surveys and reporting the data from those surveys in order to ensure that remediation was complete. (authors)

  15. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  16. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  17. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing were performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
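    The quantitative comparison via relative root mean square values can be expressed with one common definition (the exact normalization used in the FLASH evaluation may differ; this is a sketch of the general idea):

```python
import math

def relative_rms(numerical, analytical):
    """Relative root-mean-square difference between a numerical
    solution and an analytical reference: the RMS of the residuals
    normalized by the RMS of the reference values.
    """
    if len(numerical) != len(analytical):
        raise ValueError("solution vectors must have equal length")
    num = sum((n - a) ** 2 for n, a in zip(numerical, analytical))
    den = sum(a ** 2 for a in analytical)
    return math.sqrt(num / den)
```

    A value near zero indicates close quantitative agreement with the analytical solution; a value near one means the error is comparable in magnitude to the solution itself.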

  18. Covalent attachment of pyridine-type molecules to glassy carbon surfaces by electrochemical reduction of in situ generated diazonium salts. Formation of ruthenium complexes on ligand-modified surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Yesildag, Ali [Department of Chemistry, Faculty of Sciences, Atatuerk University, 25240 Erzurum (Turkey); Ekinci, Duygu, E-mail: dekin@atauni.edu.t [Department of Chemistry, Faculty of Sciences, Atatuerk University, 25240 Erzurum (Turkey)

    2010-09-30

    In this study, pyridine, quinoline and phenanthroline molecules were covalently bonded to glassy carbon (GC) electrode surfaces for the first time using the diazonium modification method. Then, the complexation ability of the modified films with ruthenium metal cations was investigated. The derivatization of GC surfaces with heteroaromatic molecules was achieved by electrochemical reduction of the corresponding in situ generated diazonium salts. X-ray photoelectron spectroscopy (XPS) was used to confirm the attachment of heteroaromatic molecules to the GC surfaces and to determine the surface concentration of the films. The barrier properties of the modified GC electrodes were studied in the presence of redox probes such as Fe(CN)₆³⁻ and Ru(NH₃)₆³⁺ by cyclic voltammetry. Additionally, the presence of the resulting organometallic films on the surfaces was verified by XPS after the chemical transformation of the characterized ligand films to the ruthenium complex films. The electrochemical behavior of these films in acetonitrile solution was investigated using voltammetric methods, and the surface coverage of the organometallic films was determined from the reversible metal-based Ru(II)/Ru(III) oxidation waves.
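    Surface coverage determined from a reversible, surface-confined redox wave typically follows Γ = Q/(nFA), where Q is the charge under the wave, n the number of electrons, F the Faraday constant, and A the electrode area. A minimal sketch (the charge and area values in the usage example are illustrative, not from this study):

```python
FARADAY = 96485.33212  # Faraday constant, C/mol

def surface_coverage(charge_C, electrode_area_cm2, n_electrons=1):
    """Surface coverage Gamma (mol/cm^2) from the charge under a
    surface-confined redox wave: Gamma = Q / (n * F * A).
    """
    return charge_C / (n_electrons * FARADAY * electrode_area_cm2)
```

    For example, a hypothetical charge of 1e-5 C under the Ru(II)/Ru(III) wave on a 3 mm diameter GC disk (A ≈ 0.0707 cm²) would give a coverage of roughly 1.5e-9 mol/cm².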

  19. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame
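    The dose-comparison logic described above, mean dose per volume plus the near-maximum dose D2 in the nontarget volume, can be sketched as a simple decision rule (illustrative only; the 10% tolerance and the percentile handling of D2 are assumptions, not the authors' clinical implementation):

```python
def near_max_dose(doses, percent=2.0):
    """Near-maximum dose D2: roughly the dose received by the
    hottest `percent` of voxels, i.e. the (100 - percent)th
    percentile of the per-voxel dose values.
    """
    ranked = sorted(doses)
    idx = int(round((1.0 - percent / 100.0) * (len(ranked) - 1)))
    return ranked[idx]

def should_halt(planned_mean, reconstructed_mean, tolerance=0.10):
    """Trigger a beam hold when the reconstructed mean dose deviates
    from the planned mean dose by more than `tolerance` (fractional).
    """
    return abs(reconstructed_mean - planned_mean) > tolerance * planned_mean
```

    In an online system such a check would run each time a new portal frame has been processed, so the halt decision stays within the per-frame time budget.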

  20. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame