WorldWideScience

Sample records for verification performance analysis

  1. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    Energy Technology Data Exchange (ETDEWEB)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho

    1997-09-01

    CONPAS is a computer code package that integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants under a PC window environment automatically. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting aspects including tabulation and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with those of an existing Level 2 PSA code (NUCAP+), and the comparison shows that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs.

  2. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth limits. Existing shortest-path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique for routing in a dynamic network. To analyze the correctness of the Q-Routing algorithm mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for the given metrics.
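
    For concreteness, a minimal sketch of the classic tabular Q-Routing update that model-based variants build on (names, learning rate and exploration scheme are illustrative, not taken from the paper):

        import random

        def q_routing_update(Q, x, d, y, q_delay, s_delay, alpha=0.5):
            """One Q-Routing update at node x for destination d via neighbor y.

            Q[x][d][y] estimates the total delivery time from x to d when
            forwarding through y; q_delay and s_delay are the queueing and
            transmission delays observed for this hop.
            """
            t_remaining = min(Q[y][d].values())      # y's best estimate onward to d
            target = q_delay + s_delay + t_remaining
            Q[x][d][y] += alpha * (target - Q[x][d][y])

        def choose_next_hop(Q, x, d, eps=0.05):
            """Mostly greedy next-hop choice with a little exploration."""
            if random.random() < eps:
                return random.choice(list(Q[x][d]))
            return min(Q[x][d], key=Q[x][d].get)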

  3. Quality Assurance in Environmental Technology Verification (ETV): Analysis and Impact on the EU ETV Pilot Programme Performance

    Science.gov (United States)

    Molenda, Michał; Ratman-Kłosińska, Izabela

    2018-03-01

    Many innovative environmental technologies never reach the market because they are new and cannot demonstrate a successful track record of previous applications. This is a serious obstacle on their way to the market. Lack of credible data on the performance of a technology causes mistrust among investors in innovations, especially from the public sector, who seek effective solutions but cannot accept the technical and financial risks associated with their implementation. Environmental technology verification (ETV) offers a credible, robust and transparent process that results in a third-party confirmation of the claims made by providers about the performance of novel environmental technologies. Verifications of performance are supported by high-quality, independent test data. In that way, ETV helps establish vendor credibility and buyer confidence. Several countries across the world have implemented ETV in the form of national or regional programmes. In the European Union, ETV was implemented as a voluntary scheme in the form of a pilot programme: the European Commission launched the Environmental Technology Verification Pilot Programme of the European Union (EU ETV) in 2011. The paper describes the European model of ETV as set up and put into operation under this Pilot Programme. The goal, objectives, technological scope and involved entities are presented. An attempt has been made to summarise the results of the EU ETV scheme available for the period from 2012, when the programme became fully operational, until the first half of 2016. The study was aimed at analysing the overall organisation and efficiency of the EU ETV Pilot Programme and was based on an analysis of the documents governing the operation of the EU ETV system. For this purpose, a relevant statistical analysis of the data on the performance of the EU ETV system provided by the European Commission was carried out.

  4. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculations agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, the code input, and the calculation results.

  5. Verification of method performance for clinical laboratories.

    Science.gov (United States)

    Nichols, James H

    2009-01-01

    Method verification, a one-time process to determine performance characteristics before a test system is utilized for patient testing, is often confused with method validation, which establishes the performance of a new diagnostic tool such as an internally developed or modified method. A number of international quality standards (International Organization for Standardization (ISO) and Clinical Laboratory Standards Institute (CLSI)), accreditation agency guidelines (College of American Pathologists (CAP), Joint Commission, U.K. Clinical Pathology Accreditation (CPA)), and regional laws (Clinical Laboratory Improvement Amendments of 1988 (CLIA'88)) exist describing the requirements for method verification and validation. Consumers of marketed test kits should verify method accuracy, precision, analytic measurement range, and the appropriateness of reference intervals for the institution's patient population. More extensive validation may be required for new methods and for manufacturer methods that have been modified by the laboratory, including analytic sensitivity and specificity. This manuscript compares the various recommendations for method verification and discusses the CLSI evaluation protocols (EP) that are available to guide laboratories in performing method verification experiments.
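
    To illustrate the kind of summary statistics a verification experiment produces, here is a simplified sketch in the spirit of the CLSI EP protocols (not a reproduction of any specific protocol; the replicate values are invented):

        import statistics

        def verification_summary(replicates, assigned_value):
            """Summarize one replicate series from a verification run."""
            mean = statistics.mean(replicates)
            sd = statistics.stdev(replicates)
            return {
                "mean": mean,
                "sd": sd,
                "cv_pct": 100.0 * sd / mean,            # imprecision as %CV
                "bias_pct": 100.0 * (mean - assigned_value) / assigned_value,
            }

        # Ten replicates of a control material with an assigned value of 5.0
        print(verification_summary(
            [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.9, 5.1, 5.0, 5.0], 5.0))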

  6. Predicting SMT Solver Performance for Software Verification

    Directory of Open Access Journals (Sweden)

    Andrew Healy

    2017-01-01

    The Why3 IDE and verification system facilitates the use of a wide range of Satisfiability Modulo Theories (SMT) solvers through a driver-based architecture. We present Where4: a portfolio-based approach to discharging Why3 proof obligations. We use data analysis and machine learning techniques on static metrics derived from program source code. Our approach benefits software engineers by providing a single utility that delegates proof obligations to the solvers most likely to return a useful result. It does this in a time-efficient way using existing Why3 and solver installations, without requiring low-level knowledge of SMT solver operation from the user.

  7. Performance analysis and experimental verification of mid-range wireless energy transfer through non-resonant magnetic coupling

    DEFF Research Database (Denmark)

    Peng, Liang; Wang, Jingyu (Zhejiang University, Hangzhou, China)

    2011-01-01

    In this paper, the efficiency of a mid-range wireless energy transfer system based on non-resonant magnetic coupling is analyzed. It is shown that the self-resistance of the coils and the mutual inductance are critical in achieving high efficiency, as indicated by our theoretical... formulation and verified in our experiments. It is experimentally shown that high efficiency, up to 65%, can be realized even in a non-resonant wireless energy system that employs a device part with moderate or low quality factor. We also address some aspects of a practical wireless energy transfer system... and show that careful design of the de-tuned system can intrinsically minimize the power dissipated in the source part. The non-resonant scheme presented in this paper allows flexible design and fabrication of wireless energy transfer systems with a transfer distance several times the coils...
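
    For context, a standard coupled-circuit expression for the link efficiency of a two-coil system driven at angular frequency \omega, with source coil resistance R_1, device coil resistance R_2, load resistance R_L and mutual inductance M, makes the quoted dependence on coil self-resistance and mutual inductance explicit (a textbook sketch, not necessarily the authors' exact formulation):

        \eta = \frac{\omega^2 M^2 R_L}{(R_2 + R_L)\left[ R_1 (R_2 + R_L) + \omega^2 M^2 \right]}

    Efficiency approaches unity when \omega^2 M^2 dominates R_1 (R_2 + R_L), which is consistent with the abstract's emphasis on self-resistance and mutual inductance rather than on a high quality factor alone.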

  8. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, Kienwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
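
    For reference, the displacement damage approach of Summers et al. mentioned above can be summarized, in its general form, as folding each slowed-down particle spectrum with the corresponding nonionizing energy loss (NIEL); the notation below is ours, a schematic statement rather than SAVANT's exact implementation:

        D_d = \sum_{p \in \{e^-,\, p^+\}} \int \frac{d\Phi_p}{dE}\, \mathrm{NIEL}_p(E)\, dE

    Cell degradation measured in ground tests is correlated against the displacement damage dose D_d, so a single ground-test degradation curve can be applied to arbitrary on-orbit spectra.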

  9. Verification and validation plan for the SFR system analysis module

    Energy Technology Data Exchange (ETDEWEB)

    Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-18

    This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.

  10. Standard guide for acoustic emission system performance verification

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 System performance verification methods launch stress waves into the examination article on which the sensor is mounted. The resulting stress wave travels in the examination article and is detected by the sensor(s) in a manner similar to acoustic emission. 1.2 This guide describes methods which can be used to verify the response of an Acoustic Emission system including sensors, couplant, sensor mounting devices, cables and system electronic components. 1.3 Acoustic emission system performance characteristics which may be evaluated using this document include some waveform parameters and source location accuracy. 1.4 Performance verification is usually conducted prior to beginning the examination. 1.5 Performance verification can be conducted during the examination if there is any suspicion that the system performance may have changed. 1.6 Performance verification may be conducted after the examination has been completed. 1.7 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.

  11. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  12. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  13. Performance verification of an air solar collector

    Science.gov (United States)

    Miller, D. C.; Romaker, R. F.

    1979-01-01

    Procedures and results of a battery of qualification tests performed by an independent certification agency on a commercial solar collector are presented in this report. The reported results were used as a basis for judging the collector suitable for field installation in residential and commercial buildings.

  14. Performance verification of 3D printers

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Nielsen, Jakob Skov; Rasmussen, Jakob

    2014-01-01

    Additive Manufacturing continues to gain momentum as the next industrial revolution. While these layering technologies have demonstrated significant time and cost savings for prototype efforts, and enabled new designs with performance benefits, additive manufacturing has not been affiliated... with 'precision' applications. In order to understand additive manufacturing's capabilities or shortcomings with regard to precision applications, it is important to understand the mechanics of the process. GE Aviation's Additive Development Center [ADC] is in a unique position to comment on additive metal... at the ADC. These methodologies were employed to manufacture direct parts, where tolerances are not as tight as those of the conventional tools that would be used to produce such parts. Readers and attendees should walk away with a better understanding of Additive Manufacturing, specifically direct metal parts...

  15. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    ... and planning problems, response time optimization, etc. We propose swarm verification to accelerate time optimal reachability analysis using the real-time model checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging the costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...

  16. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  17. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components...

  18. Trajectory Based Behavior Analysis for User Verification

    Science.gov (United States)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid the possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
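
    A heavily simplified sketch of this style of model: a Gaussian fitted to trajectory step vectors stands in for the Gaussian-transition Markov model, and a symmetric cross-likelihood serves as the dissimilarity (the manifold-learnt tuning from the paper is omitted; all names are ours):

        import numpy as np

        def fit_step_model(traj):
            """Fit a Gaussian to the step vectors of a (T, 2) trajectory."""
            steps = np.diff(traj, axis=0)
            mu = steps.mean(axis=0)
            cov = np.cov(steps.T) + 1e-6 * np.eye(2)   # regularized covariance
            return mu, cov

        def avg_log_likelihood(traj, model):
            """Average log-density of traj's steps under a fitted model."""
            mu, cov = model
            steps = np.diff(traj, axis=0)
            inv = np.linalg.inv(cov)
            logdet = np.linalg.slogdet(cov)[1]
            d = steps - mu
            quad = np.sum((d @ inv) * d, axis=1)
            return (-0.5 * (quad + logdet + 2.0 * np.log(2.0 * np.pi))).mean()

        def dissimilarity(traj_a, traj_b):
            """Symmetric cross-likelihood dissimilarity: low for the same user."""
            ma, mb = fit_step_model(traj_a), fit_step_model(traj_b)
            return -0.5 * (avg_log_likelihood(traj_a, mb)
                           + avg_log_likelihood(traj_b, ma))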

  19. Nearest-Neighbor Estimation for ROC Analysis under Verification Bias.

    Science.gov (United States)

    Adimari, Gianfranco; Chiogna, Monica

    2015-05-01

    For a continuous-scale diagnostic test, the receiver operating characteristic (ROC) curve is a popular tool for displaying the ability of the test to discriminate between healthy and diseased subjects. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the test result and other characteristics of the subjects. Estimators of the ROC curve based only on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct for verification bias, in particular under the assumption that the true disease status, if missing, is missing at random (MAR). The MAR assumption means that the probability of missingness depends on the true disease status only through the test result and the observed covariate information. However, the existing methods require parametric models for the (conditional) probability of disease and/or the (conditional) probability of verification, and hence are subject to model misspecification: a wrong specification of such parametric models can affect the behavior of the estimators, which can be inconsistent. To avoid misspecification problems, in this paper we propose a fully nonparametric method for the estimation of the ROC curve of a continuous test under verification bias. The method is based on nearest-neighbor imputation and adopts generic smooth regression models for both the probability that a subject is diseased and the probability that a subject is verified. Simulation experiments and an illustrative example show the usefulness of the new method. Variance estimation is also discussed.
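
    A schematic sketch of verification-bias correction by nearest-neighbor imputation, under strong simplifications (1-NN on the test result alone, no covariates, and an empirical AUC instead of the full ROC curve; all names are ours):

        import numpy as np

        def nn_corrected_auc(t, verified, disease):
            """AUC estimate when disease status is known only for verified subjects.

            t        : test results for all subjects
            verified : boolean mask, True where the true status was ascertained
            disease  : 0/1 status, meaningful only where verified is True
            """
            t = np.asarray(t, dtype=float)
            verified = np.asarray(verified, dtype=bool)
            d_hat = np.asarray(disease, dtype=float).copy()
            tv, dv = t[verified], d_hat[verified]
            for i in np.where(~verified)[0]:
                # impute from the verified subject with the closest test value
                d_hat[i] = dv[np.argmin(np.abs(tv - t[i]))]
            pos, neg = t[d_hat == 1], t[d_hat == 0]
            diff = pos[:, None] - neg[None, :]
            # empirical AUC = P(T_pos > T_neg) + 0.5 * P(tie)
            return (diff > 0).mean() + 0.5 * (diff == 0).mean()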

  20. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    Science.gov (United States)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  1. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

    However much its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated for diseases such as the lately soaring malignant tumors is the most important factor in radiotherapy practice. In reality, several cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, the development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment (in general, the doses administered in radiotherapy are very large, about three times higher than lethal doses) is a pressing issue. Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, neither are the legal and regulatory systems to implement a quality assurance program sufficiently stipulated, nor are qualified personnel who could run a program to maintain the quality assurance and control of those generators and equipment sufficiently employed in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be developed to exactly assess the doses administered in radiotherapy and to establish the procedures needed to maintain the continuing performance of the therapy machines and equipment. The QA program and procedures should support proper calibration of the machines and equipment and definitely establish the safety of patients in radiotherapy. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and accurate measurements, evaluations of the doses delivered to patients, and verification of the performance of the therapy machines and equipment are performed.

  2. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  3. Triple Modular Redundancy verification via heuristic netlist analysis

    Directory of Open Access Journals (Sweden)

    Giovanni Beltrame

    2015-08-01

    Triple Modular Redundancy (TMR) is a common technique to protect memory elements of digital processing systems subject to radiation effects (such as in space, at high altitude, or near nuclear sources). This paper presents an approach to verify the correct implementation of TMR for the memory elements of a given netlist (i.e., a digital circuit specification) using heuristic analysis. The purpose is to detect any issues that might arise during the use of automatic tools for TMR insertion, optimization, place and route, etc. Our analysis does not require a testbench and can perform full, exhaustive coverage within less than an hour even for large designs. This is achieved by applying a divide et impera approach, splitting the circuit into smaller submodules without loss of generality, instead of applying formal verification to the whole netlist at once. The methodology has been applied to a production netlist of the LEON2-FT processor that had reported errors during radiation testing, successfully showing a number of unprotected memory elements, namely 351 flip-flops.
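
    To make the idea concrete, the sketch below shows a deliberately naive structural check of the kind such a tool performs, on a toy netlist encoded as dictionaries (illustrative only; the paper's heuristic analysis of real RTL netlists is far more involved):

        def check_tmr(netlist, voters):
            """Flag flip-flops that are not triplicated and voted.

            netlist maps each flip-flop name to the voter (if any) its output
            feeds; voters maps each voter to the set of flip-flops driving it.
            """
            unprotected = []
            for ff, voter in netlist.items():
                if voter is None or len(voters.get(voter, ())) != 3:
                    unprotected.append(ff)
            return unprotected

        # Example: reg_a is properly triplicated, reg_b is not voted at all.
        netlist = {"reg_a_0": "v1", "reg_a_1": "v1", "reg_a_2": "v1",
                   "reg_b": None}
        voters = {"v1": {"reg_a_0", "reg_a_1", "reg_a_2"}}
        print(check_tmr(netlist, voters))   # -> ['reg_b']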

  4. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build and test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  5. Handwritten signature verification by independent component analysis

    OpenAIRE

    Camilleri, Kenneth P.; Desira, Marco; 1st Workshop in Information and Communication Technology (WICT 2008)

    2008-01-01

    This study explores a method that learns about the image structure directly from the image ensemble in contrast to other methods where the relevant structure is determined in advance and extracted using hand-engineered techniques. In tasks involving the analysis of image ensembles, important information is often found in the higher-order relationships among the image pixels. Independent Component Analysis (ICA) is a method that learns high-order dependencies found in the input. ICA has been e...

  6. Effect of verification bias on the sensitivity of fecal occult blood testing: a meta-analysis.

    Science.gov (United States)

    Rosman, Alan S; Korsten, Mark A

    2010-11-01

    There is controversy regarding the sensitivity of fecal occult blood tests (FOBT) for detecting colorectal cancer. Many of the published studies failed to correct for verification bias, which may have inflated the sensitivity. A meta-analysis of published studies evaluating the sensitivity and specificity of chemical-based FOBT for colorectal cancer was performed. Studies were included if both cancer and control subjects underwent confirmatory testing. We also included studies that attempted to correct for verification bias by either performing colonoscopy on all subjects regardless of the FOBT result or by using longitudinal follow-up. We then compared the sensitivity, specificity, and other diagnostic characteristics of the studies that attempted to correct for verification bias (n=10) vs. those that did not correct for this bias (n=19). The pooled sensitivity of guaiac-based FOBT for colorectal cancer in studies without verification bias was significantly lower than in studies with this bias [0.36 (95% CI 0.25-0.47) vs. 0.70 (95% CI 0.60-0.80), p=0.001]. The pooled specificity of the studies without verification bias was higher [0.96 (95% CI 0.94-0.97) vs. 0.88 (95% CI 0.84-0.91), p<0.005]. There was no significant difference in the area under the summary receiver operating characteristic curves. More sensitive chemical-based FOBT methods (e.g., Hemoccult® SENSA®) had a higher sensitivity but a lower specificity than standard guaiac methods. The sensitivity of guaiac-based FOBT for colorectal cancer has been overestimated as a result of verification bias. This test may not be sensitive enough to serve as an effective screening option for colorectal cancer.
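
    To make the pooling step concrete, a common inverse-variance combination of study sensitivities on the logit scale is sketched below (an illustrative estimator with invented counts; the paper does not state that this exact method was used):

        import math

        def pooled_sensitivity(studies):
            """Fixed-effect pooling of sensitivities on the logit scale.

            studies: list of (true_positives, cancers) pairs, one per study.
            """
            num = den = 0.0
            for tp, n in studies:
                p = (tp + 0.5) / (n + 1.0)       # continuity-corrected proportion
                logit = math.log(p / (1.0 - p))
                var = 1.0 / (tp + 0.5) + 1.0 / (n - tp + 0.5)
                num += logit / var
                den += 1.0 / var
            pooled, se = num / den, math.sqrt(1.0 / den)
            expit = lambda x: 1.0 / (1.0 + math.exp(-x))
            return expit(pooled), (expit(pooled - 1.96 * se),
                                   expit(pooled + 1.96 * se))

        # Invented counts for three studies: (FOBT-positive cancers, all cancers)
        print(pooled_sensitivity([(25, 70), (40, 110), (18, 50)]))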

  7. MSFC Turbine Performance Optimization (TPO) Technology Verification Status

    Science.gov (United States)

    Griffin, Lisa W.; Dorney, Daniel J.; Snellgrove, Lauren M.; Zoladz, Thomas F.; Stroud, Richard T.; Turner, James E. (Technical Monitor)

    2002-01-01

    The capability to optimize turbine performance and accurately predict unsteady loads will allow for increased reliability, Isp, and thrust-to-weight ratio. The development of a fast, accurate, validated aerodynamic design, analysis, and optimization system is required.

  8. HADES: Microprocessor Hazard Analysis via Formal Verification of Parameterized Systems

    Directory of Open Access Journals (Sweden)

    Lukáš Charvát

    2016-12-01

    HADES is a fully automated verification tool for pipeline-based microprocessors that aims at flaws caused by improperly handled data hazards. It focuses on single-pipeline microprocessors designed at the register transfer level (RTL) and deals with read-after-write, write-after-write, and write-after-read hazards. HADES combines several techniques, including data-flow analysis, error pattern matching, SMT solving, and abstract regular model checking. It has been successfully tested on several microprocessors for embedded applications.

  9. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy-EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for the 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm the integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel verification instrument.
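
    As a small illustration of the kind of figure of merit quoted, the percent relative standard deviation of measured-to-predicted ratios can be computed as below (invented numbers; the iRAP/ORIGEN analysis itself is far more involved):

        import statistics

        def percent_rsd(measured, predicted):
            """Percent relative standard deviation of measured/predicted ratios."""
            ratios = [m / p for m, p in zip(measured, predicted)]
            return 100.0 * statistics.stdev(ratios) / statistics.mean(ratios)

        # Invented neutron count rates for four assemblies (counts/s)
        measured = [1010.0, 985.0, 1032.0, 990.0]
        predicted = [1000.0, 1001.0, 998.0, 1005.0]
        print(percent_rsd(measured, predicted))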

  10. [Verification/validation of the performances of analytical method].

    Science.gov (United States)

    Vassault, A; Hulin, A; Chapuzet, E; Arnaud, J; Giroud, C

    2010-12-01

    The verification and validation of methods consist in evaluating the precision, the analytical range, the accuracy, the trueness and, if appropriate, the detection limit. These measurements must follow a standardized protocol, and the results obtained must be compared to defined quality criteria. Each chapter includes the purpose, the material used, the operating procedures and the collection of results and calculations, and is illustrated by an example. This document aims at simplifying, standardizing and optimizing the evaluation in order to allow comparisons between laboratories and to facilitate method assessment.

  11. New analysis tools and processes for mask repair verification and defect disposition based on AIMS images

    Science.gov (United States)

    Richter, Rigo; Poortinga, Eric; Scheruebl, Thomas

    2009-10-01

    Using AIMS™ to qualify repairs of defects on photomasks is an industry standard. AIMS™ images match the lithographic imaging performance without the need for wafer prints. Utilization of this capability by photomask manufacturers has risen due to the increased complexity of layouts incorporating RET and phase-shift technologies. Tighter specifications by end users have pushed AIMS™ analysis to now include CD performance results in addition to the traditional intensity performance results. Discussed is a new repair verification system for automated analysis of AIMS™ images. Newly designed user interfaces and algorithms guide users through predefined analysis routines so as to minimize errors. Two main routines are discussed: one allows multiple reference sites along with a test/defect site within a single image of repeating features; the second compares a test/defect measurement image with a reference measurement image. Three evaluation methods possible with the compared images are discussed in the context of providing thorough analysis capability. This paper highlights new functionality for AIMS™ analysis. Using structured analysis processes and innovative analysis tools leads to highly efficient and more reliable result reporting of repair verification analysis.

  12. A Computational Framework to Control Verification and Robustness Analysis

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2010-01-01

    This paper presents a methodology for evaluating the robustness of a controller based on its ability to satisfy the design requirements. The framework proposed is generic since it allows for high-fidelity models, arbitrary control structures and arbitrary functional dependencies between the requirements and the uncertain parameters. The cornerstone of this contribution is the ability to bound the region of the uncertain parameter space where the degradation in closed-loop performance remains acceptable. The size of this bounding set, whose geometry can be prescribed according to deterministic or probabilistic uncertainty models, is a measure of robustness. The robustness metrics proposed herein are the parametric safety margin, the reliability index, the failure probability and upper bounds to this probability. The performance observed at the control verification setting, where the assumptions and approximations used for control design may no longer hold, will fully determine the proposed control assessment.
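
    A minimal sketch of one of the metrics listed, the failure probability, estimated by plain Monte Carlo over the uncertain parameters (the requirement function g and the sampling distribution are placeholders, not the paper's):

        import numpy as np

        def failure_probability(g, sampler, n=100_000, seed=0):
            """Monte Carlo estimate of P[g(theta) > 0] (g > 0 means failure)."""
            rng = np.random.default_rng(seed)
            thetas = sampler(rng, n)
            fails = np.array([g(th) > 0 for th in thetas])
            p = fails.mean()
            return p, np.sqrt(p * (1.0 - p) / n)   # estimate and standard error

        # Placeholder requirement: an overshoot proxy must stay below 0.2
        g = lambda th: (0.1 + 0.5 * th[0] ** 2 + 0.3 * th[1]) - 0.2
        sampler = lambda rng, n: rng.normal(0.0, 0.3, size=(n, 2))
        print(failure_probability(g, sampler))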

  13. SiSn diodes: Theoretical analysis and experimental verification

    KAUST Repository

    Hussain, Aftab M.

    2015-08-24

    We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn. We report a reduction of 0.1 V in the average built-in potential, and a reduction of 0.2 V in the average reverse bias breakdown voltage, as measured across the substrate. These reductions indicate that the band gap of the silicon lattice has been reduced due to the incorporation of Sn, as expected from the theoretical analysis. We report the experimentally calculated band gap of SiSn to be 1.11 ± 0.09 eV. This low-cost, CMOS compatible, and scalable process offers a unique opportunity to tune the band gap of silicon for specific applications.

  14. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

    Out-pile tests with a full-scale fuel assembly are performed to verify the design and to evaluate the performance of the final products. HTL for hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform bending and vibration tests. The verification of the developed facilities was carried out by comparison with reference data for a fuel assembly obtained at the Westinghouse Co.; the compared data showed good agreement within uncertainties. FRETONUS, a high-temperature, high-pressure fretting wear simulator, was also developed, and a performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. Computational technologies for turbulent flow analysis and finite element analysis were developed as well. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  15. Eutrophication Model Accuracy - Comparison of Calibration and Verification Performance of a Model of the Neuse River Estuary, North Carolina

    Science.gov (United States)

    Bowen, J. D.

    2004-12-01

    A modified version of an existing two-dimensional, laterally averaged model (CE-QUAL-W2) was applied to predict water quality conditions in the lower 80 km of the Neuse River Estuary. Separate time periods were modeled for calibration and verification (model testing). The calibration time period ran from June 1997 to December 1999, while the verification time period ran from January to December 2000. During this time the estuary received two periods of unusually high inflows, in early 1998 and again in September and October 1999. The latter rainfall event loaded the estuary with the equivalent of nearly two years' worth of water and dissolved inorganic nitrogen in just six weeks. Overall, the level of calibration performance achieved by the model was comparable to that attained in other eutrophication model studies of eastern U.S. estuaries. The model most accurately simulated water quality constituents having a consistent spatial variation within the estuary (e.g., nitrate, salinity), and was least accurate for constituents without a consistent spatial variation (e.g., phosphate, chlorophyll-a). Calibration performance varied widely between the three algal groupings modeled (diatoms and dinoflagellates, cryptomonads and chlorophytes, cyanobacteria). Model performance during verification was comparable to the performance seen during calibration. The model's salinity predictions were somewhat better in the verification, while dissolved oxygen performance in the verification year was slightly poorer compared to calibration performance. Nutrient and chlorophyll-a performance were virtually the same between the calibration and verification exercises. As part of the TMDL analysis, an unsuccessful attempt was made to capture model error as a component of model uncertainty, but it was found that the model residuals were neither unbiased nor normally distributed.

  16. 15 CFR 743.2 - High performance computers: Post shipment verification reporting.

    Science.gov (United States)

    2010-01-01

    15 CFR 743.2 (2010) - High performance computers: Post shipment verification reporting. Export Administration Regulations, Special Reporting. For exports of certain computers to destinations in Computer Tier 3, see § 740.7(d) for a list of these destinations...

  17. Design, analysis, and test verification of advanced encapsulation system

    Science.gov (United States)

    Garcia, A.; Minning, C.

    1981-01-01

    Procurement of 4 in x 4 in polycrystalline solar cells proceeded with some delays. A total of 1200 cells were procured for use in both verification testing and qualification testing. Additional thermal structural analyses were run, and the data are presented. An outline of the verification testing is included with information on test specimen construction.

  18. TRACEABILITY OF PRECISION MEASUREMENTS ON COORDINATE MEASURING MACHINES – PERFORMANCE VERIFICATION OF CMMs

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Sobiecki, René; Tosello, Guido

    This document is used in connection with one exercise of 30 minutes duration as a part of the course VISION ONLINE – One week course on Precision & Nanometrology. The exercise concerns performance verification of the volumetric measuring capability of a small volume coordinate measuring machine... This section contains a reference to the American normative standard ANSI/ASME and a description of the exercise.

  19. Using SPIN for Verification of Multi-agent Data Analysis

    Directory of Open Access Journals (Sweden)

    N. O. Garanina

    2014-01-01

    The paper presents an approach to formal verification of multi-agent data analysis algorithms for ontology population. The system agents correspond to information items of the input data and to the rules of ontology population and data processing. They determine values of information objects obtained at the preliminary phase of the analysis. The agents, working in parallel, check the syntactic and semantic consistency of tuples of information items. Since the agents operate in parallel, it is necessary to verify some important related properties of the system, such as the property that the controller agent correctly determines system termination. In our approach, the model checking tool SPIN is used. The protocols of the agents are written in Promela (the input language of the tool), and the properties of the multi-agent data analysis system are expressed in the linear-time logic LTL. We carried out several experiments to check this model in various modes of the tool and with various numbers of agents.
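
    By way of illustration, a termination-correctness requirement for the controller agent of the kind checked with SPIN could be stated in LTL as follows (our formulation with invented proposition names, not the paper's):

        \mathbf{G}\,(\mathit{ctrl\_done} \rightarrow \mathit{all\_agents\_idle}) \;\wedge\; \mathbf{F}\,\mathit{ctrl\_done}

    Read: whenever the controller declares termination, all agents are idle, and termination is eventually declared.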

  20. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
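
    A minimal sketch of the statistical core of such a verification, using SciPy's implementation of the Friedman test on a matrix of binned power outputs (all numbers are invented for illustration; the paper's multiple-comparison step is only indicated in a comment):

        import numpy as np
        from scipy.stats import friedmanchisquare

        # Rows: wind-speed bins (blocks); columns: turbines T1..T3 plus the
        # guaranteed power curve treated as one more "turbine" (kW, made up).
        power = np.array([
            [310.0, 305.0, 298.0, 312.0],
            [505.0, 498.0, 480.0, 510.0],
            [720.0, 715.0, 690.0, 725.0],
            [900.0, 893.0, 860.0, 905.0],
            [1010.0, 1002.0, 965.0, 1015.0],
        ])

        stat, p = friedmanchisquare(*power.T)   # one argument per turbine
        print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
        # A small p suggests at least one turbine's power performance differs;
        # a multiple-comparison procedure would then locate the offending pairs.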

  1. Performance verification of Surface Mapping Instrument developed at CGM

    DEFF Research Database (Denmark)

    Bariani, Paolo

    ... covering applications in micro-technology and in surface metrology. The paper addresses the description of the stitching procedure, its validation, and a more comprehensive metrological evaluation of the AFM-CMM instrument performance. Experimental validation of the method was performed by the use of...

  2. Verification of spectrophotometric method for nitrate analysis in water samples

    Science.gov (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu

    2017-12-01

    The aim of this research was to verify a spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were: linearity, method detection limit, limit of quantitation, level of linearity, accuracy and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression was 0.9981. The method detection limit (MDL) was determined to be 0.1294 mg/L and the limit of quantitation (LOQ) 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% confidence level. Accuracy, determined through the recovery value, was 109.1907%. Precision was assessed as the percent relative standard deviation (%RSD) of repeatability, giving 1.0886%. The tested performance criteria showed that the methodology was verified under the laboratory conditions.
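
    For illustration, the computations behind several of these parameters can be sketched as follows (the replicate data and Student t value are placeholders, not the paper's measurements):

        import statistics

        def mdl(replicates, t_crit):
            """EPA-style method detection limit: t(n-1, 99%) times the SD."""
            return t_crit * statistics.stdev(replicates)

        def recovery_pct(measured, spiked):
            return 100.0 * measured / spiked

        def rsd_pct(replicates):
            return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

        low = [0.42, 0.39, 0.45, 0.41, 0.44, 0.40, 0.43]   # mg/L, 7 replicates
        print("MDL  =", mdl(low, t_crit=3.143))            # t for n=7, df=6, 99%
        print("LOQ ~", 10 * statistics.stdev(low))         # 10-sigma convention
        print("Rec  =", recovery_pct(10.92, 10.0), "%")
        print("%RSD =", rsd_pct(low))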

  3. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism... with the additional benefit of generating SSA as a side effect, which may be immediately useful for a subsequent dynamic compilation stage.

  4. Verification and Performance Evaluation of Timed Game Strategies

    DEFF Research Database (Denmark)

    David, Alexandre; Fang, Huixing; Larsen, Kim Guldstrand

    2014-01-01

    Control synthesis techniques, based on timed games, derive strategies to ensure a given control objective, e.g., time-bounded reachability. Model checking verifies correctness properties of systems. Statistical model checking can be used to analyse performance aspects of systems, e.g., energy consumption. Firstly, we want to verify the system operating under the synthesized strategy in order to check additional correctness properties. Secondly, we want to apply statistical model checking to evaluate various performance aspects of the synthesized strategy. For this, the underlying timed game is extended with relevant price and stochastic information. The resulting framework synthesizes a strategy, then verifies and evaluates this strategy with respect to additional properties. We demonstrate the usefulness of this new branch of Uppaal using two case studies.

  5. Experimental Verification Of Hyper-V Performance Isolation Level

    Directory of Open Access Journals (Sweden)

    Krzysztof Rzecki

    2014-01-01

    The need for cost optimization in a broad sense constitutes the basis of operation of every enterprise. In the case of IT infrastructure, which is present in almost every field of activity these days, one of the most commonly applied technologies leading to a good cost-to-profit adjustment is virtualization. It consists in locating several operating systems with their IT systems on a single server. For such optimization to be carried out correctly, it has to be strictly controlled by means of allocating access to resources, which is known as performance isolation. Modern virtualizers allow this allocation to be set up in quantitative terms (the number of processors, size of RAM, or disc space). It appears, however, that in qualitative terms (processor time, RAM or hard disc bandwidth) the actual allocation of resources does not always correspond to this configuration. This paper provides an experimental evaluation of the achievable level of performance isolation of the Hyper-V virtualizer.

  8. Final tests and performances verification of the European ALMA antennas

    Science.gov (United States)

    Marchiori, Gianpietro; Rampini, Francesco

    2012-09-01

    The Atacama Large Millimeter Array (ALMA) is under erection in Northern Chile. The array consists of a large number (up to 64) of 12 m diameter antennas and a number of smaller antennas, to be operated on the Chajnantor plateau at 5000 m altitude. The antennas will operate up to 950 GHz, so their mechanical requirements, in terms of surface accuracy, pointing precision and dimensional stability, are very tight. The AEM consortium, constituted by Thales Alenia Space France, Thales Alenia Space Italy, European Industrial Engineering (EIE GROUP), and MT Mechatronics, is assembling and testing the 25 antennas. As of today, the first set of antennas has been delivered to ALMA for science. During the test phase with ESO and ALMA, the European antennas have shown excellent performance, meeting the specification requirements with wide margins. The purpose of this paper is to present the different results obtained during the test campaign: surface accuracy, pointing error, fast motion capability and residual delay. Also very important were the test phases that led to the validation of the FE model, showing that the antenna performs with a better margin than predicted at the design level, thanks also to the assembly and integration techniques.

  9. Adding Change Impact Analysis to the Formal Verification of C Programs

    Science.gov (United States)

    Autexier, Serge; Lüth, Christoph

    Handling changes to programs and specifications efficiently is a particular challenge in formal software verification. Change impact analysis is an approach to this challenge where the effects of changes made to a document (such as a program or specification) are described in terms of rules on a semantic representation of the document. This makes it possible to describe and delimit the effects of syntactic changes semantically. This paper presents an application of generic change impact analysis to formal software verification, using the GMoC and SAMS tools. We adapt the GMoC tool for generic change impact analysis to the SAMS verification framework for the formal verification of C programs, and show how a few simple rules are sufficient to capture the essence of change management.

  10. A study on periodic safety verification on MOV performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Du Eon; Park, Jong Ho; Han, Jae Seob; Kang, Hyeon Taek; Lee, Jeong Min; Song, Kyu Jo; Shin, Wan Sun; Lee, Taek Sang [Chungnam National Univ., Taejon (Korea, Republic of)

    2000-03-15

    The objectives of this study, therefore, are to define the optimized valve diagnostic variables which detect abnormal conditions early during valve surveillance and consequently reduce radiation exposure. The major direction of the development is to detect valve degradation in advance by monitoring the motor current and power signals, which can be obtained remotely at the Motor Control Center (MCC). A series of valve operation experiments have been performed under several kinds of abnormal conditions using a test apparatus which consists of a 3-inch gate valve, a motor (0.33 Hp, 460 V, 0.8 A, 1560 rpm), an actuator (SMB-000-2 type), some measuring devices (power analyzer, oscilloscope, data recorder, current transformer, and AC current and voltage transducers) and connection cables.

  11. Performance verification of the CMS Phase-1 Upgrade Pixel detector

    Science.gov (United States)

    Veszpremi, V.

    2017-12-01

    The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m² total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m² total area. The tracker is operated in a high-occupancy and high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that has been used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased instantaneous luminosity the LHC would reach before 2023. It was built to operate at an instantaneous luminosity of around 2×10³⁴ cm⁻²s⁻¹. The detector's new layout has an additional inner layer with respect to the previous one; it allows for more efficient tracking with a smaller fake rate at higher event pile-up. The paper focuses on the first results obtained during the commissioning of the new detector. It also includes challenges faced during the first data taking to reach the optimal measurement efficiency. Details will be given on the performance at high occupancy with respect to observables such as data rate, hit reconstruction efficiency, and resolution.

  12. On-Sky Performance Verification of the CHARIS IFS

    Science.gov (United States)

    Groff, Tyler Dean; Chilcote, Jeffrey K.; Kasdin, Jeremy; Brandt, Timothy; Galvin, Michael; Loomis, Craig; Carr, Michael; Knapp, Gillian R.; Guyon, Olivier; Jovanovic, Nemanja; Lozi, Julien; Takato, Naruhisa; Hayashi, Masahiko

    2017-01-01

    The Coronagraphic High Angular Resolution Imaging Spectrograph (CHARIS) is an integral field spectrograph (IFS) built for the Subaru telescope. CHARIS has been delivered to the observatory and now sits behind the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) and AO188 adaptive optics systems. CHARIS is designed to detect objects five orders of magnitude dimmer than their parent star down to an 80 milliarcsecond inner working angle. CHARIS is a lenslet-based IFS and has two fundamental operating modes. In characterization mode, CHARIS has a 'high-resolution' prism providing an as-built average spectral resolution of R ≈ 75.2, 65.2, and 77.1 in J, H, and K bands respectively. Unique to CHARIS is a second mode designed for discovery, with a 'low-resolution' prism providing an as-built spectral resolution of R ≈ 18.4 that spans the full J+H+K spectrum (1.15-2.37 microns). This discovery mode has already yielded better-than-5-sigma detections of HR 8799 c, d, and e when combining ADI+SDI. Using SDI alone, planets c and d have been detected in a single 24-second image. The CHARIS team is optimizing instrument performance and refining ADI+SDI recombination to maximize the contrast detection limit. In addition to the new observing modes, CHARIS has demonstrated a design with high robustness to spectral crosstalk. The integrated spectral cross-contamination has not exceeded 4%, thanks to a combination of post-lenslet tolerances and a carefully designed pinhole grid mask directly printed onto the back side of the lenslet array in black chrome. CHARIS is in the final stages of commissioning, with the instrument open for science observations beginning February 2017. A Wollaston prism upgrade to the instrument will be commissioned later in 2017. Here we review the science case, design, on-sky performance, and lessons learned both in hardware and operationally that are directly applicable to future exoplanet instruments such as the WFIRST CGI IFS.

  13. Measurement and Verification of Energy Savings and Performance from Advanced Lighting Controls

    Energy Technology Data Exchange (ETDEWEB)

    PNNL

    2016-02-21

    This document provides a framework for measurement and verification (M&V) of energy savings, performance, and user satisfaction from lighting retrofit projects involving occupancy-sensor-based, daylighting, and/or other types of automatic lighting controls. It was developed to provide site owners, contractors, and other involved organizations with the essential elements of a robust M&V plan for retrofit projects and to assist in developing specific project M&V plans.

  14. On Demand Internal Short Circuit Device Enables Verification of Safer, Higher Performing Battery Designs

    Energy Technology Data Exchange (ETDEWEB)

    Darcy, Eric; Keyser, Matthew

    2017-05-15

    The Internal Short Circuit (ISC) device enables critical battery safety verification. With the aluminum interstitial heat sink between the cells, normal trigger cells cannot be driven into thermal runaway without excessive temperature bias of adjacent cells. With an implantable, on-demand ISC device, thermal runaway tests show that the conductive heat sinks protected adjacent cells from propagation. High heat dissipation and structural support of Al heat sinks show high promise for safer, higher performing batteries.

  15. TRACEABILITY OF COORDINATE MEASURING MACHINES – CALIBRATION AND PERFORMANCE VERIFICATION

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Savio, Enrico; Bariani, Paolo

    This document is used in connection with three exercises, each of 45 minutes' duration, as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measurement traceability: 1) Performance verification of a CMM using a ball bar; 2) Calibration of an optical coordinate measuring machine; 3) Uncertainty assessment using the ISO 15530-3 “Calibrated workpieces” procedure.

  16. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them for the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
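
    As a concrete illustration of checking an analytically derived equilibrium against simulation, the sketch below uses a Hebbian learning connection of the general form dω/dt = η·X1·X2·(1−ω) − ζ·ω; the model form and all parameter values are assumptions for illustration, not taken from the paper.

```python
# Verify a simulated equilibrium against the analytic one for a Hebbian
# learning connection; model form and parameters are assumed for illustration.
eta, zeta = 0.4, 0.1          # learning and extinction rates (assumed)
x1 = x2 = 0.9                 # sustained activation levels (assumed)

# stationary point from d(omega)/dt = eta*x1*x2*(1 - omega) - zeta*omega = 0
omega_analytic = eta * x1 * x2 / (eta * x1 * x2 + zeta)

# forward-Euler simulation of the same equation
omega, dt = 0.0, 0.01
for _ in range(20000):
    omega += dt * (eta * x1 * x2 * (1.0 - omega) - zeta * omega)

assert abs(omega - omega_analytic) < 1e-6
print(omega, omega_analytic)  # both ~0.7642
```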

  17. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammüller, Florian; Probst, Christian W.

    2017-01-01

    We use a common trick from the formal verification of security protocols and show that it is applicable to insider threats. We briefly introduce a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  18. Tempered Water Lower Port Connector Structural Analysis Verification

    Energy Technology Data Exchange (ETDEWEB)

    CREA, B.A.

    2000-05-05

    Structural analysis of the lower port connection of the Tempered Water System of the Cold Vacuum Drying Facility was performed. Subsequent detailed design changes to enhance operability resulted in the need to re-evaluate the bases of the original analysis to verify its continued validity. This evaluation is contained in Appendix A of this report. The original evaluation is contained in Appendix B.

  19. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work carried out to support the basis for this standardisation work. The work drew on experience from several national and international research projects and on contractual and field experience gained within the wind energy community on this matter. The work was wide ranging and addressed 'grey' areas of knowledge regarding existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; anemometry and the influence of inclined flow. (au)
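
    The data-reduction step at the heart of such power performance measurements is the method of bins from IEC 61400-12: ten-minute mean power is averaged within 0.5 m/s wind-speed bins. A minimal sketch with synthetic data (the turbine rating, noise level, and wind distribution are invented):

```python
# "Method of bins" data reduction for a power curve; all data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
wind = rng.uniform(3.0, 15.0, 5000)                     # 10-min mean wind speeds, m/s
power = np.clip(2000.0 * (wind / 12.0) ** 3, 0, 2000)   # idealized 2 MW turbine, kW
power += rng.normal(0.0, 40.0, wind.size)               # measurement scatter

edges = np.arange(3.0, 15.5, 0.5)                       # 0.5 m/s bins per IEC 61400-12
which = np.digitize(wind, edges)
for i in range(1, edges.size):
    sel = which == i
    if sel.any():
        print(f"{edges[i-1]:4.1f}-{edges[i]:4.1f} m/s: "
              f"{power[sel].mean():7.1f} kW (n={sel.sum()})")
```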

  20. Experimental verification of and analytical approach to the influence of stator skew on the electromagnetic performance of permanent magnet generators with a multipole rotor

    Science.gov (United States)

    Choi, Jang-Young; Jang, Seok-Myeong; Ko, Kyoung-Jin

    2009-04-01

    This paper deals with experimental verification of and an analytical approach to the influence of stator skew on the electromagnetic performance of a permanent magnet generator (PMG) with a multipole rotor. The analytical expressions for the magnetic field distributions due to permanent magnets and the two-dimensional permeance function considering skew effects are established. On the basis of these analytical solutions, the analytical solutions for cogging torque and back-emf considering skew effects are also derived. Then, by applying estimated electrical parameters to a simple one-phase equivalent circuit of the PMG, the output performances of the PMG with/without a skewed stator are investigated. Finally, by confirming that all analytical results are validated extensively by nonlinear finite element calculations and measurements, the validity of the analysis methods presented in this paper is verified, and the influence of stator skew on cogging torque, back-emf, and output performance of the PMG is also clearly described.
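
    The attenuation of back-emf and cogging-torque harmonics by skewing is conventionally captured by a skew factor; the classical expression below is an assumption standing in for the paper's exact formulation, with an invented slot count and skew angle:

```python
# Classical skew factor: harmonic n is attenuated by sin(n*a/2)/(n*a/2),
# where a is the skew angle; slot count and skew are invented.
import math

def skew_factor(n, a):
    x = n * a / 2.0
    return 1.0 if x == 0.0 else math.sin(x) / x

a = 2.0 * math.pi / 24.0            # skew of one slot pitch, 24 slots (assumed)
for n in (1, 6, 12, 24):            # fundamental and higher harmonic orders
    print(f"harmonic {n:2d}: k_sk = {skew_factor(n, a):.4f}")
# the slot-order harmonic (n = 24) is fully cancelled by a one-slot-pitch skew
```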

  1. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-02-01

    Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners' perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of the B-BBEE verification practitioners in order to improve their perceived job performance. Motivation for the study: The growing number of B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for the Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed that there were strong and positive associations between technical skills, interpersonal skills, compliance to standards and ethics, managerial skills and perceived job performance. Results of the regression analysis showed that managerial skills, compliance to standards and ethics and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications. At the ...

  2. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure of whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis for this standardisation work. The work addressed experience from several national and international research projects and contractual and field experience gained within the wind energy community on this matter. The work was wide ranging and addressed 'grey' areas of knowledge regarding existing methodologies, which were then investigated in more detail...

  3. Analysis, Test and Verification in The Presence of Variability (Dagstuhl Seminar 13091)

    DEFF Research Database (Denmark)

    2014-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 13091 “Analysis, Test and Verification in The Presence of Variability”. The seminar had the goal of consolidating and stimulating research on analysis of software models with variability, enabling the design of variability-aware ...

  4. Linear models to perform treaty verification tasks for enhanced information security

    Energy Technology Data Exchange (ETDEWEB)

    MacGahan, Christopher J., E-mail: cmacgahan@optics.arizona.edu [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Sandia National Laboratories, Livermore, CA 94551 (United States); Kupinski, Matthew A. [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study were generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
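
    A minimal sketch of the Hotelling observer named above, on synthetic stand-ins for binned detector data (the dimensions, class means, and covariance are invented): the template is w = S^-1 (mu1 - mu0) and the figure of merit is the area under the ROC curve.

```python
# Hotelling observer sketch on synthetic "binned detector data"; dimensions,
# class means, and covariance are invented.
import numpy as np

rng = np.random.default_rng(1)
d = 20
mu0, mu1 = np.zeros(d), np.full(d, 0.3)         # class means (assumed)
cov = 0.5 * np.eye(d) + 0.5                     # common covariance (assumed)

L = np.linalg.cholesky(cov)
g0 = mu0 + rng.standard_normal((500, d)) @ L.T  # item absent
g1 = mu1 + rng.standard_normal((500, d)) @ L.T  # item present

w = np.linalg.solve(cov, mu1 - mu0)             # Hotelling template S^-1 (mu1 - mu0)
t0, t1 = g0 @ w, g1 @ w                         # scalar test statistics

auc = (t1[:, None] > t0[None, :]).mean()        # area under the ROC curve
print(f"AUC = {auc:.3f}")
```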

  5. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  6. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    Science.gov (United States)

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a ...
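
    The core of the most-likely-shift approach, sliding one activity depth profile against the other and minimizing the absolute profile difference over a distal window, can be sketched as follows; the profiles, window, and shift grid are invented for illustration:

```python
# Most-likely-shift sketch: minimize the mean absolute difference between
# two activity depth profiles over candidate shifts in a distal window.
# Profiles, window, and shift grid are invented.
import numpy as np

z = np.arange(0.0, 200.0, 1.0)                    # depth grid, mm

def activity(fall_off):                           # toy beta+ activity profile
    return 1.0 / (1.0 + np.exp((z - fall_off) / 3.0))

measured = activity(150.0)                        # e.g., PET measurement
predicted = activity(146.0)                       # e.g., Monte Carlo prediction

distal = (z > 130.0) & (z < 170.0)                # distal region of interest
shifts = np.arange(-10.0, 10.5, 0.5)              # candidate shifts, mm
cost = [np.abs(np.interp(z[distal] + s, z, predicted) - measured[distal]).mean()
        for s in shifts]
best = shifts[int(np.argmin(cost))]
# -4.0 mm here: the measured fall-off lies 4 mm deeper than predicted
print(f"most likely shift: {best:+.1f} mm")
```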

  7. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  8. Improving Speaker Verification Performance in Presence of Spoofing Attacks Using Out-of-Domain Spoofed Data

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Sahidullah, Md; Tan, Zheng-Hua

    2017-01-01

    Automatic speaker verification (ASV) systems are vulnerable to spoofing attacks using speech generated by voice conversion and speech synthesis techniques. Commonly, a countermeasure (CM) system is integrated with an ASV system for improved protection against spoofing attacks. But integration of the two systems is challenging and often leads to increased false rejection rates. Furthermore, the performance of CM severely degrades if in-domain development data are unavailable. In this study, therefore, we propose a solution that uses two separate background models – one from human speech and one from spoofed speech. The approach is evaluated on the ASVspoof 2015 corpus, consisting of text-independent ASV tasks with short utterances. Our proposed system reduces error rates in the presence of spoofing attacks by using out-of-domain spoofed data for system development, while maintaining the performance for zero-effort imposter attacks.

  9. Verification and validation of a Work Domain Analysis with turing machine task analysis.

    Science.gov (United States)

    Rechard, J; Bignon, A; Berruet, P; Morineau, T

    2015-03-01

    While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism, named "Turing Machine Task Analysis", to verify and validate work domain models. The application of this method to two work domain analyses, one of car driving, which is an "intentional" domain, and the other of a ship water system, which is a "causal" domain, showed the possibility of highlighting improvements needed by these models. More precisely, the step-by-step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the initial modelling, like overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model.

  10. INTERNAL MEASUREMENTS FOR FAILURE ANALYSIS AND CHIP VERIFICATION OF VLSI CIRCUITS

    OpenAIRE

    Kölzer, J.; Otto, J.

    1989-01-01

    Chip verification and failure analysis during the design evaluation of very large scale integrated (VLSI) devices call for highly accurate internal analysis methods. After the first silicon has been characterized by automated functional testing, classification and statistical analysis can be carried out: in this way a rough electrical evaluation of the material under investigation can be made. Further clues to faulty device behavior can only be obtained by internal measurements. Serious malf...

  11. Verification of HYDRASTAR: Analysis of hydraulic conductivity fields and dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Morris, S.T.; Cliffe, K.A. [AEA Technology, Harwell (United Kingdom)

    1994-10-01

    HYDRASTAR is a code for the stochastic simulation of groundwater flow. It can be used to simulate both time-dependent and steady-state groundwater flow at constant density. Realizations of the hydraulic conductivity field are generated using the Turning Bands algorithm. The realizations can be conditioned on measured values of the hydraulic conductivity using Kriging. This report describes a series of verification studies that have been carried out on the code. The first study concerns the accuracy of the implementation of the Turning Bands algorithm in HYDRASTAR. The implementation has been examined by evaluating the ensemble mean and covariance of the generated fields analytically and comparing them with their prescribed values. Three other studies were carried out in which HYDRASTAR was used to solve problems of uniform mean flow and to calculate the transport and dispersion of fluid particles. In all three cases the hydraulic conductivity fields were unconditioned. The first two were two-dimensional: one at small values of the variance of the logarithm of the hydraulic conductivity, for which analytical results exist with which the code can be compared, and one at moderate variance, where the results can only be compared with those obtained by another code. The third problem was three-dimensional with a small variance, and again analytical results are available for comparison. 14 refs, 24 figs.
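
    The ensemble check described (comparing the empirical mean and covariance of generated fields with their prescribed values) can be illustrated with a simple Cholesky-based Gaussian field generator standing in for the Turning Bands algorithm; the grid, variance, and correlation length are invented:

```python
# Ensemble mean/covariance check for a Gaussian random field generator;
# a 1-D Cholesky sampler with exponential covariance stands in for the
# Turning Bands algorithm, and all parameters are invented.
import numpy as np

x = np.linspace(0.0, 100.0, 101)                  # 1-D grid, m
var, corr_len = 1.0, 10.0
C = var * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))
rng = np.random.default_rng(2)
fields = rng.standard_normal((2000, x.size)) @ L.T  # zero-mean realizations

print("max |ensemble mean|:", np.abs(fields.mean(axis=0)).max())  # ~0
print("max |cov error|:", np.abs(np.cov(fields, rowvar=False) - C).max())
```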

  12. Performance Assessment and Scooter Verification of Nano-Alumina Engine Oil

    Directory of Open Access Journals (Sweden)

    Yu-Feng Lue

    2016-09-01

    The performance assessment and vehicle verification of nano-alumina (Al2O3) engine oil (NAEO) were conducted in this study. The NAEO was produced by mixing Al2O3 nanoparticles with engine oil using a two-step synthesis method. The weight fractions of the Al2O3 nanoparticles in the four test samples were 0 (base oil), 0.5, 1.5, and 2.5 wt. %. The measurement of basic properties included: (1) density; (2) viscosity at various sample temperatures (20–80 °C). A rotary tribology testing machine with a pin-on-disk apparatus was used for the wear test. The measurement of the before-and-after difference in specimen (disk) weight in the wear test indicates that the NAEO with 1.5 wt. % Al2O3 nanoparticles (1.5 wt. % NAEO) was the chosen candidate for further study. For the scooter verification on an auto-pilot dynamometer, there were three tests, including: (1) the European Driving Cycle (ECE40); (2) constant speed (50 km/h); and (3) constant throttle positions (20%, 40%, 60%, and 90%). For the ECE40 driving cycle and the constant speed tests, the fuel consumption was decreased on average by 2.75%, while it was decreased by 3.57% for the constant throttle case. The experimental results prove that the engine oil with added Al2O3 nanoparticles significantly decreased the fuel consumption. In the future, experiments with property tests of other nano-engine oils and a performance assessment of the nano-engine-fuel will be conducted.

  13. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
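
    The Weibull statistics central to such ceramic verification transfer coupon-level strength data to the full-scale structure through the stressed volume; the sketch below uses the standard two-parameter form P_f = 1 - exp[-(V/V0)·(sigma/sigma0)^m] with invented numbers:

```python
# Weibull size-effect sketch: P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m);
# modulus, characteristic strength, and volumes are invented.
import math

m = 10.0               # Weibull modulus from coupon tests (assumed)
sigma0 = 200.0         # characteristic coupon strength, MPa (assumed)
V0, V = 1.0, 50.0      # coupon vs full-scale stressed volume (assumed)

def p_fail(sigma):
    return 1.0 - math.exp(-(V / V0) * (sigma / sigma0) ** m)

# allowable stress for a 1e-4 failure probability on the full-scale part
target = 1.0e-4
sigma_allow = sigma0 * (-math.log(1.0 - target) * V0 / V) ** (1.0 / m)
print(f"P_f(100 MPa) = {p_fail(100.0):.2e}, allowable = {sigma_allow:.1f} MPa")
```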

  14. Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun

    2005-07-15

    Nuclear design for the SMART reactor is performed by using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of the calculation results with measured ones. Therefore, the uncertainties in the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code has been verified by the Kurchatov critical experiments similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.

  15. Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor, Rev. 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun

    2005-12-15

    Nuclear design for the SMART reactor is performed by using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of the calculation results with measured ones. Therefore, the uncertainties in the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code has been verified by the Kurchatov critical experiments similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.

  16. Modeling and verification of insider threats using logical analysis

    NARCIS (Netherlands)

    Kammüller, Florian; Probst, Christian W.

    2016-01-01

    In this paper, we combine formal modeling and analysis of infrastructures of organizations with sociological explanation to provide a framework for insider threat analysis. We use the higher order logic (HOL) proof assistant Isabelle/HOL to support this framework. In the formal model, we exhibit and ...

  17. Analysis-Based Verification: A Programmer-Oriented Approach to the Assurance of Mechanical Program Properties

    Science.gov (United States)

    2010-05-27

    ...verifying analyses and allow users to understand how the tool reached its conclusions. Bandera [30] is a system that extracts models from Java source for verification by a model checker and maps verifier outputs back to the original source code. Bandera represents, similar to drop-sea, an effort to establish an effective architecture for assurance but is focused on model checking rather than program analysis. Similar to our work, Bandera, and other...

  18. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    Energy Technology Data Exchange (ETDEWEB)

    BRATZEL, D.R.

    2000-09-28

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks.

  19. Performance evaluation of a right atrial automatic capture verification algorithm using two different sensing configurations.

    Science.gov (United States)

    Sperzel, Johannes; Goetze, Stephan; Kennergren, Charles; Biffi, Mauro; Brooke, M Jason; Vireca, Elisa; Saha, Sunipa; Schubert, Bernd; Butter, Christian

    2009-05-01

    This acute data collection study evaluated the performance of a right atrial (RA) automatic capture verification (ACV) algorithm based on evoked response sensing from two electrode configurations during independent unipolar pacing. RA automatic threshold tests were conducted. Evoked response signals were simultaneously recorded between the RA(Ring) electrode and an empty pacemaker housing electrode (RA(Ring)-->Can) and the electrically isolated indifferent header electrode (RA(Ring)-->Ind). The atrial evoked response (AER) and the performance of the ACV algorithm were evaluated off-line using each sensing configuration. An accurate threshold measurement was defined as within 0.2 V of the unipolar threshold measured manually. Threshold tests were designed to fail for small AER amplitudes. AER signals were analyzed from 34 patients who were indicated for a pacemaker (five), implantable cardioverter-defibrillator (11), or cardiac resynchronization therapy pacemaker (six) or defibrillator (12). The minimum AER amplitude was significantly larger from RA(Ring)-->Can (1.6+/-0.9 mV) than from RA(Ring)-->Ind (1.3+/-0.8 mV). The algorithm successfully measured the pacing threshold in 96.8% and 91.0% of tests for RA(Ring)-->Can and RA(Ring)-->Ind, respectively. No statistical difference between the unipolar and bipolar pacing thresholds was observed. The RA(Ring)-->Can AER sensing configuration may provide a means of implementing an independent pacing/sensing method for ACV in the RA. RA bipolar pacing therapy based on measured RA unipolar pacing thresholds may be feasible.

  20. Performance Verification of the Gravity and Extreme Magnetism Small Explorer GEMS X-Ray Polarimeter

    Science.gov (United States)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kanako, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi

    2014-01-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight-equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at the Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor greater than or equal to 35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV, where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves the expected modulation angle and the expected modulation factor, approximately 20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
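
    The two figures of merit named here are standard in X-ray polarimetry: the modulation factor mu = (Nmax - Nmin)/(Nmax + Nmin) of the azimuthal count distribution, and the minimum detectable polarization at 99% confidence, MDP99 = 4.29/(mu·Rs)·sqrt((Rs + Rb)/T). The sketch below evaluates them with invented rates and exposure; the conventional MDP formula is assumed, not quoted from the paper.

```python
# Standard polarimetry figures of merit; rates and exposure are invented,
# and the conventional 99%-confidence MDP formula is assumed.
import math

def modulation_factor(n_max, n_min):
    return (n_max - n_min) / (n_max + n_min)

print(modulation_factor(135.0, 65.0))  # 0.35, e.g., from azimuthal count extremes

mu100 = 0.35        # modulation factor for 100% polarized X-rays (from the text)
rate_src = 1.0      # source count rate, counts/s (assumed)
rate_bkg = 0.1      # background count rate, counts/s (assumed)
t_obs = 1.0e5       # observation time, s (assumed)

mdp99 = 4.29 / (mu100 * rate_src) * math.sqrt((rate_src + rate_bkg) / t_obs)
print(f"MDP(99%) = {100.0 * mdp99:.1f}%")   # ~4.1% for these numbers
```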

  1. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    ...a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to ... used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME...

  2. Structural Dynamics Verification of Rotorcraft Comprehensive Analysis System (RCAS)

    Energy Technology Data Exchange (ETDEWEB)

    Bir, G. S.

    2005-02-01

    The Rotorcraft Comprehensive Analysis System (RCAS) was acquired and evaluated as part of an ongoing effort by the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to provide state-of-the-art wind turbine modeling and analysis technology for Government and industry. RCAS is an interdisciplinary tool offering aeroelastic modeling and analysis options not supported by current codes. RCAS was developed during a 4-year joint effort among the U.S. Army's Aeroflightdynamics Directorate, Advanced Rotorcraft Technology Inc., and the helicopter industry. The code draws heavily from its predecessor 2GCHAS (Second Generation Comprehensive Helicopter Analysis System), which required an additional 14 years to develop. Though developed for the rotorcraft industry, its general-purpose features allow it to model or analyze a general dynamic system. Its key feature is a specialized finite element that can model spinning flexible parts. The code, therefore, appears particularly suited for wind turbines whose dynamics is dominated by massive flexible spinning rotors. In addition to the simulation capability of the existing codes, RCAS [1-3] offers a range of unique capabilities, including aeroelastic stability analysis, trim, state-space modeling, operating modes, modal reduction, multi-blade coordinate transformation, periodic-system-specific analysis, choice of aerodynamic models, and a controls design/implementation graphical interface.

  3. Improving Speaker Verification Performance in Presence of Spoofing Attacks Using Out-of-Domain Spoofed Data

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Sahidullah, Md; Tan, Zheng-Hua

    2017-01-01

    Automatic speaker verification (ASV) systems are vulnerable to spoofing attacks using speech generated by voice conversion and speech synthesis techniques. Commonly, a countermeasure (CM) system is integrated with an ASV system for improved protection against spoofing attacks. But integration...

  4. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    This document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  5. Development and performance validation of a cryogenic linear stage for SPICA-SAFARI verification

    Science.gov (United States)

    Ferrari, Lorenza; Smit, H. P.; Eggens, M.; Keizer, G.; de Jonge, A. W.; Detrain, A.; de Jonge, C.; Laauwen, W. M.; Dieleman, P.

    2014-07-01

    In the context of the SAFARI instrument (SpicA FAR-infrared Instrument) SRON is developing a test environment to verify the SAFARI performance. The characterization of the detector focal plane will be performed with a back-illuminated pinhole over a reimaged SAFARI focal plane by an XYZ scanning mechanism that consists of three linear stages stacked together. In order to reduce background radiation that can couple into the high-sensitivity cryogenic detectors (goal NEP of 2×10⁻¹⁹ W/√Hz and saturation power of a few femtowatts), the scanner is mounted inside the cryostat in the 4 K environment. The required readout accuracy is 3 μm and the required reproducibility 1 μm along the total travel of 32 mm. The stage will be operated in "on the fly" mode to prevent vibrations of the scanner mechanism and will move at constant speeds varying from 60 μm/s to 400 μm/s. In order to meet the requirements of large stroke, low dissipation (low friction) and high accuracy, a DC motor plus spindle stage solution has been chosen. In this paper we present the stage design and stage characterization, describing also the measurement setup. The room temperature performance has been measured with a 3D measuring machine cross-calibrated with a laser interferometer and a 2-axis tilt sensor. The low temperature verification has been performed in a wet 4 K cryostat using a laser interferometer for measuring the linear displacements and a theodolite for measuring the angular displacements. The angular displacements can be calibrated with a precision of 4 arcsec and the position can be determined with high accuracy. The presence of friction caused higher values of torque than predicted and consequently higher dissipation. The thermal model of the stage has also been verified at 4 K.

  6. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    The analysis becomes more complicated when considering the shape and phase of the ground below the seawater; therefore, different approaches are required to precisely analyze the behavior of a tsunami. This paper introduces on-going code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH) and its verification work with some practice simulations. This paper summarizes the on-going development and verification activities on the Lagrangian mesh-free SPH code in SNU. The newly developed code currently covers the equations of motion and the heat conduction equation, and verification of each model has been completed. In addition, parallel computation using GPUs is now possible, and a GUI is also prepared. By changing the input geometry or input values, users can run simulations for various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometries and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be much extended, including molten fuel behavior in severe accidents.
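
    The defining operation of SPH is the kernel-weighted summation over neighbouring particles; the sketch below evaluates the density estimate rho_i = sum_j m_j W(|r_i - r_j|, h) in 1-D with the standard cubic spline kernel (grid, masses, and smoothing length are illustrative):

```python
# SPH density summation in 1-D with the standard cubic spline kernel;
# particle layout and smoothing length are illustrative.
import numpy as np

def w_cubic(r, h):
    """1-D cubic spline kernel, support 2h, normalization 2/(3h)."""
    q = np.abs(r) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return (2.0 / (3.0 * h)) * w

x = np.linspace(0.0, 1.0, 101)        # particle positions, uniform spacing
m = np.full(x.size, 0.01)             # masses chosen so that rho ~ 1
h = 1.2 * (x[1] - x[0])               # smoothing length

rho = np.array([(m * w_cubic(xi - x, h)).sum() for xi in x])
print(rho[50])                        # ~1.0 away from the boundaries
```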

  7. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    Science.gov (United States)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
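
    The geometric core of any Procrustes scheme, Bayesian or not, is the alignment step: centre, scale, and rotate one point configuration onto another, with the optimal rotation obtained from an SVD. A minimal non-Bayesian sketch with invented point sets (the paper's Bayesian machinery is not reproduced):

```python
# Ordinary Procrustes alignment: centre, scale, and rotate B onto A with
# the optimal rotation from an SVD. Shapes here are invented point sets.
import numpy as np

def procrustes_align(A, B):
    """Align B to A; both (n_points, dim). Returns aligned B and a statistic."""
    A0 = A - A.mean(axis=0)
    B0 = B - B.mean(axis=0)
    A0 = A0 / np.linalg.norm(A0)
    B0 = B0 / np.linalg.norm(B0)
    U, s, Vt = np.linalg.svd(A0.T @ B0)
    R = U @ Vt                        # optimal rotation (B0 @ R.T matches A0)
    return B0 @ R.T, s.sum()

A = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)   # a unit square
th = np.pi / 6                                                # 30-degree rotation
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
B = 2.0 * (A @ R.T) + 5.0                                     # similarity transform

aligned, stat = procrustes_align(A, B)
target = (A - A.mean(0)) / np.linalg.norm(A - A.mean(0))
print(np.abs(aligned - target).max())                         # ~0: shapes coincide
```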

  8. Interprocedural Analysis and the Verification of Concurrent Programs

    Science.gov (United States)

    2009-01-01

    ...first k execution contexts. Using that table, it figures out a valuation of Var_G^(k+1) to continue the analysis of T_1^s, and stores the effect that T_1^s ... reached when T_1^s is started in state (1, g_1, ..., g_(k+1)), because T_1^s could not have touched Var_G^(k+1) before the increment that changed k to k + 1. The ...

  9. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.
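
    A generic sketch of the kind of wavelet feature pipeline such schemes use: low-frequency (approximation) subband coefficients as the feature vector, compared by Euclidean distance against an enrolled template. PyWavelets, the random stand-in images, and the decomposition settings are assumptions, not the paper's configuration.

```python
# Generic wavelet-feature verification sketch using PyWavelets; images,
# wavelet, level, and the whole pipeline are illustrative assumptions.
import numpy as np
import pywt

def features(img, wavelet="haar", level=3):
    approx = pywt.wavedec2(img, wavelet, level=level)[0].ravel()
    approx = approx - approx.mean()          # drop the DC component
    return approx / (np.linalg.norm(approx) + 1e-12)

rng = np.random.default_rng(3)
enrolled = rng.random((64, 64))                       # stand-in enrolled face
probe_same = enrolled + 0.05 * rng.random((64, 64))   # same "person", perturbed
probe_diff = rng.random((64, 64))                     # different "person"

d_same = np.linalg.norm(features(enrolled) - features(probe_same))
d_diff = np.linalg.norm(features(enrolled) - features(probe_diff))
print(f"distance same={d_same:.3f}, diff={d_diff:.3f}")  # same << diff
# a real verifier would compare these distances against a tuned threshold
```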

  10. Verification and Validation of the BISON Fuel Performance Code for PCMI Applications

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle Allan Lawrence [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Gardner, Russell James [Idaho National Laboratory; Perez, Danielle Marie [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-06-01

    BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. A brief overview of BISON’s computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described. Validation for application to light water reactor (LWR) PCMI problems is assessed by comparing predicted and measured rod diameter following base irradiation and power ramps. Results indicate a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. Initial rod diameter comparisons have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.

  11. Calibrations and verifications performed in view of the ILA reinstatement at JET

    Energy Technology Data Exchange (ETDEWEB)

    Dumortier, P., E-mail: pierre.dumortier@rma.ac.be; Durodié, F. [LPP-ERM-KMS, TEC partner, Brussels (Belgium); Helou, W. [CEA, IRFM, F-13108 St-Paul-Lez-Durance (France); Monakhov, I.; Noble, C.; Wooldridge, E.; Blackman, T.; Graham, M. [CCFE, Culham Science Centre, Abingdon (United Kingdom); Collaboration: EUROfusion Consortium

    2015-12-10

    The calibrations and verifications performed in preparation for the ITER-Like antenna (ILA) reinstatement at JET are reviewed. A brief reminder of the ILA system layout is given. The different calibration methods and results are then discussed. They encompass the calibrations of the directional couplers present in the system, the determination of the relation between the capacitor position readings and the capacitance value, the voltage probes calibration inside the antenna housing, the RF cables characterization and the acquisition electronics circuit calibration. Earlier experience with the ILA has shown that accurate calibrations are essential for the control of the full ILA close-packed antenna array, its protection through the S-Matrix Arc Detection and the new second-stage matching algorithm to be implemented. Finally, the voltage stand-off of the capacitors is checked and the phase range achievable with the system is verified. The system layout is modified so as to allow dipole operation over the whole operating frequency range when operating with the 3 dB combiner-splitters.

  12. Calibrations and verifications performed in view of the ILA reinstatement at JET

    Science.gov (United States)

    Dumortier, P.; Durodié, F.; Helou, W.; Monakhov, I.; Noble, C.; Wooldridge, E.; Blackman, T.; Graham, M.

    2015-12-01

    The calibrations and verifications performed in preparation for the ITER-Like antenna (ILA) reinstatement at JET are reviewed. A brief reminder of the ILA system layout is given. The different calibration methods and results are then discussed. They encompass the calibrations of the directional couplers present in the system, the determination of the relation between the capacitor position readings and the capacitance value, the voltage probes calibration inside the antenna housing, the RF cables characterization and the acquisition electronics circuit calibration. Earlier experience with the ILA has shown that accurate calibrations are essential for the control of the full ILA close-packed antenna array, its protection through the S-Matrix Arc Detection and the new second-stage matching algorithm to be implemented. Finally, the voltage stand-off of the capacitors is checked and the phase range achievable with the system is verified. The system layout is modified so as to allow dipole operation over the whole operating frequency range when operating with the 3 dB combiner-splitters.

  13. The USP Performance Verification Test, Part II: collaborative study of USP's Lot P Prednisone Tablets.

    Science.gov (United States)

    Glasgow, Maria; Dressman, Shawn; Brown, William; Foster, Thomas; Schuber, Stefan; Manning, Ronald G; Wahab, Samir Z; Williams, Roger L; Hauck, Walter W

    2008-05-01

    Periodic performance verification testing (PVT) is used by laboratories to assess and demonstrate proficiency and for other purposes as well. For dissolution, the PVT is specified in the US Pharmacopeia General Chapter Dissolution under the title Apparatus Suitability Test. For Apparatus 1 and 2, USP provides two reference standard tablets for this purpose. For each new lot of these reference standards, USP conducts a collaborative study. For new USP Lot P Prednisone Tablets, 28 collaborating laboratories provided data. The study was conducted with three sets of tablets: Lot O open label, Lot O blinded, and Lot P blinded. The blinded Lot O data were used for apparatus suitability testing. Acceptance limits were determined after dropping data due to failure of apparatus suitability, identification of data as unusual on control charts, or protocol violations. Results yielded acceptance criteria of (47, 82) for Apparatus 1 and (37, 70) for Apparatus 2. Results generally were similar for Lot P compared to results from Lot O except that the average percent dissolved for Lot P is greater than for Lot O with Apparatus 2.

  14. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against measurements on a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free vortex design method. The detailed blading design is then carried out by using an experimental database of double circular arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity, and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. Experimental measurements are conducted under non-cavitating conditions to obtain the off-design performance curve, and a cavitation test is carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions of the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
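
    The record's solution strategy (iterating a set of coupled correlation equations with an under-relaxation factor) is a standard numerical device. Below is a minimal sketch of under-relaxed fixed-point iteration; the two-equation system g is a hypothetical stand-in, since the record does not reproduce the eight pump-design equations.

        import numpy as np

        def solve_under_relaxed(g, x0, alpha=0.5, tol=1e-10, max_iter=500):
            """Fixed-point iteration x <- (1 - alpha)*x + alpha*g(x).

            alpha < 1 damps each update, which often stabilizes coupled
            design correlations that diverge under plain substitution.
            """
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                x_new = (1.0 - alpha) * x + alpha * np.asarray(g(x))
                if np.max(np.abs(x_new - x)) < tol:
                    return x_new
                x = x_new
            raise RuntimeError("did not converge")

        # Hypothetical stand-in for the coupled correlations, e.g.
        # x = [incidence, deviation], each depending on the other:
        g = lambda x: np.array([0.3 * np.cos(x[1]), 0.5 * np.sin(x[0]) + 1.0])
        print(solve_under_relaxed(g, x0=[0.0, 1.0]))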

  15. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements showed a very substantial benefit in coding efficiency for the SCC extensions and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
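
    The Bjøntegaard-delta (BD) bit-rate metric cited above is a standard calculation: each rate-distortion curve is fitted with a cubic polynomial in the log-rate domain, and the average gap is integrated over the overlapping quality range. A self-contained sketch (the rate/PSNR points below are illustrative, not the paper's data):

        import numpy as np

        def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
            """Bjoentegaard-delta bit-rate: average % rate change at equal PSNR.

            Fits cubic polynomials to log10(rate) as a function of PSNR and
            integrates the gap over the overlapping quality range.
            """
            lr_a, lr_t = np.log10(rate_anchor), np.log10(rate_test)
            p_a = np.polyfit(psnr_anchor, lr_a, 3)
            p_t = np.polyfit(psnr_test, lr_t, 3)
            lo = max(min(psnr_anchor), min(psnr_test))
            hi = min(max(psnr_anchor), max(psnr_test))
            int_a = np.polyval(np.polyint(p_a), hi) - np.polyval(np.polyint(p_a), lo)
            int_t = np.polyval(np.polyint(p_t), hi) - np.polyval(np.polyint(p_t), lo)
            avg_diff = (int_t - int_a) / (hi - lo)
            return (10.0 ** avg_diff - 1.0) * 100.0  # negative => bit-rate savings

        # Four rate (kbps) / PSNR (dB) points per codec, illustrative only:
        print(bd_rate([1000, 2000, 4000, 8000], [34.0, 36.5, 39.0, 41.5],
                      [ 400,  900, 1900, 4000], [34.2, 36.8, 39.2, 41.7]))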

  16. Infrared thermal facial image sequence registration analysis and verification

    Science.gov (United States)

    Chen, Chieh-Li; Jian, Bo-Lin

    2015-03-01

    To study the emotional responses of subjects to the International Affective Picture System (IAPS), infrared thermal facial image sequences are preprocessed for registration before further analysis, such that the variance caused by minor and irregular subject movements is reduced. Without affecting the comfort level of the subjects or inducing any harm, this study proposes an infrared thermal facial image sequence registration process that reduces the deviations caused by their unconscious head movements. A fixed reference image for registration is produced through localization of the centroid of the eye region, together with image translation and rotation. The thermal image sequence is then automatically registered using the proposed two-stage genetic algorithm. The deviation before and after image registration is quantified by image quality indices. The results show that the proposed infrared thermal image sequence registration process localizes facial images accurately, which will benefit the correlation analysis of psychological information related to the facial area.
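
    A minimal sketch of the rigid (translate-then-rotate) alignment step described above. The actual study localizes the centroid of the eye region and searches the rotation with a two-stage genetic algorithm; this sketch uses a crude hot-region mask as a stand-in and takes the rotation angle as given:

        import numpy as np
        from scipy import ndimage

        def register_frame(frame, ref_centroid, angle_deg):
            """Rigidly align one thermal frame: translate so the centroid of
            the hottest region matches the reference centroid, then rotate
            by the given angle about the image centre."""
            hot = frame > np.percentile(frame, 95)   # crude eye-region mask (assumption)
            cy, cx = ndimage.center_of_mass(hot)
            shifted = ndimage.shift(frame, (ref_centroid[0] - cy,
                                            ref_centroid[1] - cx), order=1)
            return ndimage.rotate(shifted, angle_deg, reshape=False, order=1)

        frame = np.random.rand(120, 160)             # stand-in IR frame
        aligned = register_frame(frame, ref_centroid=(60.0, 80.0), angle_deg=1.5)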

  17. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database [1] (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to extract local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have been implemented successfully for security [2] (text encryption/decryption), biomedical [3], and marketing [4] applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need for an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
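
    The spectral angle mapper (SAM) comparison mentioned above has a simple closed form: the angle between the test spectrum and the reference (here, the mean of the ideal population set), treated as vectors. A sketch with stand-in data:

        import numpy as np

        def spectral_angle(test, reference):
            """Spectral angle mapper (SAM): angle in radians between two spectra.

            Smaller angles mean the test spectrum is closer in shape to the
            reference (e.g., the mean spectrum of an ideal population set).
            """
            t = np.asarray(test, float).ravel()
            r = np.asarray(reference, float).ravel()
            cos = np.dot(t, r) / (np.linalg.norm(t) * np.linalg.norm(r))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        population = np.random.rand(20, 400)   # 20 spectra x 400 bands (stand-in)
        mean_spec = population.mean(axis=0)
        print(spectral_angle(population[0], mean_spec))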

  18. Performance Analysis of MYSEA

    Science.gov (United States)

    2012-09-01

    Abbreviations (excerpt): FSD, Federated Services Daemon; I&A, Identification and Authentication; IKE, Internet Key Exchange; KPI, Key Performance Indicator; LAN, Local Area Network. […] inspection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be […] application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties

  19. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    Science.gov (United States)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand-cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m². Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high-velocity, low-volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision-cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 °C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has units of ppm of carbon per mg/ft² of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m².
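
    The conversion implied here, from a measured TOC reading to an NVR level through an empirically determined sensitivity factor, is a one-line calculation. The numbers below are hypothetical, not the report's measured sensitivities:

        def nvr_level(toc_ppm, sensitivity_ppm_per_mg_ft2):
            """NVR surface loading (mg/ft^2) from total organic carbon (ppm C).

            sensitivity_ppm_per_mg_ft2 is the experimentally measured response,
            ppm of carbon per (mg of contaminant per ft^2), per the record.
            """
            return toc_ppm / sensitivity_ppm_per_mg_ft2

        # Hypothetical values for illustration only:
        print(nvr_level(toc_ppm=2.4, sensitivity_ppm_per_mg_ft2=3.0))  # -> 0.8 mg/ft^2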

  20. Acquisition System Verification for Energy Efficiency Analysis of Building Materials

    Directory of Open Access Journals (Sweden)

    Natalia Cid

    2017-08-01

    Full Text Available Climate change and fossil fuel depletion foster interest in improving energy efficiency in buildings. There are different methods to achieve improved efficiency; one of them is the use of additives, such as phase change materials (PCMs. To prove this method’s effectiveness, a building’s behaviour should be monitored and analysed. This paper describes an acquisition system developed for monitoring buildings based on Supervisory Control and Data Acquisition (SCADA and with a 1-wire bus network as the communication system. The system is empirically tested to prove that it works properly. With this purpose, two experimental cubicles are made of self-compacting concrete panels, one of which has a PCM as an additive to improve its energy storage properties. Both cubicles have the same dimensions and orientation, and they are separated by six feet to avoid shadows. The behaviour of the PCM was observed with the acquisition system, achieving results that illustrate the differences between the cubicles directly related to the PCM’s characteristics. Data collection devices included in the system were temperature sensors, some of which were embedded in the walls, as well as humidity sensors, heat flux density sensors, a weather station and energy counters. The analysis of the results shows agreement with previous studies of PCM addition; therefore, the acquisition system is suitable for this application.
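
    The record does not name the sensor models, but a common way to poll temperature sensors on a 1-wire bus is the Linux w1 sysfs interface; the DS18B20-style device family (28-*) and file layout below are assumptions for illustration:

        from pathlib import Path

        W1_DIR = Path("/sys/bus/w1/devices")  # Linux 1-wire sysfs (assumes w1-gpio/w1-therm)

        def read_temperatures():
            """Poll every DS18B20-style sensor on the 1-wire bus, in deg C.

            The record's acquisition system uses a 1-wire bus network; the
            sensor family (28-*) and sysfs layout here are assumptions.
            """
            readings = {}
            for dev in W1_DIR.glob("28-*"):
                raw = (dev / "w1_slave").read_text()
                if "YES" in raw.splitlines()[0]:        # CRC check passed
                    millideg = int(raw.rsplit("t=", 1)[1])
                    readings[dev.name] = millideg / 1000.0
                # else: drop the sample and let the next poll retry
            return readings

        if __name__ == "__main__":
            print(read_temperatures())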

  1. Mass spectral analysis of synthones of nerve agents for verification of the Chemical Weapons Convention.

    Science.gov (United States)

    Gupta, Arvind K; Shakya, Purushottam D; Pardasani, Deepak; Palit, Meehir; Dubey, Devendra K

    2005-01-01

    This communication describes the synthesis and gas chromatography/mass spectrometric (GC/MS) analysis of N,N-dialkylphosphoramidic dihalides and alkylphosphonic difluorides, which are synthones of nerve agents. The study was undertaken with a view to developing a spectral database of these compounds for verification purposes of the Chemical Weapons Convention (CWC). The modified synthetic approach reported here has advantages over traditional syntheses in terms of time and yield. GC/MS analysis of these synthones yielded electron ionization (EI) mass spectra and, based on these spectra, generalized fragmentation routes are proposed that rationalize most of the characteristic ions. Copyright 2005 John Wiley & Sons, Ltd.

  2. Validation and Verification of the Operational Land Analysis Activities at the Air Force Weather Agency

    Science.gov (United States)

    Shaw, M.; Kumar, S.; Peters-Lidard, C. D.; Cetola, J.

    2011-12-01

    The importance of operational benchmarking and uncertainty characterization in land surface modeling becomes clear upon considering the wide range of performance characteristics that numerical land surface models can exhibit through various combinations of factors. Such factors might include model physics and numerics, resolution, and the forcing datasets used in operational implementation versus those involved in any prior development benchmarking. Decisions concerning operational implementation may therefore be better informed through more effective benchmarking of performance under various blends of these operational factors. To facilitate this and other needs for land analysis activities at the Air Force Weather Agency (AFWA), the Model Evaluation Toolkit (MET) - a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community - and the Land Information System (LIS) Verification Toolkit (LVT) - developed at the Goddard Space Flight Center (GSFC) - have been adapted to the operational benchmarking needs of AFWA's land characterization activities, in order to compare the performance of new land modeling and related activities with that of previous activities as well as with observational or analyzed datasets. In this talk, three examples of adaptations of MET and LVT to evaluation of LIS-related operations at AFWA will be presented. One example will include comparisons of new surface rainfall analysis capabilities, towards forcing of AFWA's LIS, with previous capabilities. Comparisons will be relative to retrieval-, model-, and measurement-based precipitation fields. Results generated via MET's grid-stat, neighborhood, wavelet, and object-based evaluation (MODE) utilities adapted to AFWA's needs will be discussed. This example will be framed in the context of better informing optimal blends of land surface model (LSM) forcing data sources - namely precipitation data - under
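
    MET's grid-stat utility reduces paired forecast/observation grids to continuous statistics such as bias, RMSE, and correlation. A stand-alone sketch of that idea (not MET's implementation, and with random stand-in fields):

        import numpy as np

        def continuous_stats(forecast, observed):
            """Grid-to-grid continuous verification: bias, RMSE, correlation."""
            f = np.asarray(forecast, float).ravel()
            o = np.asarray(observed, float).ravel()
            err = f - o
            return {"bias": float(err.mean()),
                    "rmse": float(np.sqrt((err ** 2).mean())),
                    "corr": float(np.corrcoef(f, o)[0, 1])}

        rain_analysis = np.random.gamma(2.0, 2.0, (180, 360))    # stand-in fields
        rain_reference = rain_analysis + np.random.normal(0, 0.5, (180, 360))
        print(continuous_stats(rain_analysis, rain_reference))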

  3. Baghouse filtration products verification

    Energy Technology Data Exchange (ETDEWEB)

    Mycock, J.C.; Turner, J.H.; VanOsdell, D.W.; Farmer, J.R.; Brna, T.G.

    1998-11-01

    The paper introduces EPA's Air Pollution Control Technology Verification (APCT) program and then focuses on the immediate objective of the program: laboratory performance verification of cleanable filter media intended for the control of fine particulate emissions. Data collected during the laboratory verification testing, which simulates operation in full-scale fabric filters, will be used to show expected performance for collection of particles ≤2.5 µm in diameter.

  4. Prompt γ-ray activation analysis of Martian analogues at the FRM II neutron reactor and the verification of a Monte Carlo planetary radiation environment model

    Energy Technology Data Exchange (ETDEWEB)

    Skidmore, M.S. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester LE1 7RH (United Kingdom)], E-mail: mss16@star.le.ac.uk; Ambrosi, R.M.; Vernon, D. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester LE1 7RH (United Kingdom); Calzada, E. [Neutronentomographie ANTARES, Forschungsreaktor FRM II, Technische Universitaet Muenchen, D-85747 Garching (Germany); Benedix, G.K. [Department of Mineralogy, Natural History Museum, Cromwell Road, London SW7 5BD (United Kingdom); Buecherl, T. [Lehrstuhl fuer Radiochemie, TU Muenchen, Walther-Meissner-Str. 3, Garching 85748 (Germany); Schillinger, B. [Neutronentomographie ANTARES, Forschungsreaktor FRM II, Technische Universitaet Muenchen, D-85747 Garching (Germany)

    2009-08-11

    Planetary radiation environment modelling is important to assess the habitability of a planetary body. It is also useful when interpreting the γ-ray data produced by natural emissions from radioisotopes or by prompt γ-ray activation analysis. γ-ray spectra acquired in orbit or in-situ by a suitable detector can be converted into meaningful estimates of the concentration of certain elements on the surface of a planet. This paper describes the verification of a Monte Carlo model developed using the MCNPX code at the University of Leicester. The model predicts the performance of a geophysical package containing a γ-ray spectrometer operating at a depth of up to 5 m. The experimental verification of the Monte Carlo model was performed at the FRM II facility in Munich, Germany. The paper demonstrates that the model is in good agreement with the experimental data and can be used to model the performance of an in-situ γ-ray spectrometer.

  5. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    Science.gov (United States)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
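
    The record does not spell out its statistical procedure; a minimal sketch of the general idea, quantifying model accuracy with an error metric plus a confidence interval on the mean error instead of a visual check (the field values are hypothetical):

        import numpy as np
        from scipy import stats

        def model_accuracy(measured, simulated, confidence=0.95):
            """Quantify load-model accuracy: RMSE plus a confidence interval
            on the mean error, in place of a qualitative visual check."""
            err = np.asarray(simulated, float) - np.asarray(measured, float)
            rmse = float(np.sqrt(np.mean(err ** 2)))
            ci = stats.t.interval(confidence, len(err) - 1,
                                  loc=err.mean(), scale=stats.sem(err))
            return rmse, ci

        measured = np.array([100.0, 98.5, 101.2, 99.8, 100.5])    # field MW samples
        simulated = np.array([100.6, 99.0, 100.8, 100.4, 101.1])  # model response
        print(model_accuracy(measured, simulated))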

  6. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  7. Theoretical analysis and experimental verification of parabolic trough solar collector with hot water generation system

    Directory of Open Access Journals (Sweden)

    Valan-Arasu Amirtham

    2007-01-01

    Full Text Available The modeling of a parabolic trough collector hot water generation system with a well-mixed storage tank, using a computer simulation program, is presented in this paper. This is followed by an experimental verification of the model and an analysis of the experimental results. The maximum difference between the predicted and the actual storage tank water temperature is found to be only 9.59%. This variation is due to differences between the actual weather during the test period and the hourly values used in the simulation, and to the convection losses from the collector receiver, which were not constant as assumed by the computer simulation program.
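
    The paper's simulation equations are not reproduced in the record; a standard well-mixed storage-tank energy balance, integrated with an explicit Euler step, illustrates the kind of model being verified (all parameter values below are assumptions):

        import numpy as np

        # Minimal well-mixed storage-tank energy balance (a standard textbook
        # model, assumed here; the paper's own equations are not reproduced):
        #   m*cp*dT/dt = Q_collector(t) - UA*(T - T_ambient)
        def simulate_tank(q_collector, t_amb, m=200.0, cp=4186.0, ua=5.0,
                          t0=25.0, dt=3600.0):
            """Hourly explicit Euler integration of tank temperature (deg C)."""
            temps = [t0]
            for q, ta in zip(q_collector, t_amb):
                t = temps[-1]
                temps.append(t + dt * (q - ua * (t - ta)) / (m * cp))
            return np.array(temps)

        hours = np.arange(10)
        q = 1500.0 * np.clip(np.sin(np.pi * hours / 9), 0, None)  # W, stand-in gain
        print(simulate_tank(q, t_amb=np.full(10, 30.0)))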

  8. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes possible automated verification of large industrial designs with the use of only modest resources (less than 5 minutes on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATE™.

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Antenna Drive Subsystem METSAT AMSU-A2 (PN:1331200-2, SN:108)

    Science.gov (United States)

    Haapala, C.

    1999-01-01

    This is the Performance Verification Report, Antenna Drive Subassembly, Antenna Drive Subsystem, METSAT AMSU-A2 (P/N 1331200-2, SN: 108), for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).

  10. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    Science.gov (United States)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results of FLOWNET/TRUMP, the combined thermal-hydraulic and heat conduction analysis code utilized in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR): in particular, for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature new frontier technologies. The code was verified through comparison between analytical results and experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks.

  11. Underground verification of the large deflection performance of fibre reinforced shotcrete subjected to high stresses and convergence and to dynamic loading.

    CSIR Research Space (South Africa)

    Joughin, WC

    2002-04-01

    Full Text Available Final project report: Underground verification of the large deflection performance of fibre reinforced shotcrete subjected to high stresses and convergence and to dynamic loading. W.C. Joughin, J.L. Human and P.J. Terbrugge. Research agency: Steffen, Robertson and Kirsten. Project number: GAP 710. Date: April 2002. Executive summary: The underground verification of the performance of fibre reinforced shotcrete, subject to high stresses, convergence and dynamic loading, was identified...

  12. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool: […] of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions, and software performance-tuning optimisation through automated...

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  14. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
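
    As a small taste of the queuing results such a book covers, the classic M/M/1 single-server formulas (standard theory, not an excerpt from the book itself):

        def mm1_metrics(arrival_rate, service_rate):
            """Classic M/M/1 queue results (Poisson arrivals, exp. service).

            Returns utilization rho, mean number in system L, and mean
            sojourn time W, with L = rho/(1-rho) and W = L/lambda
            (Little's law).
            """
            rho = arrival_rate / service_rate
            if rho >= 1.0:
                raise ValueError("unstable queue: arrival rate >= service rate")
            L = rho / (1.0 - rho)
            W = L / arrival_rate
            return rho, L, W

        print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))  # rho=0.8, L=4, W=0.5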

  15. Analysis of an Indirect Neutron Signature for Enhanced UF6 Cylinder Verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, Jonathan A.; McDonald, Benjamin S.; Smith, Leon E.; Zalavadia, Mital A.; Webster, Jennifer B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.

  16. Analysis of an indirect neutron signature for enhanced UF6 cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.

  17. On-ground electrical performance verification strategies for large deployable reflector antennas

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Breinbjerg, Olav

    2012-01-01

    In this paper, possible verification strategies for large deployable reflector antennas are reviewed and analysed. One of the approaches considered to be the most feasible and promising is based on measurements of the feed characteristics, such as pattern and gain, and then calculation of the overall reflector antenna pattern and gain. The approach is further investigated by computer simulations. A reference result is obtained by simulation of the entire antenna using the Method of Moments. The measurements of the feed are then simulated in several configurations, including the feed alone...

  18. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna, including support and satellite structure, with appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  19. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available The performance of innovative environmental technologies can be verified by qualified third parties called "Verification Bodies". The "Statement of Verification" delivered at the end of the ETV process can be used as evidence that the claims made about...

  20. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, David S.

    2014-08-07

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, where known patterns can be mined for. With a human in the loop, it can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of

  1. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  2. Verification and large deformation analysis using the reproducing kernel particle method

    Energy Technology Data Exchange (ETDEWEB)

    Beckwith, Frank [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The reproducing kernel particle method (RKPM) is a meshless method used to solve general boundary value problems using the principle of virtual work. RKPM corrects the kernel approximation by introducing reproducing conditions which force the method to be complete to arbitrary-order polynomials selected by the user. Effort in recent years has led to the implementation of RKPM within the Sierra/SM physics software framework. The purpose of this report is to investigate the convergence of RKPM for verification and validation purposes, and to demonstrate the large-deformation capability of RKPM in problems where the finite element method is known to experience difficulty. Results from analyses using RKPM are compared against finite element analysis. A host of issues associated with RKPM are identified, and a number of potential improvements are discussed for future work.
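
    A textbook one-dimensional version of the kernel correction described above: the moment matrix enforces the reproducing conditions so the corrected shape functions exactly reproduce polynomials up to the chosen order. This is an illustrative sketch, not the Sierra/SM implementation:

        import numpy as np

        def rkpm_shape_functions(x, nodes, support=1.2, order=2):
            """1D RKPM shape functions evaluated at point x.

            A compactly supported kernel is corrected by moment conditions
            so the approximation exactly reproduces polynomials up to
            `order` (the reproducing conditions).
            """
            u = (x - nodes) / support
            phi = np.where(np.abs(u) < 1.0, (1.0 - np.abs(u)) ** 2, 0.0)  # simple kernel
            H = np.vander(x - nodes, order + 1, increasing=True).T  # rows: (x-xi)^k
            M = (H * phi) @ H.T                                     # moment matrix
            b = np.linalg.solve(M, np.eye(order + 1)[:, 0])         # M b = H(0)
            return (b @ H) * phi                                    # psi_i(x)

        nodes = np.linspace(0.0, 1.0, 11)
        psi = rkpm_shape_functions(0.37, nodes, support=0.25, order=2)
        print(psi.sum(), psi @ nodes)   # reproduces 1 and x: ~1.0 and ~0.37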

  3. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  4. Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features

    Science.gov (United States)

    Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed

    2012-01-01

    …during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine whether the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.

  5. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The design of the Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed, and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of the test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer code V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program, with the integral effect test facility STELLA-2, is in the detailed design stage. The sodium thermal-hydraulic experiment loop for the finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. A flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis of the core, and a dynamic characteristics test of the upper internal structure has been performed for the seismic analysis model of the PGSFR. Performance tests of the control rod assemblies (CRAs) have been conducted for the control rod drive mechanism driving parts, and drop tests of the CRA under scram conditions were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR are explained with significant results.

  6. Experimental verification of a high performed multiple-band metamaterial absorber

    Science.gov (United States)

    Zhang, Zhenya; Wang, Saisai

    2017-05-01

    In this paper, a thin-film multiple-band metamaterial absorber is experimentally verified and analyzed through calculation. Two absorption peaks higher than 99% and 98% are obtained at normal incidence. The resonances of the local surface plasmon (LSP) mode and the internal surface plasmon (ISP) mode lead to the two high absorption peaks. The impedance-matched condition underlies the two high absorption peaks. Measured results indicate that high absorption performance can be observed with different dielectric layer combinations (Al2O3-ZnSe, Al2O3-Al2O3, and ZnSe-ZnSe), which indicates that the designed metamaterial absorber is insensitive to the dielectric layer combination. High absorption performance is obtained under both TE and TM configurations at various incident angles.

  7. Safe Neighborhood Computation for Hybrid System Verification

    Directory of Open Access Journals (Sweden)

    Yi Deng

    2015-01-01

    Full Text Available For the design and implementation of engineering systems, performing model-based analysis can disclose potential safety issues at an early stage. The analysis of hybrid system models is in general difficult due to the intrinsic complexity of hybrid dynamics. In this paper, a simulation-based approach to formal verification of hybrid systems is presented.

  8. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta, Georgia, area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency were measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  9. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  10. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    Science.gov (United States)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. Key characteristics: (1) a range of satellite data products and surface observations used to generate the land analysis products; (2) global, 1/4-degree spatial resolution; (3) model analyses generated at 3-hour intervals. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.

  11. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    Science.gov (United States)

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2017-05-08

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase, and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, and selected the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (p < […]), […] age verification or relying exclusively on strategies that cannot effectively verify age. Effective age verification strategies such as online age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, p < […]), and age verification at delivery (6.4% in 2013 and 8.1% in 2014) were rarely advertised on IEV websites. Nearly all vendors advertised accepting credit cards, and about three-quarters advertised shipping via the United States Postal Service, similar to the internet cigarette industry prior to federal bans. The number of IEVs grew sharply from 2013 to 2014, with poor age verification practices. New and expanded regulations for online e-cigarette sales are needed, including strict age and identity verification requirements. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. Performance characteristics of an independent dose verification program for helical tomotherapy

    Directory of Open Access Journals (Sweden)

    Isaac C. F. Chang

    2017-01-01

    Full Text Available Helical tomotherapy, with its advanced method of intensity-modulated radiation therapy delivery, has been used clinically for over 20 years. The standard delivery quality assurance procedure, which measures the accuracy of the delivered radiation dose from each treatment plan to a phantom, is time-consuming. RadCalc®, a radiotherapy dose verification software package, has released a module for tomotherapy plan dose calculations, specifically for beta testing. RadCalc®'s accuracy for tomotherapy dose calculations was evaluated through examination of point doses in ten lung and ten prostate clinical plans. Doses calculated by the TomoHDA™ tomotherapy treatment planning system were used as the baseline. For lung cases, RadCalc® overestimated point doses in the lung by an average of 13%. Doses within the spinal cord and esophagus were overestimated by 10%. Prostate plans showed better agreement, with overestimations of 6% in the prostate, bladder, and rectum. The systematic overestimation likely resulted from limitations of the pencil-beam dose calculation algorithm implemented by RadCalc®. Limitations were more severe in areas of greater inhomogeneity and less prominent in regions of homogeneity with densities closer to 1 g/cm³. Recommendations for RadCalc® dose calculation algorithms and anatomical representation were provided based on the results of the study.
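
    The comparison metric used in the study, percent deviation of the independently calculated point dose from the treatment planning system baseline, is straightforward to compute; the dose values below are made up for illustration:

        import numpy as np

        def percent_deviation(independent_dose, tps_dose):
            """Per-point % deviation of an independent dose check from the TPS."""
            ind = np.asarray(independent_dose, float)
            tps = np.asarray(tps_dose, float)
            return 100.0 * (ind - tps) / tps

        # Hypothetical point doses (Gy) for one plan: TPS baseline vs. recalculation
        tps = np.array([2.00, 1.95, 2.10, 0.45])
        ind = np.array([2.26, 2.18, 2.33, 0.50])
        dev = percent_deviation(ind, tps)
        print(dev, f"mean overestimate: {dev.mean():.1f}%")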

  13. Control Performance Verification of Power System Stabilizer with an EDLC in Islanded Microgrid

    Science.gov (United States)

    Tanabe, Takayuki; Suzuki, Shigeyuki; Ueda, Yoshinobu; Ito, Takamitsu; Numata, Shigeo; Shimoda, Eisuke; Funabashi, Toshihisa; Yokoyama, Ryuichi

    We developed a power system stabilizer with an EDLC (electric double-layer capacitor) that makes it possible to operate microgrids autonomously from utility grids and to maintain the electric power quality in an islanded microgrid. This paper proposes two types of control systems, composed of a PFC (power flow compensator) and a CVCF (constant voltage, constant frequency) compensator. The installation location of the system with the CVCF compensator is not limited by hardware requirements, and the system can maintain the quality of electricity in the islanded microgrid. The CVCF compensator can also manage a dynamic load-sharing function, so it is not always necessary for this equipment to rely on a central controller using information networks. The EDLC is capable of charging and discharging stored electricity repetitively on short cycles, which keeps the storage resource required to maintain electric power quality to a minimum. This paper presents specifications and verification results obtained from simulation studies and from demonstration experiments with this equipment. In order to verify the practicability of the proposed control, the experiments were carried out in a microgrid supplying electric power to actual loads.

  14. Design Verification and Performance Evaluation of an Enhanced Wideband CDMA Receiver Using Channel Measurements

    Directory of Open Access Journals (Sweden)

    Sultana Belhassen

    2005-01-01

    Full Text Available The spatio-temporal array receiver (STAR) decomposes generic wideband CDMA channel responses across various parameter dimensions (e.g., time delays, multipath components, etc.) and extracts the associated time-varying parameters (i.e., analysis) before reconstructing the channel (i.e., synthesis) with increased accuracy. This work verifies the channel analysis/synthesis design of STAR by illustrating its capability to extract accurately the channel parameters (time delays and drifts, carrier frequency offsets, Doppler spread, etc.) from measured data and to adapt online to their observed time evolution in real-world propagation conditions. We also verify the performance of STAR by comparing the results achieved with generic and measured channels for an average multipath power profile of [ ] dB and a vehicular speed below 30 km/h. The results suggest that losses due to operation with real channels are only 1 dB in SNR and – % in capacity with DBPSK and single transmit and receive antennas. The corresponding SNR threshold for operation with real channels is about 5 dB.

  15. The middle range verification of numerical model performance for heavy rainfall in North China

    Science.gov (United States)

    Zhang, Bo; Zhao, Bin; Niu, Ruoyun

    2017-04-01

    The heavy rainfall forecast in North China is a focus and a difficulty of medium-range numerical weather forecasting. Seventy typical heavy precipitation cases in North China in summer from 2010 to 2016 were selected and classified into vortex type and west-trough/shear-line type according to the atmospheric circulation. Based on the ECMWF model and the Chinese operational model T639, the spatial verification method MODE is used to evaluate the medium-range precipitation forecast ability for heavy rain in summer in North China by contrasting differences in centroid distance, axis angle, and aspect ratio. It is found that both the ECMWF and T639 models show weak predictive ability for the low-vortex-type heavy rainfall in North China across all the similarity measures. When the rainfall area is large, the precipitation patterns of the two models are mostly oriented northeast-southwest, which is consistent with the observations. For large precipitation areas, both models predict an aspect ratio of less than 1, indicating a long and narrow precipitation area, which is also consistent with the observations. However, for both the T639 and ECMWF models there are systematic deviations in the precipitation area: the predicted precipitation area is located on the southwestern side of the observed field, and for smaller/larger areas of precipitation the predicted precipitation area is larger/smaller than observed. In addition, a sensitivity test for the regional heavy precipitation process in North China (such as Huanghuai and other regions) from July 18 to 20, 2016 was carried out, and the results show that neither numerical model predicted the process successfully. Therefore, further research is needed on the correction of systematic biases of numerical models for regional heavy precipitation by medium-range forecasters.
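
    As a rough illustration of the object attributes MODE contrasts, the sketch below computes centroid distance, major-axis angle, and aspect ratio for two binary rain objects from their image moments. The masks, sizes, and displacement are hypothetical, and the moment-based attributes are a simplified stand-in for MODE's own object definitions.

```python
import numpy as np

def object_attributes(mask):
    """Centroid, major-axis angle, and aspect ratio of a binary rain object,
    computed from its image moments (a simplified stand-in for MODE)."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                      # centroid
    mxx = ((xs - cx) ** 2).mean()                      # central second moments
    myy = ((ys - cy) ** 2).mean()
    mxy = ((xs - cx) * (ys - cy)).mean()
    cov = np.array([[mxx, mxy], [mxy, myy]])
    evals, evecs = np.linalg.eigh(cov)                 # eigenvalues ascending
    angle = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))  # major-axis angle
    aspect_ratio = np.sqrt(evals[0] / evals[1])        # < 1: long and narrow
    return (cy, cx), angle, aspect_ratio

# Hypothetical observed/forecast rain masks (1 = rain above threshold)
obs = np.zeros((50, 50)); obs[20:24, 10:40] = 1        # long, narrow band
fcst = np.zeros((50, 50)); fcst[24:28, 8:38] = 1       # displaced to the southwest
(c_o, a_o, r_o), (c_f, a_f, r_f) = object_attributes(obs), object_attributes(fcst)
centroid_distance = np.hypot(c_o[0] - c_f[0], c_o[1] - c_f[1])
print(f"centroid distance = {centroid_distance:.1f} px, "
      f"angles = {a_o:.0f}/{a_f:.0f} deg, aspect ratios = {r_o:.2f}/{r_f:.2f}")
```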

  16. FINAL REPORT – INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    Energy Technology Data Exchange (ETDEWEB)

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  17. Using the SAL technique for spatial verification of cloud processes: A sensitivity analysis

    CERN Document Server

    Weniger, Michael

    2016-01-01

    The feature-based spatial verification method SAL is applied to cloud data, i.e., two-dimensional spatial fields of total cloud cover and spectral radiance. Model output is obtained from the COSMO-DE forward operator SynSat and compared to SEVIRI satellite data. The aim of this study is twofold: first, to assess the applicability of SAL to this kind of data, and second, to analyze the role of external object identification algorithms (OIA) and the effects of observational uncertainties on the resulting scores. As a feature-based method, SAL requires an external OIA. A comparison of three different algorithms shows that the threshold level, which is a fundamental part of all studied algorithms, induces high sensitivity and unstable behavior in object-dependent SAL scores (i.e., even very small changes in parameter values can lead to large changes in the resulting scores). An in-depth statistical analysis reveals significant effects on distributional quantities commonly used in the interpretation of SAL, e.g., the median...
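
    The threshold sensitivity described above can be illustrated with a minimal threshold-based object identification step, the core operation of the OIA that SAL depends on. The smoothed random field and the threshold values below are hypothetical; the point is only that small threshold changes can change the identified objects markedly.

```python
import numpy as np
from scipy import ndimage

def count_objects(field, threshold):
    """Threshold-based object identification: contiguous cells at or above
    the threshold form one object."""
    labeled, n = ndimage.label(field >= threshold)
    return n

rng = np.random.default_rng(0)
# Hypothetical smooth cloud-cover-like field, normalized to [0, 1]
field = ndimage.gaussian_filter(rng.random((200, 200)), sigma=8)
field = (field - field.min()) / (field.max() - field.min())

# Small changes in the threshold can change the object count markedly,
# which drives the instability in object-dependent SAL scores noted above.
for thr in (0.58, 0.60, 0.62):
    print(f"threshold = {thr:.2f} -> {count_objects(field, thr)} objects")
```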

  18. Characteristics of a micro-fin evaporator: Theoretical analysis and experimental verification

    Directory of Open Access Journals (Sweden)

    Zheng Hui-Fan

    2013-01-01

    Full Text Available A theoretical analysis and experimental verification of the characteristics of a micro-fin evaporator using R290 and R717 as refrigerants were carried out. The heat capacity and heat transfer coefficient of the micro-fin evaporator were investigated under different water mass flow rates, refrigerant mass flow rates, and inner tube diameters of the micro-fin evaporator. The simulated heat transfer coefficients are in fairly good agreement with the experimental data. The results show that the heat capacity and the heat transfer coefficient of the micro-fin evaporator increase with increasing logarithmic mean temperature difference, water mass flow rate, and refrigerant mass flow rate. The heat capacity of the micro-fin evaporator with a 9.52 mm diameter is higher than that with a 7.00 mm diameter when using R290 as the refrigerant, and the heat capacity when using R717 as the refrigerant is higher than when using R290. The results of this study can provide useful guidelines for the optimal design and operation of micro-fin evaporators in present or future applications.
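
    A minimal sketch of the heat-capacity relation underlying the reported trend (heat capacity rising with logarithmic mean temperature difference), assuming the standard Q = U·A·LMTD form; all parameter values are illustrative, not the paper's data.

```python
import math

def lmtd(dt_in, dt_out):
    """Logarithmic mean temperature difference between refrigerant and water
    at the two ends of the evaporator."""
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

# Hypothetical operating point (illustrative values only)
U = 1800.0                   # overall heat transfer coefficient, W/(m^2*K)
A = 0.85                     # heat transfer area of the micro-fin tube, m^2
dt_in, dt_out = 12.0, 5.0    # terminal temperature differences, K

Q = U * A * lmtd(dt_in, dt_out)   # heat capacity rises with LMTD, as reported
print(f"LMTD = {lmtd(dt_in, dt_out):.2f} K, Q = {Q / 1000:.2f} kW")
```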

  19. The role of the real-time simulation facility, SIMFAC, in the design, development and performance verification of the Shuttle Remote Manipulator System (SRMS) with man-in-the-loop

    Science.gov (United States)

    Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.

    1980-01-01

    The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flight-like crew station patterned after the orbiter aft flight deck with all necessary man-machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those supporting SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.

  20. Analysis of EDP performance

    Science.gov (United States)

    1994-01-01

    The objective of this contract was the investigation of the potential performance gains that would result from an upgrade of the Space Station Freedom (SSF) Data Management System (DMS) Embedded Data Processor (EDP) '386' design with the Intel Pentium (registered trade-mark of Intel Corp.) '586' microprocessor. The Pentium ('586') is the latest member of the industry standard Intel X86 family of CISC (Complex Instruction Set Computer) microprocessors. This contract was scheduled to run in parallel with an internal IBM Federal Systems Company (FSC) Internal Research and Development (IR&D) task that had the goal to generate a baseline flight design for an upgraded EDP using the Pentium. This final report summarizes the activities performed in support of Contract NAS2-13758. Our plan was to baseline performance analyses and measurements on the latest state-of-the-art commercially available Pentium processor, representative of the proposed space station design, and then phase to an IBM capital funded breadboard version of the flight design (if available from IR&D and Space Station work) for additional evaluation of results. Unfortunately, the phase-over to the flight design breadboard did not take place, since the IBM Data Management System (DMS) for the Space Station Freedom was terminated by NASA before the referenced capital funded EDP breadboard could be completed. The baseline performance analyses and measurements, however, were successfully completed, as planned, on the commercial Pentium hardware. The results of those analyses, evaluations, and measurements are presented in this final report.

  1. UR10 Performance Analysis

    DEFF Research Database (Denmark)

    Ravn, Ole; Andersen, Nils Axel; Andersen, Thomas Timm

    While working with the UR-10 robot arm, it has become apparent that some commands have undesired behaviour when operating the robot arm through a socket connection, sending one command at a time. This report is a collection of the results obtained when testing the performance of the different commands available in URScript to control the robot. It also describes the different time delays discovered when using the UR-10 robot arm...
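
    A minimal sketch of the tested mode of operation, sending one URScript command at a time over a socket connection; the robot IP address and the joint targets below are illustrative assumptions.

```python
import socket

# Driving a UR robot by sending one URScript command at a time over a socket,
# the mode of operation examined in the report.
ROBOT_IP = "192.168.0.10"   # hypothetical robot address
PORT = 30002                # UR secondary client interface accepts URScript

cmd = "movej([0.0, -1.57, 1.57, -1.57, -1.57, 0.0], a=1.4, v=1.05)\n"

with socket.create_connection((ROBOT_IP, PORT), timeout=2.0) as s:
    s.sendall(cmd.encode("ascii"))
    # The interface does not acknowledge individual commands, which is one
    # reason command timing has to be measured externally, as the report does.
```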

  2. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    Business processes are modelled in the Business Process Model and Notation (BPMN). The automated analysis of business processes is done by means of quantitative probabilistic model checking, which allows verification of validation and performance properties through use of an algorithm for the translation of business process models into a format amenable...

  3. Nonlinear response analysis and experimental verification for thin-walled plates to thermal-acoustic loads

    Directory of Open Access Journals (Sweden)

    Yundong SHA

    2017-12-01

    Full Text Available For the large-deflection, strongly nonlinear response problem of thin-walled structures under thermal-acoustic loads, a thermal-acoustic excitation test and a corresponding simulation analysis of a clamped metallic thin-walled plate have been implemented. Comparison of calculated values with experimental values shows their consistency and verifies the effectiveness of the calculation method and model for a thin-walled plate subjected to thermal-acoustic load. The paper then completes the dynamic response calculation for the cross-reinforced plate under different thermal-acoustic load combinations. Based on the obtained time-domain displacement response, the analysis of structural vibration forms focuses on three typical motions of the post-buckled plate, indicating that the relative strength between the thermal load and the acoustic load determines the jump forms of the plate. The probability density functions (PDFs) of the displacement response were drawn and analyzed using statistical methods, clearly showing that the PDF of the post-buckled plate exhibits bimodal phenomena. The power spectral density (PSD) functions were then used to analyze the variation of response frequencies and corresponding peaks with increasing temperature, as well as how softening and hardening areas of the plate are determined. In the last section, this paper discusses the change laws of tensile stress and compressive stress in pre-/post-buckling areas, and gives the reasons for the N-shaped trend of the stress root mean square (RMS). Keywords: Buckling, Experimental verification, Nonlinear response, Power spectral density, Probability density, Snap-through, Thermal-acoustic load, Thin-walled structure
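
    The displacement-response statistics described above (a bimodal PDF for snap-through motion, PSD peaks, and RMS values) can be sketched on simulated data. The signal below is a hypothetical stand-in for a post-buckled plate response, not the paper's measurement.

```python
import numpy as np
from scipy import signal

# Hypothetical displacement response of a post-buckled plate: intermittent
# snap-through between two equilibria plus a small resonant component and noise.
fs = 2048.0                                # sampling rate, Hz
t = np.arange(0.0, 20.0, 1.0 / fs)
rng = np.random.default_rng(1)
snap = np.sign(np.sin(2.0 * np.pi * 0.4 * t + 0.3))    # jumps between +/- 1
x = snap + 0.1 * np.sin(2.0 * np.pi * 180.0 * t) \
        + 0.05 * rng.standard_normal(t.size)

# PDF of the displacement: a dip between two well-populated states (bimodal)
hist, edges = np.histogram(x, bins=80, density=True)
print("buckled-state centers: %.2f / %.2f" % (x[x > 0].mean(), x[x < 0].mean()))
print("PDF: middle-bin density %.3f vs peak %.3f" % (hist[len(hist) // 2], hist.max()))

# PSD via Welch's method: shows the response-frequency peaks
f, pxx = signal.welch(x, fs=fs, nperseg=4096)
print("dominant frequency: %.1f Hz" % f[np.argmax(pxx[1:]) + 1])

# RMS of the displacement response
print("displacement RMS: %.3f" % np.sqrt(np.mean(x ** 2)))
```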

  4. Tutorial on method verification: A routine method for the determination of heroin

    OpenAIRE

    Kar-Weng Chan

    2015-01-01

    Method verification is crucial in ensuring that a routine quantitative method remains fit for analysis. Verification is less comprehensive than validation because fewer aspects are covered. In addition, the aspects to be verified must have a significant impact on the analytical readings. In this paper, a verification process is presented in the form of a tutorial in order to aid narcotics laboratories in performing this task in a more competent manner. Although heroin is used as an example in this tutorial...

  5. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    Science.gov (United States)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  6. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim

    2016-10-01

    Full Text Available The design of the Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed, and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of the test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for computer code V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program, with the integral effect test facility STELLA-2, is in the detailed design stage. The sodium thermal-hydraulic experiment loop for the finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. A flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis of the core, and a dynamic characteristics test of the upper internal structure has been performed for the seismic analysis model of the PGSFR. Performance tests of the control rod assemblies (CRAs) have been conducted for the control rod drive mechanism driving parts, and drop tests of the CRA under scram conditions were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR are explained with significant results.

  7. A Tutorial on Text-Independent Speaker Verification

    Directory of Open Access Journals (Sweden)

    Frédéric Bimbot

    2004-04-01

    Full Text Available This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step for dealing with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications relative to structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
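
    As a minimal sketch of the Gaussian mixture modeling step the tutorial describes, the code below trains a target-speaker GMM and a universal background model (UBM) on stand-in features and scores trials with an average log-likelihood ratio. The feature dimension, data, and component count are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
dim = 12                                           # e.g. 12 cepstral coefficients
ubm_data = rng.standard_normal((4000, dim))        # hypothetical background data
spk_data = rng.standard_normal((400, dim)) + 0.8   # hypothetical target speaker

# UBM and target-speaker model (in practice the speaker model is usually
# adapted from the UBM; fitting from scratch keeps the sketch short)
ubm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(ubm_data)
spk = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(spk_data)

def llr(features):
    """Average per-frame log-likelihood ratio; accept if above a threshold."""
    return spk.score(features) - ubm.score(features)  # score() = mean log-lik

genuine = rng.standard_normal((200, dim)) + 0.8
impostor = rng.standard_normal((200, dim))
print(f"genuine LLR = {llr(genuine):.2f}, impostor LLR = {llr(impostor):.2f}")
```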

  8. Identification and verification of critical performance dimensions. Phase 1 of the systematic process redesign of drug distribution.

    Science.gov (United States)

    Colen, Hadewig B; Neef, Cees; Schuring, Roel W

    2003-06-01

    Worldwide, patient safety has become a major social policy problem for healthcare organisations. As in other organisations, the patients in our hospital suffer from an inadequate distribution process, as becomes clear from incident reports involving medication errors. Medisch Spectrum Twente is a top primary-care, clinical, teaching hospital. The hospital pharmacy takes care of 1070 internal beds and 1120 beds in an affiliated psychiatric hospital and nursing homes. At the beginning of 1999, our pharmacy group started a large interdisciplinary research project to develop a safe, effective and efficient drug distribution system using systematic process redesign. The process redesign includes both organisational and technological components. This article describes the identification and verification of critical performance dimensions for the design of drug distribution processes in hospitals (phase 1 of the systematic process redesign of drug distribution). Based on reported errors and related causes, we suggested six generic performance domains. To assess the role of the performance dimensions, we used three approaches: flowcharts, interviews with stakeholders, and review of the existing performance using time studies and medication error studies. We were able to set targets for costs, quality of information, responsiveness, employee satisfaction, and degree of innovation. We still have to establish which drug distribution system represents the best and most cost-effective way of preventing medication errors. We intend to develop an evaluation model, using the critical performance dimensions as a starting point, which can be used as a simulation template to compare different drug distribution concepts in order to define the differences in quality and cost-effectiveness.

  9. MPQC: Performance Analysis and Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bailey, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-01-24

    MPQC (Massively Parallel Quantum Chemistry) is a widely used computational quantum chemistry code, capable of performing a number of computations commonly occurring in quantum chemistry. In order to achieve better performance of MPQC, this report presents a detailed performance analysis of the code. We then perform loop and memory access optimizations and measure the performance improvements by comparing the performance of the optimized code with that of the original MPQC code. We observe that the optimized MPQC code achieves a significant performance improvement through better utilization of vector processing and the memory hierarchy.

  10. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    This technique makes possible the automated verification of large industrial designs with the use of only modest resources (less than one hour on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATE.

  11. A Verification and Analysis of the USAF/DoD Fatigue Model and Fatigue Management Technology

    Science.gov (United States)

    2005-11-01

    A Windows® software application of the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE) applied model, the Fatigue Avoidance Scheduling Tool (FAST™), was re-engineered as a clone from the SAFTE specification. The verification considered nine sleep/wake schedules that were...

  12. Introduction to the Special Issue on Specification Analysis and Verification of Reactive Systems

    NARCIS (Netherlands)

    Delzanno, Giorgio; Etalle, Sandro; Gabbrielli, Maurizio

    2006-01-01

    This special issue is inspired by the homonymous ICLP workshops that took place during ICLP 2001 and ICLP 2002. Extending and shifting slightly from the scope of their predecessors (on verification and logic languages) held in the context of previous editions of ICLP, the aim of the SAVE workshops

  13. Compositional verification of multi-agent systems: A formal analysis of pro-activeness and reactiveness

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    1998-01-01

    A compositional method is presented for the verification of multi-agent systems. The advantages of the method are the well-structuredness of the proofs and the reusability of parts of these proofs in relation to reuse of components. The method is illustrated for an example multi-agent system,

  14. Compositional verification of multi-agent systems: A formal analysis of pro-activeness and reactiveness.

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    2002-01-01

    A compositional method is presented for the verification of multi-agent systems. The advantages of the method are the well-structuredness of the proofs and the reusability of parts of these proofs in relation to reuse of components. The method is illustrated for an example multi-agent system,

  15. Analytical Performance Verification of FCS-MPC Applied to Power Electronic Converters

    DEFF Research Database (Denmark)

    Novak, Mateja; Dragicevic, Tomislav; Blaabjerg, Frede

    2017-01-01

    Since the introduction of finite control set model predictive control (FCS-MPC) in power electronics, the algorithm has been missing an important aspect that would speed up its adoption in industry: a simple method to verify algorithm performance. This paper proposes a statistical model checking (SMC) method for performance evaluation of the algorithm applied to power electronic converters. SMC is simple to implement, intuitive, and requires only an operational model of the system that can be simulated and checked against properties. The device under test for the control algorithm application in this paper is a standard 2-level voltage source converter (VSC) with an LC output filter, used for uninterruptible power supply (UPS) systems. The performance of the control algorithm is verified using the UPPAAL SMC toolbox, and the behavior is compared to simulation results obtained from equivalent...
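
    A minimal sketch of the FCS-MPC loop whose performance is being verified: enumerate the finite set of switch states, predict one step ahead with a discretized filter model, and pick the state minimizing a tracking cost. The single-phase model and all parameter values are illustrative assumptions, not the paper's converter.

```python
import numpy as np

Ts, L, C = 20e-6, 2.4e-3, 15e-6        # sample time, filter inductance/capacitance
Vdc = 700.0                            # DC-link voltage
states = np.array([-1.0, 1.0])         # admissible switch states of one leg

def predict(i_L, v_C, u, i_load):
    """Forward-Euler one-step prediction of inductor current and cap voltage."""
    v_inv = u * Vdc / 2.0
    i_next = i_L + Ts / L * (v_inv - v_C)
    v_next = v_C + Ts / C * (i_L - i_load)
    return i_next, v_next

def fcs_mpc_step(i_L, v_C, v_ref, i_load):
    """Enumerate the finite control set and minimize a voltage-tracking cost."""
    costs = [abs(v_ref - predict(i_L, v_C, u, i_load)[1]) for u in states]
    return states[int(np.argmin(costs))]

# One control step at a hypothetical operating point
u_opt = fcs_mpc_step(i_L=2.0, v_C=310.0, v_ref=325.0, i_load=1.5)
print("selected switch state:", u_opt)
```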

  16. International Performance Measurement & Verification Protocol: Concepts and Practices for Improved Indoor Environmental Quality, Volume II (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    2002-03-01

    This protocol serves as a framework to determine energy and water savings resulting from the implementation of an energy efficiency program. It is also intended to help monitor the performance of renewable energy systems and to enhance indoor environmental quality in buildings.

  17. Performance verification of an epithermal neutron flux monitor using accelerator-based BNCT neutron sources

    Science.gov (United States)

    Guan, X.; Murata, I.; Wang, T.

    2017-09-01

    The performance of an epithermal neutron flux monitor developed for boron neutron capture therapy (BNCT) is verified by Monte Carlo simulations using accelerator-based neutron sources (ABNSs). The results indicate that the developed epithermal neutron flux monitor works well and can be used efficiently in practical applications to measure the epithermal neutron fluxes of ABNSs with high accuracy.

  18. Presentation and verification of a simple mathematical model for identification of the areas behind a noise barrier with the highest performance

    Directory of Open Access Journals (Sweden)

    M. Monazzam

    2009-07-01

    Full Text Available Background and aims: Traffic noise barriers are the most important measure to control environmental noise pollution. Diffraction from the top edge of a noise barrier is the most important path by which indirect sound waves move toward the receiver; therefore, most studies focus on improving this aspect. Methods: T-shape profile barriers are among the most successful profiles. This investigation uses the theory of the destructive interference between the wave diffracted from the real edge of the barrier and the wave diffracted from the image of the barrier, with a phase difference of π radians. First, a simple mathematical representation of the zones behind rigid and absorbent T-shape barriers with the highest insertion loss is introduced, using the destructive effect of the indirect path via the barrier image; two different reflective and absorptive profile barriers are then used to verify the introduced model. Results: The results are compared with those of a verified two-dimensional boundary element method at 1/3-octave band frequencies over a wide field behind the barriers. Very good agreement between the results has been achieved. In this method, an effective height is used for barriers of any profile. Conclusion: The introduced model is very simple, flexible, and fast, and could be used to choose the best location of rigid and absorptive profile barriers to achieve the highest performance.
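
    A worked statement of the destructive-interference condition such a model relies on, taken from standard acoustics rather than the paper itself (the phase difference is assumed to come only from the path-length difference between the wave diffracted at the real edge, r₁, and the path attributed to the barrier image, r₂):

```latex
% Destructive interference between the wave diffracted at the real edge and
% the wave attributed to the barrier image, for path lengths r_1 and r_2:
\Delta\varphi = k\,(r_2 - r_1) = \frac{2\pi f}{c}\,(r_2 - r_1) = \pi
\quad\Longrightarrow\quad
r_2 - r_1 = \frac{\lambda}{2} = \frac{c}{2f}
```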

  19. Analysis of Photovoltaic System Energy Performance Evaluation Method

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful for measuring a performance guarantee, assessing the health of the system, verifying a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
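
    One common energy-yield metric that such an evaluation method must pin down is the weather-corrected performance ratio; a minimal sketch with illustrative numbers (not from the report) is shown below.

```python
# Performance ratio of a PV system over a reporting period. All values are
# illustrative assumptions.
E_measured = 1.42e6     # measured AC energy over the period, kWh
P_stc = 1000.0          # nameplate DC rating at STC, kW
H_poa = 1650.0          # plane-of-array insolation over the period, kWh/m^2
G_stc = 1.0             # reference irradiance, kW/m^2

# PR = actual yield / yield expected from the nameplate rating and insolation
pr = E_measured / (P_stc * H_poa / G_stc)
print(f"performance ratio = {pr:.3f}")   # ~0.86 for these numbers
```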

  20. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. An adequate amount of data was collected and analyzed to satisfy the research team that the HTFOS operates properly while the engine is running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  1. Grazing Incidence Wavefront Sensing and Verification of X-Ray Optics Performance

    Science.gov (United States)

    Saha, Timo T.; Rohrbach, Scott; Zhang, William W.

    2011-01-01

    Evaluation of interferometrically measured mirror metrology data and characterization of a telescope wavefront can be powerful tools in understanding the image characteristics of an x-ray optical system. In the development of the soft x-ray telescope for the International X-Ray Observatory (IXO), we have developed new approaches to support the telescope development process. Interferometric measurement of the optical components over all relevant spatial frequencies can be used to evaluate and predict the performance of an x-ray telescope. Typically, the mirrors are measured using a mount that minimizes mount- and gravity-induced errors. In the assembly and mounting process the shape of the mirror segments can change dramatically. We have developed wavefront sensing techniques suitable for x-ray optical components to aid us in the characterization and evaluation of these changes. Hartmann sensing of a telescope and its components is a simple method that can be used to evaluate low-order mirror surface errors and alignment errors. Phase retrieval techniques can also be used to assess and estimate the low-order axial errors of the primary and secondary mirror segments. In this paper we describe the mathematical foundation of our Hartmann and phase retrieval sensing techniques. We show how these techniques can be used in the evaluation and performance prediction process of x-ray telescopes.
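
    A sketch of the basic Hartmann relation behind such sensing, from standard wavefront-sensing practice rather than the paper itself: the measured spot displacement maps to the local wavefront slope, written here as the measured slope vector s_i, and the wavefront W is then estimated by a least-squares fit (or a fit to low-order modes).

```latex
% Spot displacement (\delta x, \delta y) at effective distance f gives the
% local wavefront gradient; W is then estimated by a least-squares fit:
\frac{\partial W}{\partial x} \approx \frac{\delta x}{f},\qquad
\frac{\partial W}{\partial y} \approx \frac{\delta y}{f},\qquad
\hat{W} = \arg\min_{W}\sum_{i}\left\|\nabla W(x_i, y_i) - \mathbf{s}_i\right\|^{2}
```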

  2. Verification of an emerging LCA design tool through real life performance monitoring

    Directory of Open Access Journals (Sweden)

    Eon Christine

    2017-01-01

    Full Text Available Recent research has demonstrated that low-emission houses often underperform, consuming more energy than predicted by their designs. Life cycle assessments (LCA) have been employed to complement mandatory energy assessments, as they offer a more comprehensive evaluation of greenhouse gas (GHG) emissions over the building lifespan. This research monitored ten energy-efficient Australian houses, recording data on energy use and photovoltaic generation over one year. The houses were assessed with a relatively new LCA tool in addition to the Australian mandatory house energy assessment, the Nationwide House Energy Rating Scheme (NatHERS). The objective of this study was twofold: first, to evaluate the results of the assessment tools compared to actual house energy requirements, and second, to understand how design, renewable energy, and occupancy can impact the overall GHG emissions of the houses. The results show that energy use is positively related to NatHERS ratings, but some of the high-performance houses perform poorly, and there was significant variation in energy use between houses with the same ratings. The LCA revealed that modern houses have higher embodied energy than older houses, while solar panels are not always used to their full potential. This paper attributes some of the variation between theoretical and actual energy use to construction issues and occupant practices.

  3. Verification of Commercial Motor Performance for WEAVE at the William Herschel Telescope

    Science.gov (United States)

    Gilbert, J.; Dalton, G.; Lewis, I.

    2016-10-01

    WEAVE is a 1000-fiber multi-object spectroscopic facility for the 4.2 m William Herschel Telescope. It will feature a double-headed pick-and-place fiber positioning robot comprising commercially available robotic axes. This paper presents results on the performance of these axes, obtained by testing a prototype system in the laboratory. Positioning accuracy is found to be better than the manufacturer's published values for the tested cases, indicating that the requirement for a maximum positioning error of 8.0 microns is achievable. Field reconfiguration times well within the planned 60 minute observation window are shown to be likely when individual axis movements are combined in an efficient way.

  4. DTU-ESA millimeter-wave validation standard antenna (mm-vast) – performance verification

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Breinbjerg, Olav

    2015-01-01

    A new multi-frequency Validation Standard (VAST) antenna covering the upper microwave (K/Ka) and millimeter wave (Q/V) bands, and thus called mm-VAST, was developed in cooperation between DTU and TICRA under contract from the European Space Agency. In this paper, the mechanical and electrical requirements as well as the design and manufacturing of the mm-VAST antenna are briefly presented. The focus is then given to the details of the conducted mechanical and electrical tests aimed at verifying the performance of the manufactured antenna, and to the obtained measurement results.

  5. Confidence Intervals Verification for Simulated Error Rate Performance of Wireless Communication System

    KAUST Repository

    Smadi, Mahmoud A.

    2012-12-06

    In this paper, we derive an efficient simulation method to evaluate the error rate of a wireless communication system. A coherent binary phase-shift keying system with imperfect channel phase recovery is considered. The results presented demonstrate the system performance under very realistic Nakagami-m fading and additive white Gaussian noise channel conditions. The accuracy of the obtained results is verified by running the simulation with a 95% confidence interval. We see that as the number of simulation runs N increases, the simulated error rate approaches the actual one and the confidence interval narrows. Hence our results are expected to be of significant practical use for such scenarios. © 2012 Springer Science+Business Media New York.
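
    A minimal Monte Carlo sketch of the kind of simulation described: estimating the BER of coherent BPSK over a Nakagami-m fading plus AWGN channel together with a 95% confidence interval. It assumes perfect phase recovery and unit-energy symbols, unlike the paper's imperfect-recovery model, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, omega = 2.0, 1.0           # Nakagami-m fading parameters
snr_db, N = 10.0, 1_000_000   # average SNR and number of simulation runs
snr = 10.0 ** (snr_db / 10.0)

bits = rng.integers(0, 2, N)
s = 2.0 * bits - 1.0                                        # BPSK mapping
h = np.sqrt(rng.gamma(shape=m, scale=omega / m, size=N))    # Nakagami envelope
noise = rng.standard_normal(N) / np.sqrt(2.0 * snr)
r = h * s + noise

errors = (r > 0).astype(int) != bits        # coherent decision (h > 0)
p = errors.mean()                           # simulated error rate
ci = 1.96 * np.sqrt(p * (1.0 - p) / N)      # 95% normal-approximation CI

# As N grows, p approaches the true BER and the interval narrows as 1/sqrt(N).
print(f"BER = {p:.3e} +/- {ci:.1e} (95% CI)")
```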

  6. Aerodynamics and performance verifications of test methods for laboratory fume cupboards.

    Science.gov (United States)

    Tseng, Li-Ching; Huang, Rong Fung; Chen, Chih-Chieh; Chang, Cheng-Ping

    2007-03-01

    The laser-light-sheet-assisted smoke flow visualization technique is applied to a full-size, transparent, commercial-grade chemical fume cupboard to diagnose the flow characteristics and to verify the validity of several current containment test methods. The visualized flow patterns identify the recirculation areas that inevitably exist in conventional fume cupboards because of their fundamental configuration and structure. Large-scale vortex structures exist around the side walls and the doorsill of the cupboard and in the near-wake region of the manikin. The identified recirculation areas are taken as the 'dangerous' regions where the risk of turbulent dispersion of contaminants may be high. Several existing tracer gas containment test methods (BS 7258:1994, prEN 14175-3:2003 and ANSI/ASHRAE 110:1995) are conducted to verify their effectiveness in detecting contaminant leakage. By comparing the results of the flow visualization and the tracer gas tests, it is found that the local recirculation regions are more prone to contaminant leakage because of the complex interaction between the shear layers and the smoke movement through the mechanism of turbulent dispersion. From the point of view of aerodynamics, the present study verifies that the methodology of the prEN 14175-3:2003 protocol can produce more reliable and consistent results, because it is based on region-by-region measurement and encompasses most of the area of the recirculation zone of the cupboard. A modified test method combined with the region-by-region approach in the presence of the manikin shows substantially different containment results. A better performance test method that can describe an operator's exposure and the correlation between flow characteristics and contaminant leakage properties is therefore suggested.

  7. SU-E-T-350: Verification of Gating Performance of a New Elekta Gating Solution: Response Kit and Catalyst System

    Energy Technology Data Exchange (ETDEWEB)

    Xie, X; Cao, D; Housley, D; Mehta, V; Shepard, D [Swedish Cancer Institute, Seattle, WA (United States)

    2014-06-01

    Purpose: In this work, we have tested the performance of new respiratory gating solutions for Elekta linacs. These solutions include the Response gating kit and the C-RAD Catalyst surface mapping system. Verification measurements have been performed for a series of clinical cases. We also examined the beam-on latency of the system and its impact on delivery efficiency. Methods: To verify the benefits of tighter gating windows, a Quasar Respiratory Motion Platform was used. Its vertical-motion plate acted as a respiration surrogate and was tracked by the Catalyst system to generate gating signals. A MatriXX ion-chamber array was mounted on its longitudinal-moving platform. Clinical plans were delivered to the stationary and moving MatriXX array at 100%, 50% and 30% gating windows, and gamma scores were calculated comparing the moving delivery results to the stationary result. It is important to note that as one moves to tighter gating windows, the delivery efficiency will be impacted by the linac's beam-on latency. Using a specialized software package, we generated beam-on signals with lengths of 1000 ms, 600 ms, 450 ms, 400 ms, 350 ms and 300 ms. As the gating windows get tighter, one can expect to reach a point where the dose rate falls to nearly zero, indicating that the gating window is close to the beam-on latency. A clinically useful gating window needs to be significantly longer than the latency of the linac. Results: As expected, the use of tighter gating windows improved delivery accuracy. However, a lower limit on the gating window, largely defined by the linac beam-on latency, exists at around 300 ms. Conclusion: The Response gating kit, combined with the C-RAD Catalyst, provides an effective solution for respiratory-gated treatment delivery. Careful patient selection, gating window design, and even visual/audio coaching may be necessary to ensure both delivery quality and efficiency. This research project is funded by Elekta.
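
    The latency/window trade-off noted in the results can be sketched with a simple linear-loss model, assuming each beam-on window loses a fixed latency before radiation actually turns on; the window lengths mirror the tested values, but the model itself is an illustrative assumption.

```python
# Usable beam-on time per gating window under a fixed beam-on latency.
latency_ms = 300.0   # approximate lower limit found in the study

for window_ms in (1000.0, 600.0, 450.0, 400.0, 350.0, 300.0):
    usable = max(window_ms - latency_ms, 0.0)
    print(f"window {window_ms:6.0f} ms -> usable beam-on {usable:5.0f} ms "
          f"({100.0 * usable / window_ms:4.1f}% of window)")
```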

  8. Modification of the hand-held Vscan ultrasound and verification of its performance for transvaginal applications.

    Science.gov (United States)

    Troyano Luque, J M; Ferrer-Roca, O; Barco-Marcellán, M J; Sabatel López, R; Pérez-Medina, T; Pérez-Lopez, F R

    2013-01-01

    The purpose of this work was to validate a new clinical obstetrics and gynecology (OB-GYN) application for a hand-held ultrasound (US) device. We modified the smallest hand-held device on the market and tested the system for transvaginal (TV) use; this device was originally conceived for abdominal scanning only. The validation involved 80 successive patients examined by the same operator: 25 obstetric and 55 gynecologic cases. US examination was performed transvaginally with two US systems: the hand-held Vscan (General Electric; GE Vingmed Ultrasound, Norway), for which an intravaginal gadget, the TTGP-2010® (Troyano transvaginal gadget probe), was designed, and the Voluson 730 Expert (multifrequency transvaginal ultrasound of 3-9 MHz; GE Healthcare, Milwaukee, WI, USA). We performed the same measurements with both US systems in order to confirm whether or not their diagnostic capability was similar. The quantitative difference in measurements between the systems was assessed, as well as the overall diagnostic detection rate and suitability for telemedicine. Regarding lesion visibility with the Vscan, the optimal distance was 8-16 cm depending on the examination type, and the total detection rate was 98.7%. The exception was an ovarian endometrioma, diagnosed as a follicular cyst using the hand-held device. Assessment of reproducibility in 180 measurements showed that the measurements obtained with the Vscan were 0.3-0.4 cm lower than those obtained with the high-resolution US device (Voluson 730 Expert). Nevertheless, Pearson's correlation coefficient was high for biparietal diameter (0.72) and gynecological (GYN) (0.99) measurements, and for overall correlation (0.997). Image transport on USB and SD-flash cards proved convenient for telemedicine. A novel TV application of a hand-held US device is demonstrated for OB-GYN. Heart, abdominal and obstetrics presets of the Vscan together with color Doppler enable a detection capability comparable to that of a high-definition US device. The...

  9. Laboratory testing and performance verification of the CHARIS integral field spectrograph

    Science.gov (United States)

    Groff, Tyler D.; Chilcote, Jeffrey; Kasdin, N. Jeremy; Galvin, Michael; Loomis, Craig; Carr, Michael A.; Brandt, Timothy; Knapp, Gillian; Limbach, Mary Anne; Guyon, Olivier; Jovanovic, Nemanja; McElwain, Michael W.; Takato, Naruhisa; Hayashi, Masahiko

    2016-08-01

    The Coronagraphic High Angular Resolution Imaging Spectrograph (CHARIS) is an integral field spectrograph (IFS) that has been built for the Subaru telescope. CHARIS has two imaging modes: the high-resolution mode provides R82, R69, and R82 in the J, H, and K bands respectively, while the low-resolution discovery mode uses a second low-resolution prism with R19 spanning 1.15-2.37 microns (J+H+K bands). The discovery mode is meant to augment the low inner working angle of the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) adaptive optics system, which feeds CHARIS a coronagraphic image. The goal is to detect and characterize brown dwarfs and hot Jovian planets down to contrasts five orders of magnitude dimmer than their parent star at an inner working angle as low as 80 milliarcseconds. CHARIS constrains spectral crosstalk through several key aspects of the optical design. Additionally, the repeatability of alignment of certain optical components is critical to the calibrations required for the data pipeline. Specifically, the relative alignment of the lenslet array, prism, and detector must be highly stable and repeatable between imaging modes. We report on the measured repeatability and stability of these mechanisms, measurements of spectral crosstalk in the instrument, and the propagation of these errors through the data pipeline. Another key design feature of CHARIS is the prism, which pairs barium fluoride with Ohara L-BBH2 high-index glass. The dispersion of the prism is significantly more uniform than other glass choices, and the CHARIS prisms represent the first NIR astronomical instrument that uses L-BBH2 as the high-index material. This material choice was key to the utility of the discovery mode, so significant effort was put into cryogenic characterization of the material. The final performance of the prism assemblies in their operating environment is described in detail. The spectrograph is going through final alignment, cryogenic cycling, and is being...

  10. Performance verification of network function virtualization in software defined optical transport networks

    Science.gov (United States)

    Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie

    2017-01-01

    With the continuous opening of resource acquisition and application, a large variety of network hardware appliances are deployed as communication infrastructure. Launching a new network application often implies replacing obsolete devices and providing the space and power to accommodate new ones, which increases the energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many network equipment functions onto industry-standard elements such as servers, switches, and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture for NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON), with the procedures for vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.

  11. arXiv Performance verification of the CMS Phase-1 Upgrade Pixel detector

    CERN Document Server

    Veszpremi, Viktor

    2017-12-04

    The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m² total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m² total area. The tracker is operated in a high-occupancy and high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that has been used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased inst...

  12. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    OpenAIRE

    Kim, Jong-Bum; Jeong, Ji-Young; Lee, Tae-Ho; Kim, Sungkyun; Euh, Dong-Jin; Joo, Hyung-Kook

    2016-01-01

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the co...

  13. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    Energy Technology Data Exchange (ETDEWEB)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  14. Thermal Power Plant Performance Analysis

    CERN Document Server

    2012-01-01

    The analysis of the reliability and availability of power plants is frequently based on simple indexes, used for availability analysis, that do not take into account the criticality of some failures. This criticality should be evaluated based on concepts of reliability which consider the effect of a component failure on the performance of the entire plant. System reliability analysis tools provide a root-cause analysis leading to the improvement of the plant maintenance plan. Considering that power plant performance can be evaluated not only with thermodynamics-related indexes such as heat rate, Thermal Power Plant Performance Analysis focuses on the presentation of reliability-based tools used to define the performance of complex systems and introduces the basic concepts of reliability, maintainability and risk analysis, aiming at their application as tools for power plant performance improvement, including the selection of critical equipment and components and the defini...
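
    A minimal example of the kind of reliability index such an analysis builds on: steady-state availability computed from mean time between failures and mean time to repair. The values are illustrative.

```python
# Steady-state availability of a repairable component.
mtbf_h = 4200.0    # mean time between failures, hours (illustrative)
mttr_h = 36.0      # mean time to repair, hours (illustrative)

availability = mtbf_h / (mtbf_h + mttr_h)
print(f"steady-state availability = {availability:.4f}")   # ~0.9915 here
```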

  15. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    Science.gov (United States)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  16. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    Science.gov (United States)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.

  17. Development of system performance verification test technology for KNGR; optimal design and performance test for DVI of ECC water

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Kune Yull; Yoon, Sang Hyuk; Noh, Sang Woo [Seoul National University, Seoul (Korea); Lee, Il Suk [Kyung Hee University, Seoul (Korea)

    2002-03-01

    In this study, we completed the work on the specific thermal-hydraulic phenomena in the reactor downcomer during operation of the DVI system. The field of study focuses on the general capabilities and flow behavior of the DVI system in the downcomer, and we intended to provide data applicable to the APR1400, covering the full range of phenomena from those caused by the ECC injection to the code analysis of steam flow, ECC bypass, and the decrease of the downcomer water level. 28 refs., 77 figs., 14 tabs. (Author)

  18. Enhancing importance-performance analysis

    DEFF Research Database (Denmark)

    Eskildsen, Jacob Kjær; Kristensen, Kai

    2006-01-01

    Purpose: The interpretation of the importance/performance map is based on an assumption of independence between importance and performance, but many studies question the validity of this assumption. The aim of this research is to develop a new typology for job satisfaction attributes as well as ... in more than one subset. This is a problem with the data generating process that to some extent might influence the analysis. Practical implications: Profound impact on the way the importance/performance map should be interpreted, since non-proportional attributes will move both vertically as well as horizontally in the traditional importance/performance map as performance changes. Originality/value: This paper gives a theoretical explanation for the presence of non-proportional satisfiers and develops a new importance/performance map that takes the presence of non-proportional satisfiers into account.

  19. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities.

  20. Shift Performance Test and Analysis of Multipurpose Vehicle

    Directory of Open Access Journals (Sweden)

    Can Yang

    2014-08-01

    Full Text Available This paper presents an analysis of the gear shifting performance of a multipurpose vehicle transmission in driving conditions using Ricardo's Gear Shift Quality Assessment (GSQA) system. The performance measures of the transmission include the travel and effort of the gear shift lever and the synchronizing time. Mathematical models of the transmission, including the gear shift mechanism and synchronizer, were developed in MATLAB. The model of the gear shift mechanism was developed to analyze the travel map of the gear shift lever, and the model of the synchronizer was developed to obtain the force-time curve of the synchronizer during the slipping time. The synchronizer model was used to investigate the relationship between the performance of the transmission and the variation of parameters during gear shifting. The mathematical models of the gear shift mechanism and the synchronizer provide a rapid design and verification method for transmissions with a ring spring.

  1. Total systems design analysis of high performance structures

    Science.gov (United States)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  2. Total systems design analysis of high performance structures

    Science.gov (United States)

    Verderaime, V.

    1993-11-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer control parameters were noted as shapes, dimensions, probability range factors, and cost. Structural failure concept is presented, and first-order reliability and deterministic methods, benefits, and limitations are discussed. A deterministic reliability technique combining benefits of both is proposed for static structures which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique filed environments.

  3. Comparability of the performance of in-line computer vision for geometrical verification of parts, produced by Additive Manufacturing

    DEFF Research Database (Denmark)

    Pedersen, David B.; Hansen, Hans N.

    2014-01-01

    …-customized parts with narrow geometrical tolerances require individual verification, whereas many hyper-complex parts simply cannot be measured by traditional means such as optical or mechanical measurement tools. This paper addresses the challenge by detailing how in-line computer vision has been employed…

  4. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: METSAT Phase Locked Oscillator Assembly, P/N 1348360-1, S/N's F09 and F10

    Science.gov (United States)

    Pines, D.

    1999-01-01

    This is the Performance Verification Report, METSAT (Meteorological Satellites) Phase Locked Oscillator Assembly, P/N 1348360-1, S/N F09 and F10, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).

  5. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A1, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  6. Performance verification and comparison of TianLong automatic hypersensitive hepatitis B virus DNA quantification system with Roche CAP/CTM system.

    Science.gov (United States)

    Li, Ming; Chen, Lin; Liu, Li-Ming; Li, Yong-Li; Li, Bo-An; Li, Bo; Mao, Yuan-Li; Xia, Li-Fang; Wang, Tong; Liu, Ya-Nan; Li, Zheng; Guo, Tong-Sheng

    2017-10-07

    To investigate and compare the analytical and clinical performance of the TianLong automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and the Roche CAP/CTM system. Two hundred blood samples for HBV DNA testing, HBV-DNA-negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. The analytical performance, such as limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, was determined using the TianLong automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system assay and the CAP/CTM system. The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ±0.4 log10 IU/mL, which showed the high accuracy of the system. Results of the precision, reproducibility and linearity tests showed that the multiple-test coefficient of variation (CV) of the same sample was less than 5% for 10²–10⁶ IU/mL, and for 30–10⁸ IU/mL the linear correlation coefficient was r² = 0.99. The TL system detected HBV DNA genotypes (A–H), and there was no cross-contamination during the "checkerboard" test. When compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistical differences between the two assays in the HBV DNA quantification values were observed (P > 0.05). Correlation analysis indicated a significant correlation between the two assays, r² = 0.9774. The Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% acceptable range, and the maximum difference was −0…
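
    A minimal sketch of the Bland-Altman agreement analysis used above to compare the two assays; the numeric arrays are hypothetical placeholders, not data from the study.

        import numpy as np

        def bland_altman(a, b):
            """Bias and 95% limits of agreement between two assays (log10 IU/mL)."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a - b
            md, sd = diff.mean(), diff.std(ddof=1)
            lo, hi = md - 1.96 * sd, md + 1.96 * sd
            inside = np.mean((diff >= lo) & (diff <= hi)) * 100
            return md, (lo, hi), inside

        tl  = [3.1, 4.8, 5.2, 2.9, 6.0]   # hypothetical TL-system results
        cap = [3.0, 4.9, 5.1, 3.1, 5.8]   # hypothetical CAP/CTM results
        md, (lo, hi), pct = bland_altman(tl, cap)
        print(f"bias {md:+.2f}, 95% limits [{lo:.2f}, {hi:.2f}], {pct:.0f}% within limits")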

  7. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.; hide

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify the system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot …

  8. Design analysis and performance evaluation

    African Journals Online (AJOL)

    The design analysis and performance evaluation of an active solar crop dryer was undertaken by drying mashed cassava. The drying rate and the system drying, collector and pick-up efficiencies were 1.6 kg/day (14%/day), 9%, 46% and 29%, respectively. Comparatively, the drying rate for sun drying was 0.9 kg/day. The collector …

  9. EXPERIMENTAL PERFORMANCE ANALYSIS OF WIRELESS ...

    African Journals Online (AJOL)

    … less reliable predictors of link quality. They recognised that the accuracy of the results of their work was, however, affected by the delay imposed by the measurement window and the period of the … a digital computer and analysing the results. Although …

  10. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
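
    A toy illustration of the kind of predictive performance model alluded to above: fitting a simple closed-form runtime model to observed measurements with least squares. The model form (a/cores + b·traffic + c) and all numbers are invented for illustration, not taken from the report.

        import numpy as np

        cores = np.array([16, 32, 64, 128, 256], float)
        traffic = np.array([1.0, 1.9, 4.2, 8.5, 17.0])   # GB moved, hypothetical
        runtime = np.array([10.2, 6.1, 4.0, 3.1, 2.9])   # seconds, hypothetical

        # Least-squares fit of runtime ~ a/cores + b*traffic + c
        X = np.column_stack([1.0 / cores, traffic, np.ones_like(cores)])
        coef, *_ = np.linalg.lstsq(X, runtime, rcond=None)
        pred = X @ coef
        print("coefficients:", np.round(coef, 3), " max error:", np.abs(pred - runtime).max())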

  11. Achievement of VO2max criteria during a continuous graded exercise test and a verification stage performed by college athletes.

    Science.gov (United States)

    Mier, Constance M; Alexander, Ryan P; Mageean, Amanda L

    2012-10-01

    The purpose of this study was to determine the incidence of meeting specific VO2max criteria and to test the effectiveness of a VO2max verification stage in college athletes. Thirty-five subjects completed a continuous graded exercise test (GXT) to volitional exhaustion. The frequency of achieving various respiratory exchange ratio (RER) and age-predicted maximum heart rate (HRmax) criteria and a VO2 plateau within 2 and 2.2 ml·kg⁻¹·min⁻¹ was determined. The number of subjects achieving a VO2max plateau was 5 (≤2 ml·kg⁻¹·min⁻¹) and 7 (≤2.2 ml·kg⁻¹·min⁻¹); the RER criteria were met by 34 (≥1.05), 32 (≥1.10), and 24 (≥1.15) subjects; the HRmax criteria by 35 (…). VO2max and HRmax did not differ between the GXT and the verification stage (53.6 ± 5.6 vs. 55.5 ± 5.6 ml·kg⁻¹·min⁻¹ and 187 ± 7 vs. 187 ± 6 b·min⁻¹); however, the RER was lower during the verification stage (1.15 ± 0.06 vs. 1.07 ± 0.07, p = 0.004). Six subjects achieved a similar VO2 (within 2.2 ml·kg⁻¹·min⁻¹), whereas 4 achieved a higher VO2 compared with the GXT. These data demonstrate that a continuous GXT limits the college athlete's ability to achieve a VO2max plateau and certain RER and HR criteria. The use of a verification stage increases the frequency of VO2max achievement and may be an effective method to improve the accuracy of VO2max measurements in college athletes.
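
    The criteria checks described in the study can be expressed compactly. The helper below is hypothetical; its thresholds mirror those named in the abstract, but the function itself is only a sketch of the bookkeeping involved.

        def vo2max_criteria_met(vo2_gxt, vo2_verif, rer_peak, hr_peak, age,
                                plateau_tol=2.2, rer_min=1.10, hr_margin=10):
            """Flag which secondary VO2max criteria a test satisfies (illustrative)."""
            return {
                "verification_confirms": abs(vo2_verif - vo2_gxt) <= plateau_tol,  # ml/kg/min
                "rer_criterion": rer_peak >= rer_min,
                "hrmax_criterion": hr_peak >= (220 - age) - hr_margin,  # age-predicted HRmax
            }

        print(vo2max_criteria_met(53.6, 55.5, 1.07, 187, 20))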

  12. Application of FE-analysis in Design and Verification of Bolted Joints according to VDI 2230 at CERN

    CERN Document Server

    Apeland, Jorgen; Dassa, Luca; Welo, Torgeir

    This thesis investigates how finite element analysis (FEA) can be used to simplify and improve the analysis of bolted joints according to the guideline VDI 2230. Some aspects of how FEA can be applied to aid the design and verification of bolted joints are given in the guideline, but not in a streamlined way that makes them simple and efficient to apply. The scope of this thesis is to clarify how FEA and VDI 2230 can be combined in the analysis of bolted joints, and to present a streamlined workflow. The goal is to lower the threshold for carrying out such combined analysis. The resulting benefits are improved analysis validity and quality, and improved analysis efficiency. A case from the engineering department at CERN, where FEA has been used in the analysis of bolted joints, is used as a basis to identify challenges in combining FEA and VDI 2230. This illustrates the need for a streamlined analysis strategy and a well-described workflow. The case in question is the helium vessel (pressure vessel) for the DQW Crab Cavities, which …

  13. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  14. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
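
    A minimal sketch of the point reactor kinetics piece of such a simulation, reduced to a single delayed-neutron group with illustrative constants; Razorback itself couples this to fuel-element heat transfer and coolant equations, which are omitted here.

        import numpy as np
        from scipy.integrate import solve_ivp

        beta, Lam, lam = 0.0073, 4.0e-5, 0.08   # illustrative kinetics constants
        rho = 0.5 * beta                        # step reactivity insertion (half a dollar)

        def pk(t, y):
            """Point kinetics with one delayed-neutron precursor group."""
            n, c = y
            return [(rho - beta) / Lam * n + lam * c,
                    beta / Lam * n - lam * c]

        y0 = [1.0, beta / (Lam * lam)]          # start from equilibrium precursor level
        sol = solve_ivp(pk, (0.0, 1.0), y0, method="LSODA", rtol=1e-8)
        print(f"relative power after 1 s: {sol.y[0, -1]:.2f}")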

  15. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, final technical report, March 2016; period covered June 2012 – September 2015. Abstract: Over the more than three years of the project Verification Games: Crowd-sourced …

  16. Gender verification in competitive sports.

    Science.gov (United States)

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All …

  17. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  18. The 3D MHD code GOEMHD3 for astrophysical plasmas with large Reynolds numbers. Code description, verification, and computational performance

    Science.gov (United States)

    Skála, J.; Baruffa, F.; Büchner, J.; Rampp, M.

    2015-08-01

    Context. The numerical simulation of turbulence and flows in almost ideal astrophysical plasmas with large Reynolds numbers motivates the implementation of magnetohydrodynamical (MHD) computer codes with low resistivity. They need to be computationally efficient and scale well with large numbers of CPU cores, allow obtaining a high grid resolution over large simulation domains, and be easily and modularly extensible, for instance, to new initial and boundary conditions. Aims: Our aims are the implementation, optimization, and verification of a computationally efficient, highly scalable, and easily extensible low-dissipative MHD simulation code for the numerical investigation of the dynamics of astrophysical plasmas with large Reynolds numbers in three dimensions (3D). Methods: The new GOEMHD3 code discretizes the ideal part of the MHD equations using a fast and efficient leap-frog scheme that is second-order accurate in space and time and whose initial and boundary conditions can easily be modified. For the investigation of diffusive and dissipative processes the corresponding terms are discretized by a DuFort-Frankel scheme. To always fulfill the Courant-Friedrichs-Lewy stability criterion, the time step of the code is adapted dynamically. Numerically induced local oscillations are suppressed by explicit, externally controlled diffusion terms. Non-equidistant grids are implemented, which enhance the spatial resolution where needed. GOEMHD3 is parallelized based on the hybrid MPI-OpenMP programming paradigm, adopting a standard two-dimensional domain-decomposition approach. Results: The ideal part of the equation solver is verified by performing numerical tests of the evolution of the well-understood Kelvin-Helmholtz instability and of Orszag-Tang vortices. The accuracy of solving the (resistive) induction equation is tested by simulating the decay of a cylindrical current column. Furthermore, we show that the computational performance of the code scales very …
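
    A one-dimensional sketch of the DuFort-Frankel discretization mentioned above, applied to a plain diffusion equation u_t = D u_xx. The grid, time step, and initial profile are illustrative; the actual code applies the scheme to the diffusive and dissipative terms of the 3D MHD equations.

        import numpy as np

        def dufort_frankel(u_prev, u_curr, D, dt, dx):
            """One DuFort-Frankel step for u_t = D u_xx (unconditionally stable)."""
            a = 2.0 * D * dt / dx**2
            u_next = u_curr.copy()
            u_next[1:-1] = ((1 - a) * u_prev[1:-1]
                            + a * (u_curr[2:] + u_curr[:-2])) / (1 + a)
            return u_next

        x = np.linspace(0.0, 1.0, 101)
        u0 = np.exp(-200.0 * (x - 0.5) ** 2)    # narrow Gaussian pulse
        dt, D = 1e-4, 1.0
        u_prev, u_curr = u0.copy(), u0.copy()   # bootstrap the two-level scheme
        for _ in range(500):
            u_prev, u_curr = u_curr, dufort_frankel(u_prev, u_curr, D, dt, x[1] - x[0])
        print(f"peak decayed to {u_curr.max():.3f}")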

  19. Periodic verification of dynamic wedge beams through analysis of Dynalog files

    Energy Technology Data Exchange (ETDEWEB)

    Camacho, C.; Perez-Alija, J.; Pedro, A.

    2013-07-01

    During the delivery of the field, this information is sampled and collected in files called Dynalog files. The objective of this work is to analyse these files as a complement to the regular quality control of the EDW technique, as well as to provide an independent verification of the generation and control of dynamic wedge fields. (Author)

  20. A verification study and trend analysis of simulated boundary layer wind fields over Europe

    Energy Technology Data Exchange (ETDEWEB)

    Lindenberg, Janna

    2011-07-01

    Simulated wind fields from regional climate models (RCMs) are increasingly used as a surrogate for observations, which are costly and prone to homogeneity deficiencies. Compounding the problem, a lack of reliable observations makes the validation of the simulated wind fields a non-trivial exercise. Whilst the literature shows that RCMs tend to underestimate strong winds over land, these investigations mainly relied on comparisons with near-surface measurements and extrapolated model wind fields. In this study a new approach is proposed using measurements from high towers and a robust validation process. Tower-height wind data are smoother and thus more representative of regional winds. As a benefit, this approach circumvents the need to extrapolate simulated wind fields. The performance of two models using different downscaling techniques is evaluated, and the influence of the boundary conditions on the simulation of wind statistics is investigated. Both models demonstrate a reasonable performance over flat homogeneous terrain and deficiencies over complex terrain, such as the Upper Rhine Valley, due to a too coarse spatial resolution (~50 km). When the spatial resolution is increased to 10 and 20 km, respectively, a benefit is found for the simulation of the wind direction only. A sensitivity analysis shows major deviations between international land cover data sets. A time series analysis of dynamically downscaled simulations is conducted. While the annual cycle and the interannual variability are well simulated, the models are less effective at simulating small-scale fluctuations and the diurnal cycle. The hypothesis that strong winds are underestimated by RCMs is supported by means of a storm analysis: only two-thirds of the observed storms are simulated by the model using a spectral nudging approach, and in addition "False Alarms" are simulated which are not detected in the observations. A trend analysis over the period 1961–2000 is conducted.
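
    The storm-verification result quoted above (two-thirds of observed storms simulated, plus "False Alarms") is naturally summarized with standard categorical scores; the event counts in this sketch are invented for illustration.

        def storm_scores(hits, misses, false_alarms):
            """Categorical verification scores for simulated vs. observed storm events."""
            pod = hits / (hits + misses)                 # probability of detection
            far = false_alarms / (hits + false_alarms)   # false alarm ratio
            csi = hits / (hits + misses + false_alarms)  # critical success index
            return pod, far, csi

        # e.g. two-thirds of 30 observed storms reproduced, plus some spurious ones
        pod, far, csi = storm_scores(hits=20, misses=10, false_alarms=5)
        print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")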

  1. Scalable Performance Measurement and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, Todd [Univ. of North Carolina, Chapel Hill, NC (United States)

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
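
    A minimal sketch of the wavelet-compression idea behind Libra: transform a per-timestep load signal, discard small coefficients, and reconstruct. This uses a single-level Haar transform with an invented signal; the actual toolset uses multi-scale wavelets and couples compression with statistical sampling of processes.

        import numpy as np

        def haar_compress(signal, keep=0.1):
            """One-level Haar transform; keep only the largest coefficients."""
            s = np.asarray(signal, float)
            approx = (s[0::2] + s[1::2]) / np.sqrt(2)
            detail = (s[0::2] - s[1::2]) / np.sqrt(2)
            coeffs = np.concatenate([approx, detail])
            cutoff = np.quantile(np.abs(coeffs), 1 - keep)
            coeffs[np.abs(coeffs) < cutoff] = 0.0        # drop small coefficients
            a, d = np.split(coeffs, 2)
            out = np.empty_like(s)
            out[0::2], out[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
            return out

        load = np.sin(np.linspace(0, 6, 256)) + 0.05 * np.random.randn(256)
        rel_err = np.linalg.norm(load - haar_compress(load)) / np.linalg.norm(load)
        print(f"relative reconstruction error: {rel_err:.3f}")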

  2. Measurement campaigns for selection of optimum on-ground performance verification approach for large deployable reflector antenna

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Nielsen, Jeppe Majlund; Kim, Oleksiy S.

    2012-01-01

    This paper describes the measurement campaigns carried out at P-band (435 MHz) for selection of the optimum on-ground verification approach for a large deployable reflector antenna (LDA). The feed array of the LDA was measured in several configurations with spherical, cylindrical, and planar near-field techniques at near-field facilities in Denmark and in the Netherlands. The measured results for the feed array were then used in calculation of the radiation pattern and gain of the entire LDA. The primary goals for the campaigns were to obtain realistic measurement uncertainty estimates and to investigate …

  3. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
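
    The Richardson extrapolation procedure the code implements can be stated in a few lines. Given solutions on three systematically refined grids with a constant refinement ratio r, the observed order of accuracy and an error estimate follow directly; the numeric values below are illustrative only.

        import math

        def observed_order(f_coarse, f_medium, f_fine, r):
            """Observed order of accuracy from three systematically refined grids."""
            return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        def richardson_estimate(f_medium, f_fine, p, r):
            """Extrapolated 'exact' value and discretization-error estimate."""
            exact = f_fine + (f_fine - f_medium) / (r**p - 1)
            return exact, f_fine - exact

        p = observed_order(0.9714, 0.9921, 0.9977, r=2.0)
        exact, err = richardson_estimate(0.9921, 0.9977, p, r=2.0)
        print(f"observed order {p:.2f}, extrapolated value {exact:.4f}, error {err:.1e}")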

  4. Statistical analysis and verification of 3-hourly geomagnetic activity probability predictions

    Science.gov (United States)

    Wang, Jingjing; Zhong, Qiuzhen; Liu, Siqing; Miao, Juan; Liu, Fanghua; Li, Zhitao; Tang, Weiwei

    2015-12-01

    The Space Environment Prediction Center (SEPC) has classified geomagnetic activity into four levels, from quiet to unsettled up to storm (Kp … 6). The 3-hourly Kp index prediction product provided by the SEPC is updated half-hourly. In this study, statistical conditional forecast models for the 3-hourly geomagnetic activity level were developed based on 10 years of data and applied to more than 3 years of data, using the previous Kp index, the interplanetary magnetic field, and solar wind parameters measured by the Advanced Composition Explorer as conditional parameters. The quality of the forecast models was measured and compared against verifications of accuracy, reliability, discrimination capability, and skill in predicting all geomagnetic activity levels, especially the probability of reaching the storm level given a previous "calm" (non-storm level) or "storm" (storm level) condition. It was found that the conditional models that used the previous Kp index, the peak value of BtV (the product of the total interplanetary magnetic field and speed), the average value of Bz (the southward component of the interplanetary magnetic field), and BzV (the product of the southward component of the interplanetary magnetic field and speed) over the last 6 h as conditional parameters provide a relative operating characteristic area of 0.64 and can serve as an appropriate predictor for the probability forecast of the geomagnetic activity level.
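
    The relative operating characteristic (ROC) area reported above can be computed directly from forecast probabilities and observed outcomes. The sketch below uses the Mann-Whitney formulation; the forecasts and outcomes are synthetic stand-ins, not the study's data.

        import numpy as np

        def roc_area(prob, event):
            """ROC area: probability that a randomly chosen event received a higher
            forecast probability than a randomly chosen non-event (ties count half)."""
            prob, event = np.asarray(prob, float), np.asarray(event, bool)
            pos, neg = prob[event], prob[~event]
            greater = (pos[:, None] > neg[None, :]).sum()
            ties = (pos[:, None] == neg[None, :]).sum()
            return (greater + 0.5 * ties) / (len(pos) * len(neg))

        rng = np.random.default_rng(1)
        p = rng.random(1000)                 # hypothetical storm probabilities
        obs = rng.random(1000) < p * 0.8     # synthetic outcomes loosely tied to p
        print(f"ROC area: {roc_area(p, obs):.2f}")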

  5. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure shall take place is shown in Figure 1, but the sequence can be in any order.

  6. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking techniques are discussed and described, the pros and cons of each method are weighed, and a classification of the test techniques for each method is considered. The paper presents and analyzes the characteristics and mechanisms of static dependency analysis, including the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, along with the kinds of tools that can be applied to software when using dynamic analysis. Based on this work a conclusion is drawn, which describes the most relevant problems of the analysis techniques and methods for their solution …
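
    As a concrete taste of the symbolic-execution idea discussed above, a path condition can be handed to an SMT solver to decide whether a program path is feasible. The sketch assumes the z3-solver Python package is installed; the branch conditions themselves are invented for illustration.

        # Requires the z3-solver package (pip install z3-solver)
        from z3 import Int, Solver, sat

        # Path condition for the nested branch `if x > 10: ... if x < 5: <dead code>`
        x = Int("x")
        s = Solver()
        s.add(x > 10, x < 5)

        # An unsatisfiable path condition means the inner branch is dead code.
        print("path feasible" if s.check() == sat else "dead code detected")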

  7. Validation of a χ² model of HRR target RCS variability and verification of the resulting ATR performance model

    Science.gov (United States)

    Holt, Craig R.; Attili, Joseph B.; Schmidt, Steven L.

    2001-10-01

    A χ² model for radar cross section (RCS) variability of High Range Resolution (HRR) measurements is validated using compact range data from the U.S. Army National Ground Intelligence Center (NGIC). It is shown that targets can be represented by a mean template and by a variance template, or in this case, an effective number of degrees of freedom for the χ²-distribution. The analysis also includes comparison of the measured tails of the RCS distribution to those predicted by the χ²-distribution. The likelihood classifier is obtained, and a Monte Carlo performance model is developed to validate the statistical model at the level of ATR performance.
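
    A minimal sketch of a likelihood classifier under such a χ² RCS model: each range bin of an HRR profile is modeled as (mean/ν)·χ²_ν, and a profile is assigned to the class whose mean template and degrees of freedom best explain it. The class names, template values, and 64-bin profile length are all invented for illustration.

        import numpy as np
        from scipy.stats import chi2

        def loglik(x, mean_tmpl, dof):
            """Log-likelihood of an HRR profile: each bin ~ (mean/dof) * chi2(dof)."""
            scale = mean_tmpl / dof
            return chi2.logpdf(x / scale, dof).sum() - np.log(scale).sum()

        def classify(x, templates):
            """Pick the class whose (mean template, dof) best explains profile x."""
            return max(templates, key=lambda k: loglik(x, *templates[k]))

        rng = np.random.default_rng(0)
        templates = {"T72": (np.full(64, 2.0), 4.0), "BMP": (np.full(64, 1.0), 8.0)}
        m, nu = templates["T72"]
        sample = m / nu * rng.chisquare(nu, size=64)   # Monte Carlo draw from "T72"
        print(classify(sample, templates))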

  8. Spatial Verification Using Wavelet Transforms: A Review

    CERN Document Server

    Weniger, Michael; Friederichs, Petra

    2016-01-01

    Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet-based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we revise the stat…

  9. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  10. State of the Art: Signature Biometrics Verification

    Directory of Open Access Journals (Sweden)

    Nourddine Guersi

    2010-04-01

    This paper presents a comparative analysis of the performance of three estimation algorithms: Expectation Maximization (EM), the Greedy EM Algorithm (GEM) and the Figueiredo-Jain Algorithm (FJ), based on Gaussian mixture models (GMMs), for signature biometrics verification. The simulation results have shown significant performance achievements. The test performance of EER = 5.49% for EM, EER = 5.04% for GEM and EER = 5.00% for FJ shows that the behavioral information scheme of signature biometrics is robust and has a discriminating power which can be explored for identity authentication.
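
    A rough sketch of the GMM-based verification pipeline described above: fit a mixture to genuine-signature features with plain EM (here via scikit-learn's GaussianMixture), score genuine and impostor samples, and locate the equal error rate (EER). All feature vectors are synthetic stand-ins, not signature data.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        genuine = rng.normal(0.0, 1.0, (200, 5))    # stand-in signature features
        impostor = rng.normal(1.5, 1.0, (200, 5))

        gmm = GaussianMixture(n_components=3, random_state=0).fit(genuine)  # plain EM
        g_scores, i_scores = gmm.score_samples(genuine), gmm.score_samples(impostor)

        # Equal error rate: threshold where false accepts match false rejects
        thresholds = np.sort(np.concatenate([g_scores, i_scores]))
        far = np.array([(i_scores >= t).mean() for t in thresholds])
        frr = np.array([(g_scores < t).mean() for t in thresholds])
        eer_idx = int(np.argmin(np.abs(far - frr)))
        print(f"EER ~ {(far[eer_idx] + frr[eer_idx]) / 2:.3f}")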

  11. The ASTRI SST-2M prototype for the next generation of Cherenkov telescopes: a single framework approach from requirement analysis to integration and verification strategy definition

    Science.gov (United States)

    Fiorini, Mauro; La Palombara, Nicola; Stringhetti, Luca; Canestrari, Rodolfo; Catalano, Osvaldo; Giro, Enrico; Leto, Giuseppe; Maccarone, Maria Concetta; Pareschi, Giovanni; Tosti, Gino; Vercellone, Stefano

    2014-08-01

    ASTRI is a flagship project of the Italian Ministry of Education, University and Research, which aims to develop an end-to-end prototype of one of the three types of telescopes to be part of the Cherenkov Telescope Array (CTA), an observatory which will be the main representative of the next generation of Imaging Atmospheric Cherenkov Telescopes. The ASTRI project, led by the Italian National Institute of Astrophysics (INAF), has proposed an original design for the Small Size Telescope, which is aimed at exploring the uppermost end of the Very High Energy domain, up to about a few hundred TeV, with unprecedented sensitivity, angular resolution and imaging quality. It is characterized by challenging and innovative technological solutions which will be adopted for the first time in a Cherenkov telescope: a dual-mirror Schwarzschild-Couder configuration; a modular, light and compact camera based on silicon photomultipliers; and front-end electronics based on a specifically designed ASIC. The end-to-end project also includes all the data-analysis software and the data archive. In this paper we describe the process followed to derive the ASTRI specifications from the CTA general requirements, a process which had to take into proper account the impact on the telescope design of the different types of CTA requirements (performance, environment, reliability-availability-maintenance, etc.). We also describe the strategy adopted to perform the specification verification, which will be based on different methods (inspection, analysis, certification, and test) in order to demonstrate the telescope's compliance with the CTA requirements. Finally, we describe the integration planning of the prototype assemblies (structure, mirrors, camera, control software, auxiliary items) and the test planning of the end-to-end telescope. The approach followed by the ASTRI project is to have all the information needed to report the verification process along all project stages in a single …

  12. Thermal Analysis of MIRIS Space Observation Camera for Verification of Passive Cooling

    Directory of Open Access Journals (Sweden)

    Duk-Hang Lee

    2012-09-01

    We conducted thermal analyses and cooling tests of the space observation camera (SOC) of the multi-purpose infrared imaging system (MIRIS) to verify its passive cooling. The thermal analyses were conducted with NX 7.0 TMG for two attitude cases of the MIRIS: the worst hot case and the normal case. Through the thermal analyses of the flight model, it was found that even in the worst case the telescope could be cooled to less than 206 K. This is similar to the results of the passive cooling test (~200.2 K). For the normal attitude case of the analysis, on the other hand, the SOC telescope was cooled to about 160 K in 10 days. Based on the results of these analyses and the test, it was determined that the telescope of the MIRIS SOC could be successfully cooled to below 200 K with passive cooling. The SOC is, therefore, expected to have optimal performance under cooled conditions in orbit.

  13. Finite element analysis of vibration-driven electro-active paper energy harvester with experimental verification

    Directory of Open Access Journals (Sweden)

    Zafar Abas

    2015-02-01

    In this research work, a coupled-field finite element model of an electro-active paper energy harvester is presented, and the results are verified experimentally. Electro-active paper is a smart form of cellulose coated with electrodes on both sides. A finite element model was developed, and harmonic and transient analyses were performed using a commercial finite element analysis package. Two aluminum cantilever benders, 80 mm × 50 mm and 100 mm × 50 mm, bonded with electro-active paper were tested to validate the finite element model results. The displacement and the voltage generated by the energy harvester at the electrode surfaces were measured. The electro-active paper energy harvesters were excited at their fundamental resonance frequencies by a sinusoidal force located 18 mm from the free end. The voltages obtained from the 80 mm × 50 mm and 100 mm × 50 mm electro-active paper energy harvester finite element models were 3.7 and 7 mV, respectively. The experimental results have shown good agreement with the finite element model. The direct piezoelectric effect of electro-active paper shows potential for a cellulose-based eco-friendly energy harvester.

  14. Energy Star Lighting Verification Program (Program for the Evaluation and Analysis of Residential Lighting)

    Energy Technology Data Exchange (ETDEWEB)

    Conan O'Rourke; Yutao Zhou

    2006-03-01

    The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain Energy Star lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to the consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding Energy Star specifications. This report includes the experimental procedure and data results of Cycle Three of PEARL program during the period of October 2002 to April 2003, along with the description of apparatus used, equipment calibration process, experimental methodology, and research findings from the testing. The products tested are 20 models of screw-based compact fluorescent lamps (CFL) of various types and various wattages made or marketed by 12 different manufacturers, and ten models of residential lighting fixtures from eight different manufacturers.

  15. Energy Star Lighting Verification Program (Program for the Evaluation and Analysis of Residential Lighting)

    Energy Technology Data Exchange (ETDEWEB)

    Conan O'Rourke; Yutao Zhou

    2006-03-01

    The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain Energy Star lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to the consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding Energy Star specifications. This report includes the experimental procedure and data results of Cycle Four and Cycle Five of PEARL program during the period of October 2003 to April 2004, along with the description of apparatus used, equipment calibration process, experimental methodology, and research findings from the testing. The parameter tested for Cycle Four is lumen maintenance at 40% rated life, and parameters tested for Cycle Five are all parameters required in Energy Star specifications except lumen maintenance at 40% rated life.

  16. Energy Star Lighting Verification Program (Program for the Evaluation and Analysis of Residential Lighting)

    Energy Technology Data Exchange (ETDEWEB)

    Conan O'Rourke; Yutao Zhou

    2006-03-01

    The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain Energy Star lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to the consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding Energy Star specifications. This report includes the experimental procedure and data results of Cycle Three and Cycle Four of PEARL program during the period of April 2003 to October 2003, along with the description of apparatus used, equipment calibration process, experimental methodology, and research findings from the testing. The parameter tested for Cycle three is lumen maintenance at 40% rated life, and parameters tested for Cycle Four are all parameters required in Energy Star specifications except lumen maintenance at 40% rated life.

  17. Multi-Field Analysis and Experimental Verification on Piezoelectric Valve-Less Pumps Actuated by Centrifugal Force

    Science.gov (United States)

    Ma, Yu-Ting; Pei, Zhi-Guo; Chen, Zhong-Xiang

    2017-07-01

    A piezoelectric centrifugal pump was developed previously to overcome the low frequency responses of piezoelectric pumps with check valves and the liquid reflux of conventional valveless piezoelectric pumps. However, the electro-mechanical-fluidic analysis of this pump had not been done. Therefore, multi-field analysis and experimental verification of piezoelectrically actuated centrifugal valveless pumps are conducted for liquid transport applications. The valveless pump consists of two piezoelectric sheets and a metal tube, with the piezoelectric elements pushing the metal tube to swing at the first bending resonant frequency. The centrifugal force generated by the swinging motion forces the liquid out of the metal tube. The governing equations for the solid and fluid domains are established, and the coupling relations of the mechanical, electrical and fluid fields are described. The bending resonant frequency and bending mode in the solid domain are discussed, and the liquid flow rate, velocity profile, and gauge pressure are investigated in the fluid domain. The working frequency and flow rate for different component sizes are analyzed and verified through experiments to guide the pump design. A fabricated prototype with an outer diameter of 2.2 mm and a length of 80 mm produced a maximum flow rate of 13.8 mL/min at a backpressure of 0.8 kPa with a driving voltage of 80 Vpp. By solving the electro-mechanical-fluidic coupling problem, the developed model can provide theoretical guidance for the optimization of centrifugal valveless pump characteristics.

  18. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007)

    Science.gov (United States)

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  19. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and it is found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  20. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the …

  1. Formal verification of human-automation interaction

    Science.gov (United States)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
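
    One core check in this methodology can be miniaturized: if two machine states that the interface displays identically respond to the same user action with outcomes the operator can distinguish, the interface is ambiguous. The toy machine, event names, and display labels below are invented for illustration and only loosely echo the paper's formal composition of machine and interface models.

        # Machine model: state -> {event: next_state}; display abstraction: state -> indication
        machine = {
            "A1": {"toggle": "A2"}, "A2": {"toggle": "B1"}, "B1": {"toggle": "A1"},
        }
        display = {"A1": "MODE_A", "A2": "MODE_A", "B1": "MODE_B"}

        def interface_adequate(machine, display):
            """Inadequate if two states that look identical to the operator respond
            to the same event with differently-displayed outcomes."""
            seen = {}
            for state, trans in machine.items():
                for event, nxt in trans.items():
                    key = (display[state], event)
                    if seen.setdefault(key, display[nxt]) != display[nxt]:
                        return False, key
            return True, None

        print(interface_adequate(machine, display))  # -> (False, ('MODE_A', 'toggle'))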

  2. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper, a prototype of a Requirements Tracking and Verification System (RTVS) for a Distributed Control System (DCS) was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking, and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  3. Automatic Verification of Autonomous Robot Missions

    Science.gov (United States)

    2014-01-01

    …for a mission related to the search for a biohazard. Keywords: mobile robots, formal verification, performance guarantees, automatic translation. … Formal verification of systems is critical when failure creates a high cost, such as in life-or-death scenarios. A variety of … Process algebras are specification languages that allow for formal verification of concurrent systems (Process Algebra for Robot …).

  4. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is a rather difficult process. Attempts to identify assumptions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predisposition in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allow us to explain the mutual relationships among observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0). Running predispositions were a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0…

  5. Turbulence Modeling Verification and Validation

    Science.gov (United States)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  6. Diagnosing SWOT through Importance-performance Analysis

    OpenAIRE

    Alagirisamy Kamatchi Subbiah Sukumaran; Shanmugasundaram Chandrasekaran Sivasundaram

    2015-01-01

    The aim of the study is to evaluate the performance of the firms surveyed against the importance of the opportunities, threats, strengths, and weaknesses applicable to those firms using Importance-Performance analysis. Firms optimize their Strengths, Weaknesses, Opportunities and Threats with the help of SWOT analysis. Martilla and James (1977) popularized Importance-Performance analysis through their study of the same name. Importance-Performance analysis can be used to evaluate ...
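
    The quadrant logic behind Importance-Performance analysis is compact enough to sketch directly. The following Python fragment is illustrative only (the attribute scores are hypothetical, not the study's survey data); it classifies each item by comparing its importance and performance ratings against the grand means, yielding the four classic Martilla-James quadrants:

    ```python
    import numpy as np

    def ipa_quadrants(importance, performance):
        """Classify items into the four Importance-Performance quadrants
        using the grand means as crosshairs (Martilla & James, 1977)."""
        imp = np.asarray(importance, dtype=float)
        perf = np.asarray(performance, dtype=float)
        imp_mean, perf_mean = imp.mean(), perf.mean()
        labels = []
        for i, p in zip(imp, perf):
            if i >= imp_mean and p < perf_mean:
                labels.append("Concentrate here")       # high importance, low performance
            elif i >= imp_mean and p >= perf_mean:
                labels.append("Keep up the good work")  # high importance, high performance
            elif i < imp_mean and p < perf_mean:
                labels.append("Low priority")           # low importance, low performance
            else:
                labels.append("Possible overkill")      # low importance, high performance
        return labels

    # Hypothetical survey means for four SWOT attributes
    print(ipa_quadrants([4.5, 4.2, 2.1, 2.8], [2.9, 4.4, 2.2, 4.6]))
    ```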

  7. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

    Full Text Available The main task of the third Chinese lunar exploration project is to obtain soil samples greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for the core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established, and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
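
    The abstract does not give the network equations, but a thermal network model of this kind typically reduces to a nonlinear energy balance per node (conduction between nodes plus internal dissipation and radiation to the environment), which a quasi-Newton root finder can solve. A minimal sketch, with entirely hypothetical conductances, loads, and radiative couplings:

    ```python
    import numpy as np
    from scipy.optimize import root

    # Hypothetical 3-node lumped thermal network: conductances G[i][j] (W/K),
    # internal heat loads q (W), and one node radiating to a cold environment.
    G = np.array([[0.0, 0.8, 0.2],
                  [0.8, 0.0, 0.5],
                  [0.2, 0.5, 0.0]])
    q = np.array([5.0, 2.0, 0.0])             # dissipated power per node
    sigma_eps_A = np.array([0.0, 0.0, 2e-9])  # radiative coupling of node 2 (W/K^4)
    T_env = 200.0                             # environment temperature, K

    def residual(T):
        # Steady-state balance at each node i:
        # sum_j G_ij (T_j - T_i) + q_i - sigma*eps*A*(T_i^4 - T_env^4) = 0
        conduction = (G * (T[None, :] - T[:, None])).sum(axis=1)
        radiation = sigma_eps_A * (T**4 - T_env**4)
        return conduction + q - radiation

    # Broyden's method is a quasi-Newton scheme, in the spirit of the paper's solver
    sol = root(residual, x0=np.full(3, 300.0), method='broyden1')
    print(sol.x)  # node temperatures in K
    ```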

  8. Complete Functional Verification

    OpenAIRE

    Bormann, Joerg (Dr.)

    2017-01-01

    The dissertation describes a practically proven, particularly efficient approach for the verification of digital circuit designs. The approach outperforms simulation-based verification with respect to final circuit quality as well as required verification effort. In the dissertation, the paradigm of transaction-based verification is ported from simulation to formal verification. One consequence is a particular format of formal properties, called operation properties. Circuit descriptions are verifi...

  9. Verification of finite element analysis of fixed partial denture with in vitro electronic strain measurement.

    Science.gov (United States)

    Wang, Gaoqi; Zhang, Song; Bian, Cuirong; Kong, Hui

    2016-01-01

    The purpose of the study was to verify the finite element analysis model of a three-unit fixed partial denture with in vitro electronic strain analysis and to analyze the clinical situation with the verified model. First, strain gauges were attached to the critical areas of a three-unit fixed partial denture. Strain values were measured under a 300 N load perpendicular to the occlusal plane. Secondly, a three-dimensional finite element model in accordance with the electronic strain analysis experiment was constructed from the scanning data. The strain values obtained by finite element analysis and in vitro measurements were then compared. Finally, the clinical destruction of the fixed partial denture was evaluated with the verified finite element analysis model. There was mutual agreement and consistency between the finite element analysis results and the experimental data. The finite element analysis revealed that failure will occur in the veneer layer on the buccal surface of the connector under an occlusal force of 570 N. The results indicate that electronic strain analysis is an appropriate and cost-saving method to verify the finite element model. The veneer layer on the buccal surface of the connector is the weakest area in the fixed partial denture. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  10. Performance Analysis in Elite Sports

    NARCIS (Netherlands)

    Talsma, Bertus Gatze

    2013-01-01

    The central theme of this dissertation concerns the development of techniques for analyzing and comparing performances of elite sportsmen. When performances are delivered under varying circumstances, or are influenced by other factors than the athletes' abilities, a fair comparison, for instance

  11. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
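
    For readers unfamiliar with the property languages mentioned, a PLC safety requirement formalised in CTL typically has a shape such as (the signal names here are purely illustrative, not from the CERN program):

    $$ \mathbf{AG}\,\big( \mathit{EmergencyStop} \rightarrow \mathbf{AF}\; \mathit{MotorOff} \big) $$

    That is, on every execution path it always holds that once the emergency stop is raised, the motor is eventually switched off. Reductions such as the Cone of Influence prune the state space down to the variables such a formula actually depends on, which is what makes the orders-of-magnitude speedups possible.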

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT, PERFORMANCE OF INDUCTION MIXERS FOR DISINFECTION OF WET WEATHER FLOWS, US FILTER/STRANCO PRODUCTS WATER CHAMP R F SERIES CHEMICAL INDUCTION SYSTEM

    Science.gov (United States)

    The Wet-Weather Flow Technologies Pilot of the EPA's Environmental Technology Verification (ETV) Program, under a partnership with NSF International, has verified the performance of the USFilter/Stranco Products chemical induction mixer used for disinfection of wet-weather flows. The USFilter t...

  13. Modeling of thinning process of structures in temperature analysis and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Tsukimori, K.; Furuhashi, I. [Japan Nuclear Cycle Development Institute, JNC, Ibaraki-ken (Japan)

    2001-07-01

    It is important to consider the thinning process in analyzing the behavior of structures, including the change of their strength, when thinning of structures is significant due to corrosion, melting, etc. The thinning process in a stress or strain analysis can be expressed, for example, by using artificial creep and a reduction of the elastic modulus. If the thinning process goes with temperature change, a temperature analysis is also needed. If the structures are relatively thin, like thin plates or thin shells, the effect of the thinning process may be neglected in the temperature analysis. However, in the case of thick structures, or structures in which the temperature gradient through the thickness is expected to be large due to thermal boundary conditions, the thinning process should be considered in the temperature analyses as well as in the stress or strain analyses. In this study, a model of the thinning process in temperature analysis has been developed. The detailed formulation is described and the function of this model is verified with a simple one-dimensional problem. As an applied example, a thinning heat tube problem is analyzed. (authors)

  14. Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure

    Science.gov (United States)

    2016-05-09

    other drugs of abuse [41]. The analysis of hair has been extensively studied for popular drugs of abuse, such as cocaine, amphetamines, codeine, morphine, and marijuana [42-48]. Researchers have also used hair samples to identify victims of sexual assault [42,49-53]. In addition, hair samples have also been ... curves. Recovery of IMPA and PMPA was determined at low, medium, and high QC concentrations by comparing analyte signals produced from spiked hair and

  15. Verification of a neutronic code for transient analysis in reactors with Hex-z geometry

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Pintor, S.; Verdu, G. [Departamento de Ingenieria Quimica Y Nuclear, Universitat Politecnica de Valencia, Cami de Vera, 14, 46022. Valencia (Spain); Ginestar, D. [Departamento de Matematica Aplicada, Universitat Politecnica de Valencia, Cami de Vera, 14, 46022. Valencia (Spain)

    2012-07-01

    Due to the geometry of the fuel bundles, simulating reactors such as VVER reactors requires methods that can deal with hexagonal prisms as the basic elements of the spatial discretization. The main features of a code based on a high-order finite element method for the spatial discretization of the neutron diffusion equation and an implicit difference method for the time discretization of this equation are presented, and the performance of the code is tested by solving the first exercise of the AER transient benchmark. The obtained results are compared with the reference results of the benchmark and with the results provided by the PARCS code. (authors)
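
    For reference, the equation being discretized is the time-dependent neutron diffusion equation, which in one-group form (delayed neutron precursors omitted for brevity) reads

    $$ \frac{1}{v} \frac{\partial \phi}{\partial t} \;=\; \nabla \cdot \big( D \nabla \phi \big) \;-\; \Sigma_a \phi \;+\; \nu \Sigma_f \phi $$

    where \(\phi\) is the scalar flux, \(D\) the diffusion coefficient, \(\Sigma_a\) and \(\Sigma_f\) the absorption and fission cross sections, and \(v\) the neutron speed. The cited code applies a high-order finite element discretization of the spatial operator on hexagonal prisms and an implicit scheme in time.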

  16. Fault Tree Analysis for Safety/Security Verification in Aviation Software

    Directory of Open Access Journals (Sweden)

    Andrew J. Kornecki

    2013-01-01

    Full Text Available The Next Generation Air Traffic Management system (NextGen) is a blueprint of the future National Airspace System. Supporting NextGen is a nation-wide Aviation Simulation Network (ASN), which allows integration of a variety of real-time simulations to facilitate development and validation of the NextGen software by simulating a wide range of operational scenarios. The ASN system is an environment including both simulated and human-in-the-loop real-life components (pilots and air traffic controllers). Real Time Distributed Simulation (RTDS), developed at Embry-Riddle Aeronautical University, a suite of applications providing low- and medium-fidelity en-route simulation capabilities, is one of the simulations contributing to the ASN. To support the interconnectivity with the ASN, we designed and implemented a dedicated gateway acting as an intermediary, providing logic for two-way communication and message transfer between RTDS and ASN and storage for the exchanged data. It has been necessary to develop and analyze safety/security requirements for the gateway software based on analysis of system assets, hazards, threats, and attacks related to the ultimate real-life future implementation. Due to the nature of the system, the focus was placed on communication security and the related safety of the impacted aircraft in the simulation scenario. To support development of the safety/security requirements, the well-established fault tree analysis technique was used. This fault tree model-based analysis, supported by a commercial tool, was a foundation for proposing mitigations assuring the gateway system's safety and security.
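
    The arithmetic behind a fault tree of this kind is compact: with independent basic events, an AND gate multiplies probabilities and an OR gate combines their complements. A minimal Python sketch (the event names and probabilities are hypothetical, not taken from the gateway analysis):

    ```python
    from functools import reduce

    # Hypothetical basic-event probabilities for a gateway fault tree
    p = {
        "auth_bypass": 1e-4,     # attacker defeats gateway authentication
        "msg_corruption": 5e-4,  # message corrupted in two-way transfer
        "crc_miss": 1e-3,        # integrity check fails to detect corruption
    }

    def p_and(*probs):
        """AND gate: all independent basic events occur."""
        return reduce(lambda a, b: a * b, probs)

    def p_or(*probs):
        """OR gate: at least one independent basic event occurs."""
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

    # Top event: unsafe data reaches the simulation, via either an
    # authentication bypass OR an undetected message corruption.
    undetected_corruption = p_and(p["msg_corruption"], p["crc_miss"])
    top = p_or(p["auth_bypass"], undetected_corruption)
    print(f"P(top event) = {top:.3e}")
    ```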

  17. Finite element analysis and experimental verification of Polymer reinforced CRC improved for close-in detonation

    DEFF Research Database (Denmark)

    Riisgaard, Benjamin; Georgakis, Christos; Stang, Henrik

    2007-01-01

    Compact Reinforced Composite, CRC, is a high-strength cement-based composite that holds an enormous flexural and energy-absorbing capacity due to the closely spaced high-strength steel reinforcement and a high-strength cement-based fiber DSP matrix. The material has been used in various constructions ... without breaching. This paper introduces an efficient method for implementing high fractions of polymer shock reinforcement in a CRC element. Experimental tests and explicit finite element analysis are used to demonstrate the potentials of this material. This paper also provides the reader ...

  18. Logic analysis and verification of n-input genetic logic circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2017-01-01

    ... accordingly. As compared to electronic circuits, genetic circuits exhibit stochastic behavior and do not always behave as intended. Therefore, there is a growing interest in being able to analyze and verify the logical behavior of a genetic circuit model, prior to its physical implementation in a laboratory. In this paper, we present an approach to analyze and verify the Boolean logic of a genetic circuit from the data obtained through stochastic analog circuit simulations. The usefulness of this analysis is demonstrated through different case studies illustrating how our approach can be used to verify the expected behavior of an n-input genetic logic circuit.

  19. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  20. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan; (b...

  1. Numerical analysis and experimental verification of elastomer bending process with different material models

    Directory of Open Access Journals (Sweden)

    Kut Stanislaw

    2016-01-01

    Full Text Available The article presents the results of tests carried out to verify the effectiveness of nine selected elastomeric material models (Neo-Hookean, Mooney with two and three constants, Signorini, Yeoh, Ogden, Arruda-Boyce, Gent, and Marlow), whose material constants were determined in a single material test: the uniaxial tension test. The convergence assessment of the nine analyzed models was made by comparing their performance in an experimental bending test of elastomer samples with the results of FEM numerical calculations for each material model. To calculate the material constants for the analyzed materials, a model was generated from the stress–strain characteristics obtained in an experimental uniaxial tensile test with elastomeric dumbbell samples, taking into account the parameters received in its 18th cycle. Using the material constants calculated in this way, numerical simulations of the bending process of elastomeric, parallelepipedic samples were carried out using the MARC/Mentat program.
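
    As an illustration of how the constants of the simplest of these models can be extracted from a single uniaxial tension test, the sketch below fits the incompressible Neo-Hookean nominal-stress relation P = 2·C1·(λ − λ^-2) to stretch-stress data. The data points are hypothetical, not the article's 18th-cycle measurements:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Uniaxial tension data (hypothetical): stretch ratio and nominal stress (MPa)
    stretch = np.array([1.0, 1.2, 1.5, 2.0, 2.5, 3.0])
    stress  = np.array([0.0, 0.55, 1.25, 2.2, 3.0, 3.7])

    def neo_hookean(lam, c1):
        # Nominal (engineering) stress for an incompressible Neo-Hookean solid
        # under uniaxial tension: P = 2*C1*(lambda - lambda**-2)
        return 2.0 * c1 * (lam - lam**-2)

    (c1_fit,), _ = curve_fit(neo_hookean, stretch, stress, p0=[1.0])
    print(f"C1 = {c1_fit:.3f} MPa")
    ```

    Multi-constant models (Mooney, Ogden, etc.) follow the same pattern with more parameters, which is precisely why a single load case can leave them poorly constrained outside the tested deformation mode.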

  2. Verification of the optimum tropospheric parameters setting for the kinematic PPP analysis

    Science.gov (United States)

    Hirata, Y.; Ohta, Y.

    2015-12-01

    Kinematic GNSS analysis is useful for extracting crustal deformation phenomena on time scales of seconds to one day, such as coseismic and postseismic deformation after a large earthquake. Kinematic GNSS analysis, however, has a fundamental difficulty in separating unknown parameters such as the site coordinates and the tropospheric parameters, caused by the strong correlation between them. Thus, we focused on improving the separation precision between the coordinate time series of kinematic PPP and the wet zenith tropospheric delay (WZTD) based on a comprehensive search of the parameter space. We used the GIPSY-OASIS II Ver. 6.3 software for kinematic PPP processing of all GEONET sites on 10 March 2011. We applied 6-hourly nominal WZTD values as a priori information based on the ECMWF global numerical climate model. For the coordinate time series and tropospheric parameters, we assumed white noise and random walk stochastic processes, respectively. These unknown parameters are very sensitive to the process noise assumed for each stochastic process. Thus, we searched for the optimum values of two parameters: the wet zenith tropospheric parameter (named TROP) and its gradient (named GRAD). We defined the optimum parameters as those which minimized the standard deviation of the coordinate time series. We first checked the spatial distribution of the optimum pair of TROP and GRAD. Even though the optimum parameters fell within a certain range (TROP: 2×10^-8 to 6×10^-7 (horizontal), 5.5×10^-9 to 2×10^-8 (vertical); GRAD: 2×10^-10 to 6×10^-9 (horizontal), 2×10^-10 to 1×10^-8 (vertical); unit: km·s^-1/2), they showed large diversity. This suggests strong heterogeneity of the atmospheric state. We also estimated temporal variations of the optimum TROP and GRAD at a specific site. We analyzed the data through 2010 at GEONET station 940098, located in the southernmost part of Kyushu, Japan. The obtained time series of optimum GRAD showed a clear annual variation, and the

  3. Verification analysis of thermoluminescent albedo neutron dosimetry at MOX fuel facilities.

    Science.gov (United States)

    Nakagawa, Takahiro; Takada, Chie; Tsujimura, Norio

    2011-07-01

    Radiation workers engaged in the fabrication of MOX fuels at the Japan Atomic Energy Agency-Nuclear Fuel Cycle Engineering Laboratories are exposed to neutrons. Accordingly, thermoluminescent albedo dosemeters (TLADs) are used for individual neutron dosimetry. Because dose estimation using TLADs is susceptible to variation of the neutron energy spectrum, the authors have provided TLADs incorporating solid-state nuclear track detectors (SSNTDs) to selected workers who are routinely exposed to neutrons and have continued analysis of the relationship between the SSNTD and TLAD readings (T/R(f)) over the 6 y from 2004 to 2009. The T/R(f) value in each year was less than the values observed during 1991-1993, although the neutron spectra had not changed since then. This decrease of T/R(f) implies that the ratio of operation time near gloveboxes to total work time has decreased.

  4. Verification of Three-Phase Dependency Analysis Bayesian Network Learning Method for Maize Carotenoid Gene Mining.

    Science.gov (United States)

    Liu, Jianxiao; Tian, Zonglin

    2017-01-01

    Mining the genes related to maize carotenoid components is important for improving the carotenoid content and the quality of maize. On the basis of the entropy estimation method with a Gaussian kernel probability density estimator, we use the three-phase dependency analysis (TPDA) Bayesian network structure learning method to construct the network of maize genes and carotenoid component traits. Using two discretization methods with different discretization values, we compare the learning effect and efficiency of 10 Bayesian network structure learning methods. The method is verified and analyzed on the maize dataset of a global germplasm collection with 527 elite inbred lines. The results confirmed the effectiveness of the TPDA method, which significantly outperforms the other 9 Bayesian network learning methods. It is an efficient method for mining genes related to maize carotenoid component traits. The parameters obtained by experiments will help carry out practical gene mining effectively in the future.
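
    The first (drafting) phase of a TPDA-style learner can be sketched compactly: connect variable pairs whose mutual information exceeds a threshold, then refine the graph with conditional-independence tests in the later thickening and thinning phases. A toy Python sketch follows; the marker data are simulated, and the study's Gaussian-kernel entropy estimator is replaced here by a simple plug-in estimator on discretized data:

    ```python
    import numpy as np
    from sklearn.metrics import mutual_info_score

    def drafting_phase(data, threshold=0.05):
        """Drafting phase of a TPDA-style learner: connect variable pairs
        whose mutual information exceeds a threshold. `data` holds
        discretized observations, one column per variable."""
        n_vars = data.shape[1]
        edges = []
        for i in range(n_vars):
            for j in range(i + 1, n_vars):
                mi = mutual_info_score(data[:, i], data[:, j])
                if mi > threshold:
                    edges.append((i, j, mi))
        # TPDA would continue with thickening/thinning phases that add or
        # remove edges based on conditional-independence tests.
        return sorted(edges, key=lambda e: -e[2])

    rng = np.random.default_rng(0)
    snp = rng.integers(0, 3, size=(527, 5))          # hypothetical marker data
    trait = (snp[:, 0] + snp[:, 2] > 2).astype(int)  # trait driven by markers 0 and 2
    print(drafting_phase(np.column_stack([snp, trait])))
    ```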

  5. Control analysis and experimental verification of a practical dc–dc boost converter

    Directory of Open Access Journals (Sweden)

    Saswati Swapna Dash

    2015-12-01

    Full Text Available This paper presents a detailed open-loop and closed-loop analysis of a boost dc–dc converter for both voltage-mode control and current-mode control. Here the boost dc–dc converter is a practical converter that considers all possible parasitic elements, such as ESR and on-state voltage drops. The open-loop control, closed-loop current-mode control, and voltage-mode control are verified. A comparative study of all control techniques is presented. The PI compensator for closed-loop current-mode control is designed using classical techniques such as the root locus technique and the Bode diagram. The simulation results are validated against the experimental results of voltage-mode control for both open-loop and closed-loop control.
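
    As a flavor of the classical loop-shaping workflow the paper describes, the sketch below builds a generic boost-converter control-to-output model (with the right-half-plane zero typical of the topology), closes the loop with a PI compensator, and reads off the gain and phase margins using the python-control package. All numbers are illustrative, not the paper's converter parameters:

    ```python
    import control  # python-control package

    # Hypothetical small-signal control-to-output model of a boost converter in
    # continuous conduction mode, including the right-half-plane zero typical of
    # the topology: Gvd(s) = Gd0 * (1 - s/wz) / (1 + s/(Q*w0) + (s/w0)^2)
    Gd0, wz, w0, Q = 40.0, 2.0e4, 4.0e3, 2.0
    s = control.tf('s')
    Gvd = Gd0 * (1 - s/wz) / (1 + s/(Q*w0) + (s/w0)**2)

    # PI compensator C(s) = (Kp*s + Ki)/s, gains chosen purely for illustration
    Kp, Ki = 0.05, 120.0
    C = (Kp*s + Ki) / s

    # Open-loop gain and classical stability margins of the voltage-mode loop
    L = C * Gvd
    gm, pm, wcg, wcp = control.margin(L)
    print(f"gain margin = {gm:.2f}, phase margin = {pm:.1f} deg")
    ```

    The right-half-plane zero is what forces a conservative crossover frequency in boost voltage-mode loops, which is one reason current-mode control is often preferred in practice.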

  6. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for use with the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  7. System Verification Through Reliability, Availability, Maintainability (RAM) Analysis & Technology Readiness Levels (TRLs)

    Energy Technology Data Exchange (ETDEWEB)

    Emmanuel Ohene Opare, Jr.; Charles V. Park

    2011-06-01

    The Next Generation Nuclear Plant (NGNP) Project, managed by the Idaho National Laboratory (INL), was authorized by the Energy Policy Act of 2005 to research, develop, design, construct, and operate a prototype fourth-generation nuclear reactor to meet the needs of the 21st Century. A section in this document proposes that the NGNP will provide heat for process heat applications. As with all large projects developing and deploying new technologies, the NGNP is expected to meet high performance and availability targets relative to current state-of-the-art systems and technology. One requirement for the NGNP is to provide heat for the generation of hydrogen for large-scale production, and this process heat application is required to be at least 90% available relative to other technologies currently on the market. To reach this goal, a RAM Roadmap was developed highlighting the actions to be taken to ensure that various milestones in system development and maturation concurrently meet the required availability targets. Integral to the RAM Roadmap was the use of a RAM analytical/simulation tool, which was used to estimate the availability of the system when deployed, based on the current design configuration and the maturation level of the system.
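
    The availability bookkeeping behind such a RAM target is straightforward: the steady-state availability of each subsystem is MTBF/(MTBF + MTTR), and for subsystems in series the system availability is the product. A minimal sketch with hypothetical figures (not NGNP data):

    ```python
    # Steady-state availability of a series system, A_i = MTBF_i / (MTBF_i + MTTR_i),
    # with the overall availability the product over subsystems. All values invented.
    subsystems = {
        "reactor_heat_source": (8000.0, 120.0),  # (MTBF h, MTTR h)
        "intermediate_loop":   (6000.0, 48.0),
        "hydrogen_plant":      (3000.0, 72.0),
    }

    overall = 1.0
    for name, (mtbf, mttr) in subsystems.items():
        a = mtbf / (mtbf + mttr)
        overall *= a
        print(f"{name}: A = {a:.4f}")

    print(f"system availability = {overall:.4f}")  # compare against the 90% target
    ```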

  8. An analysis of depressive symptoms in stroke survivors: verification of a moderating effect of demographic characteristics.

    Science.gov (United States)

    Park, Eun-Young; Kim, Jung-Hee

    2017-04-08

    The rehabilitation of depressed stroke patients is more difficult because poststroke depression is associated with disruption of daily activities, functioning, and quality of life. However, research on depression in stroke patients is limited. The aim of our study was to evaluate the interaction of demographic characteristics including gender, age, education level, the presence of a spouse, and income status on depressive symptoms in stroke patients and to identify groups that may need more attention with respect to depressive symptoms. We completed a secondary data analysis using data from a completed cross-sectional study of people with stroke. Depression was measured using the Center for Epidemiologic Studies Depression Scale. In this study, depressive symptoms in women living with a spouse were less severe than among those without a spouse. For those with insufficient income, depressive symptom scores were higher in the above high school group than in the below high school group, but were lower in patients who were living with a spouse than in those living without a spouse. Assessing depressive symptoms after stroke should consider the interaction of gender, economic status, education level, and the presence/absence of a spouse. These results would help in comprehensive understanding of the importance of screening for and treating depressive symptoms during rehabilitation after stroke.

  9. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.

    2012-01-01

    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and

  10. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  11. Systematic analysis of biological and physical limitations of proton beam range verification with offline PET/CT scans

    NARCIS (Netherlands)

    Knopf, A; Parodi, K.; Bortfeld, Thomas; Shih, Helen A; Paganetti, Harald

    2009-01-01

    The clinical use of offline positron emission tomography/computed tomography (PET/CT) scans for proton range verification is currently under investigation at the Massachusetts General Hospital (MGH). Validation is achieved by comparing measured activity distributions, acquired in patients after

  12. Space Shuttle Crawler Transporter Vibration Analysis in Support of Rollout Fatigue Load Spectra Verification Program

    Science.gov (United States)

    Margasahayam, Ravi N.; Meyer, Karl A.; Nerolich, Shaun M.; Burton, Roy C.; Gosselin, Armand M.

    2004-01-01

    The Crawler Transporter (CT), designed and built for the Apollo Program in the 1960s and surpassing its initial operational life, has become an integral part of the Space Shuttle Program (SSP). The CT transports the Space Shuttle Vehicle (SSV) stack, atop the Mobile Launch Platform (MLP), from the Vehicle Assembly Building (VAB) to the launch pad. This support structure provides hydraulic jacking, leveling, and load equalization for the 12 million pound stack on its 3.5-5.0 mile rollout to the launch pad. Major elements of the SSV, consisting of the orbiter, solid rocket boosters (SRB), and external tank (ET), have required fatigue analyses as part of the mission life certification. Compared to rollout vibration, the SSV sees relatively high vibration loads during the launch, ascent, descent, and landing phases of the mission. Although preliminary measured SRB vibration levels during rollout were of low amplitude and frequency, the duration of the rollout phase is typically long, from 5-6 hours. As part of an expanded mission life assessment, additional certification effort was initiated to define fatigue load spectra for rollout. This study addresses the CT vibration analyses in support of the rollout fatigue study. Structural models developed for modal and vibration analyses were used to identify unique CT, CT/MLP, and CT/MLP/SRB vibration characteristics for comparison to instrumented rollout tests. Whereas the main structural and vibration characteristics of the SSV are well defined, minimal analytical and vibration test data on the Crawler Transporter were available. Unique vibration characteristics of the CT are attributable to the drive mechanism, hydraulic jacking system, structural framing, and the CT-to-MLP support pad restraints. Initial tests performed on the CT/MLP/SRB configuration showed reasonable correlation with predicted mode shapes and frequencies.

  13. Proteomic analysis and qRT-PCR verification of the temperature response of Arthrospira (Spirulina) platensis.

    Directory of Open Access Journals (Sweden)

    Wang Huili

    Full Text Available Arthrospira (Spirulina) platensis (ASP) is a representative filamentous, non-N2-fixing cyanobacterium that has great potential to enhance the food supply and possesses several valuable physiological features. ASP tolerates high and low temperatures along with highly alkaline and salty environments, and can strongly resist oxidation and irradiation. Based on genomic sequencing of ASP, we compared the protein expression profiles of this organism under different temperature conditions (15°C, 35°C, and 45°C) using 2-DE and peptide mass fingerprinting techniques. A total of 122 proteins having a significant differential expression response to temperature were retrieved. Of the positively expressed proteins, homologies of 116 ASP proteins were found in Arthrospira (81 proteins in Arthrospira platensis str. Paraca and 35 in Arthrospira maxima CS-328). The other 6 proteins have high homology with other microorganisms. We classified the 122 differentially expressed positive proteins into 14 functions using the COG database, and characterized their respective KEGG metabolism pathways. The results demonstrated that these differentially expressed proteins are mainly involved in post-translational modification (protein turnover, chaperones), energy metabolism (photosynthesis, respiratory electron transport), translation (ribosomal structure and biogenesis), and carbohydrate transport and metabolism. Other proteins were related to amino acid transport and metabolism, cell envelope biogenesis, coenzyme metabolism, and signal transduction mechanisms. The results implied that these proteins can perform predictable roles in rendering ASP resistant to low and high temperatures. Subsequently, we determined the transcription level of 38 genes in vivo in response to temperature and identified them by qRT-PCR. We found that the 26 differentially expressed proteins, representing 68.4% of the total target genes, maintained consistency between transcription and

  14. New approach to accuracy verification of 3D surface models: An analysis of point cloud coordinates.

    Science.gov (United States)

    Lee, Wan-Sun; Park, Jong-Kyoung; Kim, Ji-Hwan; Kim, Hae-Young; Kim, Woong-Chul; Yu, Chin-Ho

    2016-04-01

    The precision of two types of surface digitization devices, i.e., a contact probe scanner and an optical scanner, and the trueness of two types of stone replicas, i.e., one without an imaging powder (SR/NP) and one with an imaging powder (SR/P), were evaluated using a computer-aided analysis. A master die was fabricated from stainless steel. Ten impressions were taken, and ten stone replicas were prepared from Type IV stone (Fujirock EP, GC, Leuven, Belgium). The precision of two types of scanners was analyzed using the root mean square (RMS), measurement error (ME), and limits of agreement (LoA) at each coordinate. The trueness of the stone replicas was evaluated using the total deviation. A Student's t-test was applied to compare the discrepancies between the CAD-reference-models of the master die (m-CRM) and point clouds for the two types of stone replicas (α=.05). The RMS values for the precision were 1.58, 1.28, and 0.98μm along the x-, y-, and z-axes in the contact probe scanner and 1.97, 1.32, and 1.33μm along the x-, y-, and z-axes in the optical scanner, respectively. A comparison with m-CRM revealed a trueness of 7.10μm for SR/NP and 8.65μm for SR/P. The precision at each coordinate (x-, y-, and z-axes) was revealed to be higher than the one assessed in the previous method (overall offset differences). A comparison between the m-CRM and 3D surface models of the stone replicas revealed a greater dimensional change in SR/P than in SR/NP. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
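
    The per-axis precision metric used here is simply the root mean square of the coordinate deviations along each axis. A short sketch, with simulated scanner noise standing in for the real repeat-scan deviations:

    ```python
    import numpy as np

    def per_axis_rms(deviations):
        """RMS of point-cloud coordinate deviations, one value per axis.
        `deviations` is an (n_points, 3) array of x, y, z differences
        between repeated scans (or scan minus reference), e.g. in um."""
        d = np.asarray(deviations, dtype=float)
        return np.sqrt((d**2).mean(axis=0))

    rng = np.random.default_rng(1)
    dev = rng.normal(0.0, [1.6, 1.3, 1.0], size=(5000, 3))  # hypothetical noise
    rms_x, rms_y, rms_z = per_axis_rms(dev)
    print(f"RMS: x={rms_x:.2f} um, y={rms_y:.2f} um, z={rms_z:.2f} um")
    ```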

  15. Accuracy verification and analysis of SEA method for calculating radiation noise pressure of submerged cylindrical shell

    Directory of Open Access Journals (Sweden)

    ZHANG Kai

    2017-08-01

    Full Text Available Statistical Energy Analysis (SEA) is an effective method for solving high-frequency structural vibration and acoustic radiation problems. When using it to analyze submerged structures, it is necessary to consider the action of the fluid as a 'heavy fluid' relative to the structure, which differs from its use in air. The simple model of a submerged cylindrical shell is used for calculation at higher frequencies using FEM/BEM. The SEA and FEM methods are then used to calculate the radiated sound pressure level, verifying the accuracy of the SEA prediction for submerged structures. The classification method for subsystems and the effect of errors in the internal loss factor on the accuracy of the results are explored. The calculated results of SEA and FEM/BEM are very different below 400 Hz, and basically the same above 400 Hz. The error caused by different divisions into subsystems is about 5 dB. The error in the calculated results caused by errors in the internal loss factor is 2-3 dB. It is possible to use SEA to calculate the radiated noise of an underwater cylindrical shell when the modal density is high enough. For the cylindrical shell, dividing the subsystems along the circumference is not reliable at low frequencies, as it may lead to inaccurate results. At high frequencies, it is more accurate to divide the subsystems along the circumference than along the axis. For subsystems with high energy, the internal loss factor has a greater effect on the simulation results, so a more accurate way should be taken to determine the internal loss factor of subsystems with high energy.
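
    For context, SEA rests on power-balance relations between subsystems: the net power flowing from subsystem i to subsystem j in a frequency band is

    $$ P_{ij} \;=\; \omega\, \eta_{ij} E_i \;-\; \omega\, \eta_{ji} E_j, \qquad \eta_{ij}\, n_i = \eta_{ji}\, n_j $$

    where \(\omega\) is the band centre frequency, \(E\) the subsystem energies, \(\eta_{ij}\) the coupling loss factors, and \(n\) the modal densities. Adding the dissipation term \(\omega \eta_i E_i\) closes the balance for each subsystem, which is why the internal loss factor \(\eta_i\) enters the high-energy subsystems' results so directly in the sensitivity study above.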

  16. Automated Verification of Mesoscale Forecasts using Image Processing Techniques

    Science.gov (United States)

    2005-09-30

    ... human-machine interaction, and model and forecast verification with an emphasis on mesoscale ensembles and visualization of uncertainty. The verification effort's long-term goal is to develop an automated, objective verification technique for assessment of very high-resolution ... into the MVT framework. These include agglomerative cluster analysis (Marzban and Sandgathe, 2005) and variograms.

  17. An Update on the Mechanical and EM Performance of the Composite Dish Verification Antenna (DVA-1) for the SKA

    Science.gov (United States)

    Lacy, G. E.; Fleming, M.; Baker, L.; Imbriale, W.; Cortes-Medellin, G.; Veidt, B.; Hovey, G. J.; DeBoer, D.

    2012-01-01

    This paper will give an overview of the unique mechanical and optical design of the DVA-1 telescope. The rim supported carbon fibre reflector surfaces are designed to be both low cost and have high performance under wind, gravity, and thermal loads. The shaped offset Gregorian optics offer low and stable side lobes along with a large area at the secondary focus for multiple feeds with no aperture blockage. Telescope performance under ideal conditions as well as performance under gravity, wind, and thermal loads will be compared directly using calculated radiation patterns for each of these operating conditions.

  18. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    Science.gov (United States)

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  19. 0-6674 : improving fracture resistance measurement in asphalt binder specification with verification on asphalt mixture cracking performance.

    Science.gov (United States)

    2014-08-01

    The current performance grading (PG) specification for asphalt binders is based primarily on the study of unmodified asphalt binders. Over the years, experience has proven that the PG grading system, while good for ensuring overall quality, fails in ...

  20. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  1. An Illumination Independent Face Verification Based on Gabor Wavelet and Supported Vector Machine

    Science.gov (United States)

    Zhang, Xingming; Liu, Dian; Chen, Jianfu

    Face verification technology is widely used in the fields of public safety, e-commerce, and so on. Due to its insensitivity to varied illumination, a new illumination-invariant face verification method based on Gabor wavelets is presented in this paper. First, the ATICR method is used for light preprocessing of the images. Second, selected Gabor wavelet filters, chosen on the basis of experiments showing that different Gabor wavelet filters do not perform equally in verification, are used to extract image features, whose dimensionality is then reduced by Principal Component Analysis. Finally, SVM classifiers are modeled on the dimension-reduced data. The experimental results on the IFACE and NIRFACE databases indicate that the algorithm, named "Selected Paralleled Gabor Method", achieves higher verification performance and better adaptability to variable illumination.
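
    A skeleton of the pipeline the abstract describes (Gabor features, PCA dimension reduction, SVM classification) can be assembled from standard libraries. The sketch below is illustrative only: the filter bank, image data, and labels are made up, and it omits the ATICR illumination preprocessing and the paper's specific filter-selection experiments:

    ```python
    import numpy as np
    from skimage.filters import gabor
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC

    def gabor_features(image, frequencies=(0.1, 0.2, 0.3)):
        """Concatenate magnitudes of a small Gabor filter bank's responses."""
        feats = []
        for f in frequencies:
            for theta in (0.0, np.pi/4, np.pi/2, 3*np.pi/4):
                real, imag = gabor(image, frequency=f, theta=theta)
                feats.append(np.hypot(real, imag).ravel())
        return np.concatenate(feats)

    # Hypothetical data: 40 grayscale face crops (32x32) with binary labels
    rng = np.random.default_rng(0)
    images = rng.random((40, 32, 32))
    labels = rng.integers(0, 2, size=40)

    X = np.array([gabor_features(im) for im in images])
    X = PCA(n_components=20).fit_transform(X)   # reduce Gabor feature dimension
    clf = SVC(kernel='rbf').fit(X[:30], labels[:30])
    print("verification accuracy:", clf.score(X[30:], labels[30:]))
    ```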

  2. Verification, Performance Analysis and Controller Synthesis for Real-Time Systems

    DEFF Research Database (Denmark)

    Fahrenberg, Uli; Larsen, Kim Guldstrand; Thrane, Claus Rørbæk

    2009-01-01

    This note aims at providing a concise and precise Travellers Guide, Phrase Book or Reference Manual to the timed automata modeling formalism introduced by Alur and Dill [7, 8]. The note gives comprehensive definitions of timed automata, priced (or weighted) timed automata, and timed games...

  3. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied t o Dynamic Networks

    OpenAIRE

    Shrirang Ambaji KULKARNI; Raghavendra G . RAO

    2017-01-01

    Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and is subject to variable link costs due to congestion and bandwidth. Existing shortest path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply model based Q-Routing technique ...
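
    The core of Q-Routing is a single temporal-difference style update of a node's estimated delivery time through each neighbour. A minimal Python sketch; the topology, delays, and learning rate eta are hypothetical, and the update follows the standard Boyan-Littman form:

    ```python
    def q_routing_update(Q, x, y, d, q_time, s_time, eta=0.5):
        """Q-routing update: node x revises its estimate of delivery time to
        destination d via neighbour y, using y's best remaining estimate plus
        the queueing (q_time) and transmission (s_time) delays just observed."""
        t_estimate = min(Q[y][d].values()) if Q[y][d] else 0.0
        old = Q[x][d][y]
        Q[x][d][y] = old + eta * (q_time + s_time + t_estimate - old)

    # Q[node][destination][neighbour] -> estimated delivery time (toy topology)
    Q = {
        'A': {'C': {'B': 5.0, 'D': 7.0}},
        'B': {'C': {'C': 2.0}},
        'D': {'C': {'C': 4.0}},
    }
    q_routing_update(Q, 'A', 'B', 'C', q_time=1.0, s_time=0.5)
    print(Q['A']['C'])  # estimate via B moves toward 1.5 + 2.0 = 3.5
    ```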

  4. Verification and Analysis of Implementing Virtual Electric Devices in Circuit Simulation of Pulsed DC Electrical Devices in the NI MULTISIM 10.1 Environment

    Directory of Open Access Journals (Sweden)

    V. A. Solov'ev

    2015-01-01

    Full Text Available The paper presents the results of an analysis of the implementation potential and an evaluation of the reliability of virtual electric devices when conducting circuit simulation of pulsed DC electrical devices in the NI Multisim 10.1 environment. It analyses the metrological properties of the electric measuring devices and sensors of the NI Multisim 10.1 environment. Mathematical expressions have been defined to calculate reliable parameters of periodic non-sinusoidal electrical quantities on the basis of their physical feasibility. To verify the virtual electric devices, a circuit model of the power section of a buck DC converter, with the devices under consideration enabled at its input and output, is used as a consumer of pulsed current of trapezoidal or triangular form. It is used as an example to show a technique for verifying the readings of virtual electric measuring devices in the NI Multisim 10.1 environment. It was found that when simulating pulsed DC electric devices, it is advisable to use the probe to measure the average and RMS values of the supply voltage and consumed current. The electric device power consumption read from the virtual power meter is equal to its average value, and its displayed power factor is inversely proportional to the input current form factor. To determine the RMS pulsed DC current with an ammeter or multimeter, it is necessary to measure the current with these devices in DC and AC modes, and then determine the RMS value from the measurement results. Verification of the virtual electric devices has proved the possibility of applying them to determine the energy performance of transistor converters for various purposes in circuit simulation in the NI Multisim 10.1 environment, thus saving design time.
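
    The two-mode measurement recipe in the last paragraph rests on the standard decomposition of a periodic signal into its DC component and its ripple; since the two are orthogonal,

    $$ I_{\mathrm{RMS}} \;=\; \sqrt{\,I_{\mathrm{DC}}^{2} + I_{\mathrm{AC,RMS}}^{2}\,} $$

    where \(I_{\mathrm{DC}}\) is the reading in DC mode (the mean value) and \(I_{\mathrm{AC,RMS}}\) is the RMS ripple reading in AC mode.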

  5. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present...

  6. Interim Letter Report - Verification Survey Results for Activities Performed in March 2009 for the Vitrification Test Facility Warehouse at the West Valley Demonstration Project, Ashford, New York

    Energy Technology Data Exchange (ETDEWEB)

    B.D. Estes

    2009-04-24

    The objective of the verification activities was to provide independent radiological surveys and data for use by the Department of Energy (DOE) to ensure that the building satisfies the requirements for release without radiological controls.

  7. Super-SILAC mix coupled with SIM/AIMS assays for targeted verification of phosphopeptides discovered in a large-scale phosphoproteome analysis of hepatocellular carcinoma.

    Science.gov (United States)

    Lin, Yu-Tsun; Chien, Kun-Yi; Wu, Chia-Chun; Chang, Wen-Yu; Chu, Lichieh Julie; Chen, Min-Chi; Yeh, Chau-Ting; Yu, Jau-Song

    2017-03-22

    Numerous studies have established a close association between aberrant phosphorylation and hepatocellular carcinoma (HCC). Here, we applied a quantitative phosphoproteomics platform combining dimethylation labeling and online 3D strong cation exchange chromatography (SCX)-titanium oxide (TiO2)/RP-LTQ-Orbitrap to compare the phosphoproteomes of three pairs of HCC tissues and non-tumor counterparts. This analysis yielded 7868 quantifiable phosphopeptides and numerous up- or down-regulated candidates. Increased phosphorylation of LMNA and NIPA was confirmed using specific antibodies. To expand our verification capability, we evaluated the use of the LTQ-Orbitrap run in SIM/Accurate Inclusion Mass Screening (AIMS) mode with a super-SILAC mixture as an internal standard to quantify a subset of phosphopeptide candidates in HCC tissue samples. In sample I, used for the discovery experiment, we successfully quantified 32 (in SIM mode) and 30 (in AIMS mode) phosphopeptides with median coefficients of variation (CVs) of 7.5% and 8.3%, respectively. When the assay was applied to another three pairs of HCC specimens for the verification experiment, 40 target phosphopeptides were quantified reliably (~7.5% CV), and more than half of them were differentially expressed between tumor and adjacent non-tumor tissues. Collectively, these results indicate the feasibility of using super-SILAC mix-SIM/AIMS assays for targeted verification of phosphopeptides discovered by large-scale phosphoproteome analyses of HCC specimens. In this study, we developed a strategy for conducting both discovery and targeted verification of deregulated phosphoproteins in HCC tissue specimens on the LTQ-Orbitrap. This strategy allowed us to generate a quantitative HCC tissue phosphoproteome dataset containing significantly deregulated phosphoproteins that represents a valuable resource for the identification of potential HCC biomarkers and/or therapeutic targets. Furthermore, our proof-of-concept experiments demonstrated

  8. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  9. Compression-distraction reduction surgical verification and optimization to treat the basilar invagination and atlantoaxial dislocation: a finite element analysis.

    Science.gov (United States)

    Bo, Xuefeng; Wang, Weida; Chen, Zan; Liu, Zhicheng

    2016-12-28

    Basilar invagination (BI) combined with atlantoaxial dislocation (AAD) leads to foramen magnum stenosis and medullary spinal cord compression, causing nerve dysfunction. The purpose of the surgery is to remove the bony compression on the ventral side of the brainstem, fix the unstable spinal segment, and fuse it stably. An occipital-cervical internal fixation system that simultaneously reduces atlantoaxial horizontal and vertical dislocation was established. We propose here a new compression-distraction reduction (CDR) technique. We aimed to construct a congenital BI-AAD preoperative finite element model (FEM) to simulate the CDR technique for AAD reduction surgery. Based on computed tomographic scans of patients' cervical vertebrae, a three-dimensional (3D) geometric model of the cervical spine (C0-C4) of congenital BI-AAD patients was established using the Mimics 13.1, Geomagic 2012, and SpaceClaim 14.0 software packages. The mechanical parameters of the tissues were assigned according to their material characteristics using ANSYS Workbench 14.0 software. A 3D FEM was established using the tetrahedral mesh method. The bending moment was loaded on C0. Physiological conditions (anteflexion, retroflexion, left and right flexion, left and right rotation) were simulated for preoperative verification. The occipital-cervical fixation system FEM was established. The CDR technique was simulated to perform AAD reduction surgery. Data were obtained when the atlantoaxial horizontal and vertical dislocation reductions were verified postoperatively. Stress data for the two surgical schemes were analyzed, as was the reduction surgery optimization scheme for congenital BI-AAD patients with abnormal lateral atlantoaxial articulation ossification. A cervical spine (C0-C4) FEM of congenital BI-AAD patients was established. The CDR technique was simulated for AAD reduction. We obtained the mechanical data when the atlantoaxial horizontal and vertical dislocation reductions were satisfied for the two

  10. The Innovative Design and Prototype Verification of Wheelchair with One Degree of Freedom to Perform Lifting and Standing Functions

    Science.gov (United States)

    Hsieh, Long-Chang; Chen, Tzu-Hsia

    2017-12-01

    Traditionally, the mechanism of a wheelchair with lifting and standing functions has 2 degrees of freedom and uses 2 power sources to perform these 2 motion functions. The purpose of this paper is to invent a new wheelchair with 1 degree of freedom to perform these 2 motion functions. Hence, only 1 power source is needed to drive the mechanism to achieve the lifting and standing motions. The new design has the advantages of simple operation, more stability, and more safety. For a traditional standing wheelchair, the centre of gravity moves forward when standing up, and 2 auxiliary wheels are needed to prevent tipping. In this paper, by using the checklist method of Osborn, a wheelchair with 1 DOF is invented to perform the lifting and standing functions. The centre of gravity of this new wheelchair after standing up is still located between the front and rear wheels, so no auxiliary wheels are needed. Finally, a prototype is manufactured to verify the theoretical results.

  11. International Performance Measurement and Verification Protocol: Concepts and Options for Determining Energy and Water Savings, Volume I (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    2002-03-01

    This protocol serves as a framework to determine energy and water savings resulting from the implementation of an energy efficiency program. It is also intended to help monitor the performance of renewable energy systems and to enhance indoor environmental quality in buildings.

  12. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: AMSU-A1 Antenna Drive Subsystem, PN 1331720-2, S/N 106

    Science.gov (United States)

    Luu, D.

    1999-01-01

    This is the Performance Verification Report, AMSU-A1 Antenna Drive Subsystem, P/N 1331720-2, S/N 106, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The antenna drive subsystem of the METSAT AMSU-A1, S/N 106, P/N 1331720-2, completed acceptance testing per A-ES Test Procedure AE-26002/lD. The test included: Scan Motion and Jitter, Pulse Load Bus Peak Current and Rise Time, Resolver Reading and Position Error, Gain/ Phase Margin, and Operational Gain Margin. The drive motors and electronic circuitry were also tested at the component level. The drive motor test includes: Starting Torque Test, Motor Commutation Test, Resolver Operation/ No-Load Speed Test, and Random Vibration. The electronic circuitry was tested at the Circuit Card Assembly (CCA) level of production; each test exercised all circuit functions. The transistor assembly was tested during the W3 cable assembly (1356941-1) test.

  13. Adsorption and biodegradation of 2-chlorophenol by mixed culture using activated carbon as a supporting medium-reactor performance and model verification

    Science.gov (United States)

    Lin, Yen-Hui

    2017-11-01

    A non-steady-state mathematical model system for the kinetics of adsorption and biodegradation of 2-chlorophenol (2-CP) by attached and suspended biomass in an activated carbon process was derived. The mechanisms in the model system included 2-CP adsorption by activated carbon, 2-CP mass transport diffusion in the biofilm, and biodegradation by attached and suspended biomass. Batch kinetic tests were performed to determine the surface diffusivity of 2-CP, adsorption parameters for 2-CP, and biokinetic parameters of the biomass. Experiments were conducted using a biological activated carbon (BAC) reactor system with a high recycle rate to approximate a completely mixed flow reactor for model verification. Model-predicted concentration profiles of 2-CP indicated that the biofilm bioregenerated the activated carbon by lowering the 2-CP concentration at the biofilm-activated carbon interface as the biofilm grew thicker. The removal efficiency of 2-CP by biomass was approximately 98.5% when the 2-CP concentration in the influent was around 190.5 mg L-1 at a steady-state condition. The concentration of suspended biomass reached up to about 25.3 mg L-1, while the thickness of attached biomass was estimated to be 636 μm at a steady-state condition by model prediction. The experimental results agree closely with the results of the model predictions.
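
    The competing sinks in the model (adsorption onto the carbon and Monod biodegradation) can be illustrated with a stripped-down batch version of the mass balances. The sketch below is not the authors' non-steady-state biofilm model; it uses a linear-driving-force adsorption term toward a Freundlich equilibrium and suspended-biomass Monod kinetics, with entirely hypothetical parameters:

    ```python
    from scipy.integrate import solve_ivp

    # Hypothetical batch sketch of the two competing 2-CP sinks in a BAC reactor
    K_F, n_F = 8.0, 1.6        # Freundlich constants, q_eq = K_F * S**(1/n_F) (mg/g)
    k_ads = 0.05               # linear-driving-force coefficient (1/h)
    m_c = 2.0                  # activated carbon dose (g/L)
    mu_max, K_s = 0.25, 30.0   # Monod constants (1/h, mg/L)
    Y = 0.6                    # biomass yield (mg biomass / mg 2-CP)

    def rhs(t, y):
        S, q, X = y                               # mg/L, mg/g, mg/L
        q_eq = K_F * max(S, 0.0) ** (1.0 / n_F)   # equilibrium loading at current S
        r_ads = k_ads * (q_eq - q)                # adsorption rate, mg/(g h)
        r_bio = (mu_max / Y) * S / (K_s + S) * X  # substrate use by growth, mg/(L h)
        return [-m_c * r_ads - r_bio, r_ads, Y * r_bio]

    sol = solve_ivp(rhs, (0.0, 48.0), y0=[190.5, 0.0, 5.0])
    S_end = sol.y[0, -1]
    print(f"2-CP after 48 h: {S_end:.1f} mg/L "
          f"({100 * (1 - S_end / 190.5):.1f}% removal)")
    ```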

  14. Mathematical Verification for Transmission Performance of Centralized Lightwave WDM-RoF-PON with Quintuple Services Integrated in Each Wavelength Channel

    Directory of Open Access Journals (Sweden)

    Shuai Chen

    2015-01-01

    Full Text Available Wavelength-division-multiplexing passive optical network (WDM-PON) has been recognized as a promising solution for "last mile" access as well as multi-broadband data services access for end users, and WDM-RoF-PON, which employs the radio-over-fiber (RoF) technique in WDM-PON, is an even more attractive approach for future broadband fiber and wireless access, given its strong capability for centralized multiservice transmission and its transparency to bandwidth and signal modulation formats. As for multiservice operation in WDM-RoF-PON, various system designs have been reported and verified via simulation or experiment to date, and the scheme with multiple services transmitted in each single wavelength channel is believed to be the one with the highest bandwidth efficiency; however, the corresponding mathematical verification is still hard to find in the state-of-the-art literature. In this paper, the system design and data transmission performance of a quintuple-service integrated WDM-RoF-PON, which jointly employs carrier multiplexing and orthogonal modulation techniques, have been theoretically analyzed and verified in detail; moreover, the system design has been duplicated and verified experimentally, and the theoretical framework of such a WDM-RoF-PON scheme has thus been formed.

  15. Adsorption and biodegradation of 2-chlorophenol by mixed culture using activated carbon as a supporting medium-reactor performance and model verification

    Science.gov (United States)

    Lin, Yen-Hui

    2016-12-01

    A non-steady-state mathematical model system for the kinetics of adsorption and biodegradation of 2-chlorophenol (2-CP) by attached and suspended biomass on an activated carbon process was derived. The mechanisms in the model system included 2-CP adsorption by activated carbon, 2-CP mass transport diffusion in the biofilm, and biodegradation by attached and suspended biomass. Batch kinetic tests were performed to determine the surface diffusivity of 2-CP, adsorption parameters for 2-CP, and biokinetic parameters of the biomass. Experiments were conducted using a biological activated carbon (BAC) reactor system with a high recycle rate to approximate a completely mixed flow reactor for model verification. Concentration profiles of 2-CP from model predictions indicated that the biofilm bioregenerated the activated carbon by lowering the 2-CP concentration at the biofilm-activated carbon interface as the biofilm grew thicker. The removal efficiency of 2-CP by biomass was approximately 98.5% when the 2-CP concentration in the influent was around 190.5 mg L-1 at steady state. The concentration of suspended biomass reached up to about 25.3 mg L-1, while the thickness of attached biomass was estimated by model prediction to be 636 μm at steady state. The experimental results agree closely with the model predictions.

  16. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    Directory of Open Access Journals (Sweden)

    Irena Jekova

    2015-01-01

    Full Text Available Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: I (rI) and II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was not a significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
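
    The core of the method is a correlation-and-threshold decision. Below is a minimal sketch of that idea, assuming equal-length, beat-aligned lead templates; the combination rule and the threshold value are illustrative placeholders, not the paper's derived values.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two equal-length beat templates."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def verify(present_I, present_II, stored_I, stored_II, threshold=0.95):
    # threshold is illustrative, not the value derived in the paper
    r_I  = pearson(present_I,  stored_I)
    r_II = pearson(present_II, stored_II)
    # Simple linear combination of the two lead correlations; the paper
    # also uses a PCA-based component and nonlinear combinations.
    score = 0.5 * (r_I + r_II)
    return score >= threshold, score

rng = np.random.default_rng(0)
beat = np.sin(np.linspace(0, 2 * np.pi, 200))   # synthetic stand-in for a beat
accepted, score = verify(beat + 0.01 * rng.standard_normal(200),
                         beat + 0.01 * rng.standard_normal(200),
                         beat, beat)
print(accepted, round(score, 3))
```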

  17. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy.

    Science.gov (United States)

    Jekova, Irena; Bortolan, Giovanni

    2015-01-01

    Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: I (rI) and II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was not a significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.

  18. Performance Verification of Production-Scalable Energy-Efficient Solutions: Winchester/Camberley Homes Mixed-Humid Climate

    Energy Technology Data Exchange (ETDEWEB)

    Mallay, D. [Partnership for Home Innovation, Upper Marlboro, MD (United States); Wiehagen, J. [Partnership for Home Innovation, Upper Marlboro, MD (United States)

    2014-07-01

    Winchester/Camberley Homes collaborated with the Building America team Partnership for Home Innovation to develop a new set of high performance home designs that could be applicable on a production scale. The new home designs are to be constructed in the mixed-humid climate zone and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long-term durability and repeatable successful implementation with few call-backs, the project team demonstrated through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the HVAC systems to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs; evaluate material selections, installation requirements, changes to work scopes, and contractor learning curves; and compare theoretical performance characteristics with measured results.

  19. Performance Verification of Production-Scalable Energy-Efficient Solutions: Winchester/Camberley Homes Mixed-Humid Climate

    Energy Technology Data Exchange (ETDEWEB)

    Mallay, D.; Wiehagen, J.

    2014-07-01

    Winchester/Camberley Homes, with the Building America program and its NAHB Research Center Industry Partnership, collaborated to develop a new set of high performance home designs that could be applicable on a production scale. The new home designs are to be constructed in mixed-humid climate zone four and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long-term durability and repeatable successful implementation with few call-backs, this report demonstrates through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the heating, cooling, air distribution, and ventilation systems, intended to optimize the equipment size and configuration to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs; evaluate material selections, installation requirements, changes to work scopes, and contractor learning curves; and compare theoretical performance characteristics with measured results.

  20. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  1. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  3. Language dependence in multilingual speaker verification

    CSIR Research Space (South Africa)

    Kleynhans, NT

    2005-11-01

    Full Text Available An investigation into the performance of current speaker verification technology within a multilingual context is presented. Using the Oregon Graduate Institute (OGI) Multi-Language Telephone Speech Corpus (MLTS) database, the authors found...

  4. 12 CFR 715.8 - Requirements for verification of accounts and passbooks.

    Science.gov (United States)

    2010-01-01

    Title 12 (Banks and Banking), Credit Unions: Supervisory Committee Audits and Verifications, § 715.8 — Requirements for verification of accounts and passbooks. The section addresses verification by a non-statistical method, including when the verification is performed by an independent person licensed by the State or...

  5. Verification of the coupled fluid/solid transfer in a CASL grid-to-rod-fretting simulation : a technical brief on the analysis of convergence behavior and demonstration of software tools for verification.

    Energy Technology Data Exchange (ETDEWEB)

    Copps, Kevin D.

    2011-12-01

    For a CASL grid-to-rod fretting problem, Sandia's Percept software was used in conjunction with the Sierra Mechanics suite to analyze the convergence behavior of the data transfer from a fluid simulation to a solid mechanics simulation. An analytic function, with properties relatively close to numerically computed fluid approximations, was chosen to represent the pressure solution in the fluid domain. The analytic pressure was interpolated on a sequence of grids on the fluid domain and transferred onto a separate sequence of grids in the solid domain. The error in the resulting pressure in the solid domain was measured with respect to the analytic pressure. The error in pressure approached zero as both the fluid and solid meshes were refined. The convergence of the transfer algorithm was limited by whether the source grid resolution was the same as, or finer than, the target grid resolution. In addition, using a feature coverage analysis, we found gaps in the solid mechanics code verification test suite directly relevant to the prototype CASL GTRF simulations.
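
    A convergence study of this kind typically reduces to estimating the observed order of accuracy from errors measured on successively refined grids. A minimal sketch, assuming a fixed refinement ratio and hypothetical error values:

```python
import math

def observed_order(errors, ratio=2.0):
    """Observed order of convergence p from errors on successively refined
    grids, assuming e_h ~ C * h**p and a fixed refinement ratio."""
    return [math.log(e_coarse / e_fine) / math.log(ratio)
            for e_coarse, e_fine in zip(errors, errors[1:])]

# Hypothetical transfer errors measured on three grid levels
errors = [4.0e-2, 1.1e-2, 2.9e-3]
print(observed_order(errors))   # values near 2 indicate second-order transfer
```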

  6. Building America Performance Analysis Procedures: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  7. Development and verification of NRC's single-rod fuel performance codes FRAPCON-3 and FRAPTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, C.E.; Cunningham, M.E.; Lanning, D.D. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-03-01

    The FRAPCON and FRAP-T code series, developed in the 1970s and early 1980s, are used by the US Nuclear Regulatory Commission (NRC) to predict fuel performance during steady-state and transient power conditions, respectively. Both code series are now being updated by Pacific Northwest National Laboratory to improve their predictive capabilities at high burnup levels. The newest versions of the codes are called FRAPCON-3 and FRAPTRAN. The updates to fuel property and behavior models are focusing on providing best-estimate predictions under steady-state and fast transient power conditions up to extended fuel burnups (> 55 GWd/MTU). Both codes will be assessed against a database independent of the database used for code benchmarking, and an estimate of code predictive uncertainties will be made based on comparisons to the benchmark and independent databases.

  8. Power performance verification in complex terrain using nacelle lidars: the Hill of Towie (HoT) campaign

    DEFF Research Database (Denmark)

    Borraccino, Antoine; Wagner, Rozenn; Vignaroli, Andrea

    Nacelle lidars are an attractive alternative to meteorological masts for power performance testing in complex terrain because of their ease of deployment. This report presents the comparison of wind speed and power curve measurements using two commercial nacelle lidar systems – one Avent 4-beam Wind... With the wind model, the wind speed estimate is within 2% of the ZP300 measurements, corresponding to an error in AEP on the order of 4%. With the wind-induction model, the free-stream wind speed estimate is within 1% of the ZP300, corresponding to an AEP error of approximately 2%. In the second case..., the reference wind speed is the ZP300 wind speed measurement corrected using the site calibration. The power curves measured using the three measurement systems were compared to the turbine manufacturer's warranted power curve as reference. The reduction in the statistical power uncertainty (type A) usually...

  9. In-flight verification of the calibration and performance of the ASTRO-H (Hitomi) Soft X-Ray Spectrometer

    Science.gov (United States)

    Leutenegger, Maurice A.; Audard, Marc; Boyce, Kevin R.; Brown, Gregory V.; Chiao, Meng P.; Eckart, Megan E.; Fujimoto, Ryuichi; Furuzawa, Akihiro; Guainazzi, Matteo; Haas, Daniel; den Herder, Jan-Willem; Hayashi, Takayuki; Iizuka, Ryo; Ishida, Manabu; Ishisaki, Yoshitaka; Kelley, Richard L.; Kikuchi, Naomichi; Kilbourne, Caroline A.; Koyama, Shu; Kurashima, Sho; Maeda, Yoshitomo; Markevitch, Maxim; McCammon, Dan; Mitsuda, Kazuhisa; Mori, Hideyuki; Nakaniwa, Nozomi; Okajima, Takashi; Paltani, Stéphane; Petre, Robert; Porter, F. Scott; Sato, Kosuke; Sato, Toshiki; Sawada, Makoto; Serlemitsos, Peter J.; Seta, Hiromi; Sneiderman, Gary; Soong, Yang; Sugita, Satoshi; Szymkowiak, Andrew E.; Takei, Yoh; Tashiro, Makoto; Tawara, Yuzuru; Tsujimoto, Masahiro; de Vries, Cor P.; Watanabe, Tomomi; Yamada, Shinya; Yamasaki, Noriko

    2016-07-01

    The Soft X-ray Spectrometer (SXS) onboard the Astro-H (Hitomi) orbiting x-ray observatory featured an array of 36 silicon thermistor x-ray calorimeters optimized to perform high spectral resolution x-ray imaging spectroscopy of astrophysical sources in the 0.3-12 keV band. Extensive pre-flight calibration measurements are the basis for our modeling of the pulse-height-energy relation and energy resolution for each pixel and event grade, telescope collecting area, detector efficiency, and pulse arrival time. Because of the early termination of mission operations, we needed to extract the maximum information from observations performed only days into the mission, when the onboard calibration sources had not yet been commissioned and the dewar was still coming into thermal equilibrium, so our technique for reconstructing the per-pixel, time-dependent pulse-height-energy relation had to be modified. The gain scale was reconstructed using a combination of an absolute energy scale calibration at a single time using a fiducial from an onboard radioactive source, calibration of a dominant time-dependent gain drift component using a dedicated calibration pixel, and a residual time-dependent variation using spectra from the Perseus cluster of galaxies. The energy resolution was also measured using the onboard radioactive sources. It is consistent with instrument-level measurements, accounting for the modest increase in noise due to spacecraft systems interference. We use observations of two pulsars to validate our models of the telescope area and detector efficiency, and to derive a more accurate value for the thickness of the gate valve Be window, which had not been opened by the time mission operations ceased. We use observations of the Crab pulsar to refine the pixel-to-pixel timing and validate the absolute timing.
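
    A heavily simplified sketch of the gain-drift idea: pulse heights are corrected by dividing out a time-dependent gain interpolated from calibration-pixel samples. The numbers and the linear interpolation are illustrative assumptions; the actual SXS reconstruction is per pixel, per event grade, and far more careful.

```python
import numpy as np

# Hypothetical drift measurements from a dedicated calibration pixel:
# times (s since start) and apparent gain relative to the fiducial epoch.
cal_times = np.array([0.0, 3600.0, 7200.0, 10800.0])
cal_gain  = np.array([1.0000, 0.9985, 0.9978, 0.9974])

def correct_energy(event_times, raw_energies):
    """Divide out the time-dependent gain, linearly interpolated between
    calibration-pixel samples (a real pipeline would use a more careful
    per-pixel, per-grade model)."""
    gain = np.interp(event_times, cal_times, cal_gain)
    return raw_energies / gain

print(correct_energy(np.array([5400.0]), np.array([5.8900])))  # keV, illustrative
```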

  10. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously in the literature, a treatment of simultaneous speaker and utterance verification with a modern, standard database is so far lacking. This is despite the burgeoning demand for voice biometrics in a plethora of practical security applications. With the goal of improving overall verification performance, this paper reports different strategies for simultaneous ASV and UV in the context of short-duration, text-dependent speaker verification. Experiments performed on the recently released RedDots corpus are reported for three different ASV systems and four different UV systems. Results show that the combination...

  11. 78 FR 53017 - Changes to the Salmonella Verification Sampling Program: Analysis of Raw Beef for Shiga Toxin...

    Science.gov (United States)

    2013-08-28

    ... "Pathogen Reduction; Hazard Analysis and Critical Control Point (PR/HACCP) Systems," which FSIS published... other things, the PR/HACCP rule set Salmonella performance standards for establishments producing... sampling was expensive for the Agency. As stated in the PR/HACCP rule (at 61 FR 38835), FSIS selected...

  12. Formal verification of a fault tolerant clock synchronization algorithm

    Science.gov (United States)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith are described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.
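
    The algorithm itself is simple to state: each processor averages the clock readings it sees, substituting its own reading for any clock that differs from it by more than a bound Δ. A sketch of one synchronization round under that common textbook description (the verified algorithm's assumptions and parameter constraints are subtler, as the paper's corrections show):

```python
def interactive_convergence(readings, own_index, delta):
    """One round of the interactive convergence idea: a clock differing
    from our own by more than delta is replaced by our own reading before
    averaging, so a faulty clock cannot drag the average far away."""
    own = readings[own_index]
    adjusted = [r if abs(r - own) <= delta else own for r in readings]
    return sum(adjusted) / len(adjusted)

# Three well-behaved clocks and one faulty one (index 3)
clocks = [100.02, 100.05, 99.98, 250.0]
print(interactive_convergence(clocks, own_index=0, delta=0.5))  # faulty value ignored
```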

  13. Structural Performance Optimization and Verification of an Improved Thin-Walled Storage Tank for a Pico-Satellite

    Directory of Open Access Journals (Sweden)

    Lai Teng

    2017-11-01

    Full Text Available This paper presents an improved mesh storage tank structure obtained using 3D metal printing. The storage tank structure is optimized using a multi-objective uniform design method. Each parameter influencing the storage tank is considered as an optimization factor, and the compression stress (σ), volume utilization ratio (ν), and weight (m) are considered as the optimization objectives. Regression equations were established between the optimization factors and targets, the order of influence of the six factors on the three target values was analyzed, and the relative deviations between the regression-equation and calculation results for σ, ν, and m were 9.72%, 4.15%, and 2.94%, respectively. The optimization results showed that the regression equations can predict the structural performance of the improved storage tank and that the values of the influence factors obtained through the optimization are effective. In addition, the compression stress was improved by 24.98%, the volume utilization ratio was increased by 26.86%, and the weight was reduced by 26.83%. The optimized storage tank was fabricated through 3D metal printing, and the compressive stress was improved by 58.71%, the volume utilization ratio was increased by 24.52%, and the weight was reduced by 11.67%.

  14. Performance Verification of the Lattice-type ECCS Sump Strainer to Prevent the Thin-bed effect

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Je Joong; Kim, Chang Hyun; Ha, Sang Jun [KHNP CRI, Daejeon (Korea, Republic of)

    2013-10-15

    In the event of a Loss of Coolant Accident (LOCA), a variety of debris could be generated under the post-LOCA conditions. The debris could block the Emergency Core Cooling System (ECCS) sump strainer, leading to a considerable head loss which in turn causes abnormal ECCS and/or CS pump performance. Determining the strainer capacity through optimization of the head loss due to debris blockage is very important; in particular, the thin-bed effect is a dominant factor in the design of the strainer. This paper presents experimental head loss data to confirm an advantage of an advanced lattice-type strainer with respect to the thin-bed effect, and the data are compared to the results of the NUREG/CR-6224 head loss correlation. The thin-bed effect is a dominant design factor because the head loss can increase drastically owing to the lack of available voids in the debris bed for coolant to pass through. Through this study, a lattice-type strainer to reduce or prevent the thin-bed effect has been designed. As the experimental data show, there is no thin-bed effect in the present lattice-type strainer. It is expected that the required capacity of the strainer to maintain the function of the ECCS will be significantly reduced by the lattice-type strainer of the present study.

  15. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2017-11-23

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate establishment of the drug concentration-effect relationship. Thus, knowledge of the active concentration of drugs in heart tissue is desirable, along with estimation of the influence of inter-subject variability. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
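
    Structurally, such a model is a small system of mass-balance ODEs, one per compartment. The sketch below shows a five-compartment chain with passive exchange and plasma uptake; all volumes, clearances, and the fixed plasma concentration are hypothetical placeholders, not the paper's parameters, and mechanisms such as metabolism and active efflux are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical volumes (L) and clearances (L/h) for a 5-compartment chain:
# pericardial fluid, extracellular water, and epi-/mid-/endocardial
# intracellular fluids. All values are illustrative placeholders.
V     = np.array([0.03, 0.10, 0.05, 0.05, 0.05])
CL    = np.array([0.5, 0.8, 0.8, 0.8])   # exchange between neighboring compartments
CL_in = 2.0                              # plasma -> extracellular water uptake

def rhs(t, A, C_plasma=0.1):             # A: drug amount per compartment (mg)
    C  = A / V                           # concentrations (mg/L)
    dA = np.zeros(5)
    dA[1] += CL_in * (C_plasma - C[1])   # exchange with plasma (mg/h)
    for i, cl in enumerate(CL):          # passive exchange along the chain
        flux = cl * (C[i] - C[i + 1])
        dA[i]     -= flux
        dA[i + 1] += flux
    return dA

sol = solve_ivp(rhs, (0.0, 24.0), np.zeros(5))
print(sol.y[:, -1] / V)   # all compartments approach the plasma concentration
```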

  16. Overcoming urban GPS navigation challenges through the use of MEMS inertial sensors and proper verification of navigation system performance

    Science.gov (United States)

    Vinande, Eric T.

    This research proposes several means of overcoming urban-environment challenges to ground vehicle global positioning system (GPS) receiver navigation performance through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full-state navigation filter. Appropriate statistical methods are developed to evaluate the navigation improvement in the urban environment due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record-and-playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment, as they simplify vehicle testing requirements.

  17. Transient Analysis of Manufacturing Systems Performance

    OpenAIRE

    Narahari, Y.; Viswanadham, N

    1994-01-01

    Studies in performance evaluation of automated manufacturing systems, using simulation or analytical models, have always emphasized steady-state or equilibrium performance in preference to transient performance. In this study, we present several situations in manufacturing systems where transient analysis is very important. Manufacturing systems and models in which such situations arise include: systems with failure states and deadlocks, unstable queueing systems, and systems with fluctuating ...

  18. Improvement of electrophoresis performance by spectral analysis ...

    African Journals Online (AJOL)

    This paper describes a new design of the standard agarose gel electrophoresis procedure for nucleic acid analysis. The electrophoresis performance was improved by using real-time spectral analysis of the samples. A laser beam illuminated the analysed sample at the wavelength with the highest absorption of ...

  19. Gas chromatography/mass spectrometric analysis of methyl esters of N,N-dialkylaminoethane-2-sulfonic acids for verification of the Chemical Weapons Convention.

    Science.gov (United States)

    Pardasani, Deepak; Gupta, Arvinda K; Palit, Meehir; Shakya, Purushottam; Kanaujia, Pankaj K; Sekhar, K; Dubey, Devendra K

    2005-01-01

    This paper describes the synthesis and gas chromatography/electron ionization mass spectrometric (GC/EI-MS) analysis of methyl esters of N,N-dialkylaminoethane-2-sulfonic acids (DAESAs). These sulfonic acids are important environmental signatures of nerve agent VX and its toxic analogues, hence GC/EI-MS analysis of their methyl esters is of paramount importance for verification of the Chemical Weapons Convention. DAESAs were prepared by condensation of 2-bromoethane sulfonic acid with dialkylamines, and by condensation of dialkylaminoethyl chloride with sodium bisulfite. GC/EI-MS analysis of methyl esters of DAESAs yielded mass spectra; based on these spectra, generalized fragmentation routes are proposed that rationalize most of the characteristic ions. (c) 2005 John Wiley & Sons, Ltd.

  20. NEMVP: North American energy measurement and verification protocol

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    This measurement and verification protocol discusses procedures that, when implemented, allow buyers, sellers, and financiers of energy projects to quantify energy conservation measure performance and savings.

  1. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident in Shift's ability to provide reference results for CASL benchmarking.

  2. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  3. Parking Space Verification

    DEFF Research Database (Denmark)

    Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels

    2018-01-01

    With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, using a vision-based system looking over parking areas. In particular the paper proposes a binary classifier system, based on a Convolutional Neural Network, that is capable of determining whether a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system. The system shows promising performance on the database, with an overall accuracy of 99.71%, and is robust to the variations in parking areas and weather conditions.

  4. Privacy Preserving Iris Based Biometric Identity Verification

    Directory of Open Access Journals (Sweden)

    Przemyslaw Strzelczyk

    2011-08-01

    Full Text Available Iris biometrics is considered one of the most accurate and robust methods of identity verification. Individually unique iris features can be presented in a compact binary form easily compared with a reference template to confirm identity. However, when templates or features are disclosed, iris biometrics is no longer suitable for verification. Therefore, there is a need to perform iris feature matching without revealing the features themselves or the reference template. The paper proposes an extension of the standard iris-based verification protocol that introduces a feature and template locking mechanism, which guarantees that no sensitive information is exposed.

  5. What is the Final Verification of Engineering Requirements?

    Science.gov (United States)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including any changes to the original. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed along the way. The verification methods used are test, inspection, analysis, and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of the development, as opposed to having some other organization continue the process once the design is complete, are discussed.

  6. Program Verification and System Dependability

    Science.gov (United States)

    Jackson, Michael

    Formal verification of program correctness is a long-standing ambition, recently given added prominence by a “Grand Challenge” project. Major emphases have been on the improvement of languages for program specification and program development, and on the construction of verification tools. The emphasis on tools commands general assent, but while some researchers focus on narrow verification aimed only at program correctness, others want to pursue wide verification aimed at the larger goal of system dependability. This paper presents an approach to system dependability based on problem frames and suggests how this approach can be supported by formal software tools. Dependability is to be understood and evaluated in the physical and human problem world of a system. The complexity and non-formal nature of the problem world demand the development and evolution of normal designs and normal design practices for specialised classes of systems and subsystems. The problem frames discipline of systems analysis and development that can support normal design practices is explained and illustrated. The role of formal reasoning in achieving dependability is discussed and some conceptual, linguistic and software tools are suggested.

  7. A Method for Automatic Runtime Verification of Automata-Based Programs

    OpenAIRE

    Oleg, Stepanov; Anatoly, Shalyto

    2008-01-01

    Currently Model Checking is the only practically used method for verification of automata-based programs. However, current implementations of this method only allow verification of simple automata systems. We suggest using a different approach, runtime verification, for verification of automata systems. We discuss advantages and disadvantages of this approach, propose a method for automatic verification of automata-based programs which uses this approach and conduct experimental performance s...
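
    The essence of runtime verification is a monitor that observes the running program's events and checks them against a specification automaton. A minimal sketch of such a monitor follows; it is an illustration of the general technique, not the specific method proposed in the cited work.

```python
class RuntimeMonitor:
    """A minimal runtime-verification monitor: a DFA over the event
    alphabet that flags any transition not allowed by the specification."""

    def __init__(self, transitions, initial):
        self.transitions = transitions   # {(state, event): next_state}
        self.state = initial
        self.violations = []

    def observe(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        else:
            self.violations.append((self.state, event))   # spec violated here

# Specification: a file must be opened before reads, and finally closed.
spec = {("closed", "open"): "open",
        ("open", "read"): "open",
        ("open", "close"): "closed"}
m = RuntimeMonitor(spec, "closed")
for ev in ["open", "read", "close", "read"]:   # the last event violates the spec
    m.observe(ev)
print(m.violations)   # [('closed', 'read')]
```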

  8. Scalable hardware verification with symbolic simulation

    CERN Document Server

    Bertacco, Valeria

    2006-01-01

    An innovative presentation of the theory of disjoint support decomposition, presenting novel results and algorithms, plus original and up-to-date techniques in formal verification. Provides an overview of current verification techniques, and unveils the inner workings of symbolic simulation. Focuses on new techniques that narrow the performance gap between the complexity of digital systems and the limited ability to verify them. Addresses key topics in need of future research.

  9. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  10. Performance Analysis of Photovoltaic Water Heating System

    OpenAIRE

    Tomas Matuska; Borivoj Sourek

    2017-01-01

    Performance of solar photovoltaic water heating systems with direct coupling of PV array to DC resistive heating elements has been studied and compared with solar photothermal systems. An analysis of optimum fixed load resistance for different climate conditions has been performed for simple PV heating systems. The optimum value of the fixed load resistance depends on the climate, especially on annual solar irradiation level. Use of maximum power point tracking compared to fixed optimized loa...

  11. Verification is experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik

    2001-01-01

    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  12. Verification Is Experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik

    2000-01-01

    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  14. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  15. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
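
    The first ingredient Razorback couples, the point reactor kinetics equations, can be sketched compactly. The one-delayed-group system below uses illustrative constants and a small step reactivity insertion; it is the generic textbook form, not Razorback's implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics (illustrative constants)
beta   = 0.0065    # delayed-neutron fraction
lam    = 0.08      # precursor decay constant, 1/s
LAMBDA = 1.0e-4    # neutron generation time, s

def kinetics(t, y, rho):
    n, c = y                                   # neutron density, precursor density
    dn = (rho - beta) / LAMBDA * n + lam * c
    dc = beta / LAMBDA * n - lam * c
    return [dn, dc]

n0 = 1.0
c0 = beta * n0 / (lam * LAMBDA)                # precursor equilibrium at t = 0
sol = solve_ivp(kinetics, (0.0, 10.0), [n0, c0],
                args=(0.1 * beta,), method="LSODA")  # +0.1 beta step insertion
print(f"relative power after 10 s: {sol.y[0, -1]:.2f}")
```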

  16. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the verification and validation of software on board satellites, and it is implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.

  17. PERFORMANCE ANALYSIS OF METHODS FOR ESTIMATING ...

    African Journals Online (AJOL)

    2014-12-31

    The performance analysis revealed that the MLM was the most accurate model followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% ...

  18. Experimental verification and stability state space analysis of CLL-T series parallel resonant converter with fuzzy controller

    Science.gov (United States)

    Nagarajan, Chinnadurai; Madheswaran, Muthusamy

    2012-12-01

    In this paper a closed-loop CLL-T (capacitor-inductor-inductor) series-parallel resonant converter (SPRC) is simulated and its performance analyzed. A three-element CLL-T SPRC working under load-independent operation (voltage-type and current-type load) is presented. The stability and AC analysis of the CLL-T SPRC have been developed using the state-space technique, and the output voltage is regulated by a fuzzy controller. The simulation study indicates the superiority of fuzzy control over conventional control methods, and the proposed approach is expected to provide better voltage regulation under dynamic load conditions. A prototype 300 W, 100 kHz converter was designed and built to experimentally demonstrate the dynamic and steady-state performance of the CLL-T SPRC, which is compared with the simulation studies.
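
    The stability side of such a state-space analysis boils down to an eigenvalue check on the system matrix. A minimal sketch with a hypothetical second-order resonant stage (the component values are illustrative, not those of the converter in the paper):

```python
import numpy as np

def is_stable(A):
    """A continuous-time state-space model dx/dt = A x + B u is
    asymptotically stable iff every eigenvalue of A has a negative
    real part -- the check behind a state-space stability analysis."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Illustrative 2nd-order series-resonant stage: state x = [i_L, v_C]
L, C, R = 100e-6, 470e-9, 10.0     # hypothetical component values
A = np.array([[-R / L, -1.0 / L],
              [1.0 / C,     0.0]])
print(is_stable(A))   # True: damped resonant tank
```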

  19. Performance analysis of opportunistic nonregenerative relaying

    KAUST Repository

    Tourki, Kamel

    2013-01-01

    Opportunistic relaying in cooperative communication depends on careful relay selection. However, the traditional centralized method used for opportunistic amplify-and-forward protocols requires precise measurements of channel state information at the destination. In this paper, we adopt the max-min criterion as a relay selection framework for opportunistic amplify-and-forward cooperative communications, a criterion exhaustively used for the decode-and-forward protocol, and offer an accurate performance analysis based on exact statistics of the local signal-to-noise ratios of the best relay. Furthermore, we evaluate the asymptotic performance and deduce the diversity order of our proposed scheme. Finally, we validate our analysis by showing that performance simulation results coincide with our analytical results over Rayleigh fading channels, and we compare the max-min relay selection with its centralized channel-state-information-based and partial relay selection counterparts.
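
    Max-min relay selection is easy to illustrate by Monte Carlo. The sketch below draws exponential per-hop SNRs (Rayleigh fading), picks the relay maximizing the minimum of its two hops, and estimates an outage probability; the min of the two hop SNRs is used as the standard tight approximation of the amplify-and-forward end-to-end SNR. All settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_relays, snr_avg = 100_000, 4, 10.0   # illustrative settings

# Exponential (Rayleigh-fading) SNRs on source-relay and relay-destination hops
g1 = rng.exponential(snr_avg, (n_trials, n_relays))
g2 = rng.exponential(snr_avg, (n_trials, n_relays))

best = np.argmax(np.minimum(g1, g2), axis=1)      # max-min relay selection
rows = np.arange(n_trials)
# Tight upper-bound approximation of the AF end-to-end SNR: min of the two hops
e2e = np.minimum(g1[rows, best], g2[rows, best])

threshold = 2.0                                   # outage SNR threshold
print(f"outage probability: {np.mean(e2e < threshold):.4f}")
```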

  20. IMPORTANCE-PERFORMANCE ANALYSIS TO ARJOSARI TERMINAL

    Directory of Open Access Journals (Sweden)

    SEDAYU Agung

    2014-12-01

    Full Text Available Public transport is one of the important solutions to transportation problems in Indonesia, where the levels of congestion and traffic accidents are currently very high, driven by the increasing use of cars and motorcycles. This condition is not matched by public transport services, which are decreasing in vehicle quantity and performance quality. It is therefore necessary to improve public transport services so that they are reliable and cheap. This study suggests efforts to improve public transport services based on terminal user perceptions, taking the Arjosari terminal as the study location. The method is Importance-Performance Analysis (IPA). The analysis yielded service attributes including assurance, responsiveness, performance, aesthetics, ease of use, reliability, durability, frequency, comfort, and the availability of facilities. The attributes given priority for improvement include security and safety protection, waiting room aesthetics, shorter waiting times, the provision of an information and complaint center, and the availability of a goods repository.
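
    The IPA method itself reduces to placing each attribute in one of four quadrants around the mean importance and mean performance scores. A minimal sketch with hypothetical survey means (the attribute scores are invented for illustration):

```python
import numpy as np

def ipa_quadrants(attributes, importance, performance):
    """Classic Importance-Performance Analysis: split attributes into four
    quadrants around the grand means of importance and performance."""
    imp, perf = np.asarray(importance, float), np.asarray(performance, float)
    i_cut, p_cut = imp.mean(), perf.mean()
    labels = {(True, False):  "Concentrate here",      # important, underperforming
              (True, True):   "Keep up the good work",
              (False, False): "Low priority",
              (False, True):  "Possible overkill"}
    return {a: labels[(i >= i_cut, p >= p_cut)]
            for a, i, p in zip(attributes, imp, perf)}

attrs = ["security", "waiting room aesthetics", "waiting time", "information center"]
print(ipa_quadrants(attrs, importance=[4.8, 3.1, 4.5, 4.2],
                    performance=[2.9, 3.8, 2.5, 3.9]))
```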

  1. Automated continuous verification for numerical simulation

    Directory of Open Access Journals (Sweden)

    P. E. Farrell

    2011-05-01

    Full Text Available Verification is a process crucially important for the final users of a computational model: code is useless if its results cannot be relied upon. Typically, verification is seen as a discrete event, performed once and for all after development is complete. However, this does not reflect the reality that many geoscientific codes undergo continuous development of the mathematical model, discretisation and software implementation. Therefore, we advocate that in such cases verification must be continuous and happen in parallel with development: the desirability of their automation follows immediately. This paper discusses a framework for automated continuous verification of wide applicability to any kind of numerical simulation. It also documents a range of test cases to show the possibilities of the framework.

  2. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Møller, M.H.

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study... of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  3. ANALYSIS FRAMEWORKS OF THE COLLABORATIVE INNOVATION PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2014-12-01

    Full Text Available Time management is one of the resources by which we can achieve improved innovation performance. This perspective of resource management and process efficiency, reducing the incubation time of ideas, selecting profitable innovations, and turning them into added value, relates to absolute time, a time specific to human existence. In this article I will try to show that high performance in inter-organizational innovation can mainly be achieved by manipulating the context and manipulating knowledge outside the arbitrary concept of "time". This article presents the results of research suggesting a sequential model for analyzing and evaluating performance through a rational and refined process of selecting performance indicators, aiming to provide the shortest and most relevant list of criteria.

  4. Verification of CFD analysis methods for predicting the drag force and thrust power of an underwater disk robot

    Directory of Open Access Journals (Sweden)

    Joung Tae-Hwan

    2014-06-01

    Full Text Available This paper examines the suitability of using the Computational Fluid Dynamics (CFD) tool ANSYS-CFX as an initial analysis tool for predicting the drag and propulsion performance (thrust and torque) of a concept underwater vehicle design. In order to select an appropriate thruster that will achieve the required speed of the Underwater Disk Robot (UDR), the ANSYS-CFX tools were used to predict the drag force of the UDR. Vertical Planar Motion Mechanism (VPMM) test simulations (i.e. pure heaving and pure pitching motion) by CFD motion analysis were carried out with the CFD software. The CFD results reveal the distribution of hydrodynamic values (velocity, pressure, etc.) of the UDR for these motion studies. Finally, CFD bollard pull test simulations were performed and compared with the experimental bollard pull test results conducted in a model basin. The experimental results confirm the suitability of using the ANSYS-CFX tools for predicting the behavior of concept vehicles early on in their design process.

  5. Verification of CFD analysis methods for predicting the drag force and thrust power of an underwater disk robot

    Directory of Open Access Journals (Sweden)

    Tae-Hwan Joung

    2014-06-01

    Full Text Available This paper examines the suitability of using the Computational Fluid Dynamics (CFD) tool ANSYS-CFX as an initial analysis tool for predicting the drag and propulsion performance (thrust and torque) of a concept underwater vehicle design. In order to select an appropriate thruster that will achieve the required speed of the Underwater Disk Robot (UDR), the ANSYS-CFX tools were used to predict the drag force of the UDR. Vertical Planar Motion Mechanism (VPMM) test simulations (i.e. pure heaving and pure pitching motion) by CFD motion analysis were carried out with the CFD software. The CFD results reveal the distribution of hydrodynamic values (velocity, pressure, etc.) of the UDR for these motion studies. Finally, CFD bollard pull test simulations were performed and compared with the experimental bollard pull test results conducted in a model basin. The experimental results confirm the suitability of using the ANSYS-CFX tools for predicting the behavior of concept vehicles early on in their design process.
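
    The thruster-selection step these records describe is, at its core, a drag-to-thrust sizing calculation from the CFD-predicted drag coefficient. A back-of-envelope sketch with invented numbers (density, speed, area, drag coefficient, and margin are all illustrative assumptions):

```python
# Required thrust from a CFD-predicted drag coefficient: a back-of-envelope
# check of the kind the drag study supports (all numbers are illustrative).
rho = 1025.0   # kg/m^3, seawater density
V   = 1.5      # m/s, required vehicle speed
A   = 0.30     # m^2, frontal (reference) area of the disk
Cd  = 0.9      # drag coefficient from the CFD analysis (hypothetical value)

drag   = 0.5 * rho * Cd * A * V**2   # steady drag at design speed, N
margin = 1.3                         # design margin for appendages, tether, etc.
print(f"drag = {drag:.1f} N, thruster sizing target = {margin * drag:.1f} N")
```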

  6. Fifty years of progress in speaker verification

    Science.gov (United States)

    Rosenberg, Aaron E.

    2004-10-01

    The modern era in speaker recognition started about 50 years ago at Bell Laboratories with the controversial invention of the voiceprint technique for speaker identification based on expert analysis of speech spectrograms. Early speaker recognition research concentrated on finding acoustic-phonetic features effective in discriminating speakers. The first truly automatic text-dependent speaker verification systems were based on time contours or templates of speaker-specific acoustic features. An important element of these systems was the ability to time-warp sample templates against model templates in order to provide useful comparisons. Most modern text-dependent speaker verification systems are based on statistical representations of acoustic features analyzed as a function of time over specified utterances, most particularly the hidden Markov model (HMM) representation. Modern text-independent systems are based on vector quantization representations and, more recently, on Gaussian mixture model (GMM) representations. An important ingredient of statistically based systems is likelihood-ratio decision techniques making use of speaker background models. Some recent research has shown how to extract higher-level features based on speaking behavior and combine them with lower-level acoustic features for improved performance. The talk will present these topics in historical order, showing the evolution of techniques.
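
    The GMM-with-background-model approach reduces to a log-likelihood-ratio test between a speaker model and a background model. A minimal sketch using scikit-learn with synthetic stand-ins for acoustic features; note that real systems typically adapt the speaker model from a universal background model rather than training it independently, as done here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-ins for acoustic feature vectors (e.g. MFCC frames)
ubm_data     = rng.normal(0.0, 1.5, (2000, 12))   # background population
speaker_data = rng.normal(0.8, 1.0, (400, 12))    # enrolment data for the claimant
test_data    = rng.normal(0.8, 1.0, (200, 12))    # verification utterance

ubm = GaussianMixture(n_components=8, random_state=0).fit(ubm_data)
spk = GaussianMixture(n_components=8, random_state=0).fit(speaker_data)

# Average per-frame log-likelihood ratio: accept if above a tuned threshold
llr = spk.score(test_data) - ubm.score(test_data)
print(f"log-likelihood ratio: {llr:.2f} -> {'accept' if llr > 0.0 else 'reject'}")
```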

  7. Building America House Performance Analysis Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Farrar-Nagy, S.; Anderson, R.; Judkoff, R.

    2001-10-29

    As the Building America Program has grown to include a large and diverse cross section of the home building industry, accurate and consistent analysis techniques have become more important to help all program partners as they perform design tradeoffs and calculate energy savings for prototype houses built as part of the program. This document illustrates some of the analysis concepts proven effective and reliable for analyzing the transient energy usage of advanced energy systems as well as entire houses. The analysis procedure described here provides a starting point for calculating energy savings of a prototype house relative to two base cases: builder standard practice and regional standard practice. It also provides building simulation analysis to calculate annual energy savings based on side-by-side, short-term field testing of a prototype house.

  8. Secure Hardware Performance Analysis in Virtualized Cloud Environment

    Directory of Open Access Journals (Sweden)

    Chee-Heng Tan

    2013-01-01

    Full Text Available The main obstacle to mass adoption of cloud computing for database operations is the data security issue. In this paper, it is shown that IT services, particularly hardware performance evaluation in a virtual machine, can be accomplished effectively without IT personnel gaining access to real data for diagnostic and remediation purposes. The proposed mechanisms utilize the TPC-H benchmark to achieve two objectives. First, the underlying hardware performance and consistency are supervised via a control system, which is constructed using a combination of TPC-H queries, linear regression, and machine learning techniques. Second, linear programming techniques are employed to provide input to the algorithms that construct stress-testing scenarios in the virtual machine, using the combination of TPC-H queries. These stress-testing scenarios serve two purposes. They provide boundary resource threshold verification to the first control system, so that periodic training of the synthetic data sets for performance evaluation is not constrained by hardware inadequacy, particularly when the resources in the virtual machine are scaled up or down, which changes the utilization threshold. Secondly, they provide a platform for response time verification on critical transactions, so that the expected Quality of Service (QoS) from these transactions is assured.
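
    The regression-based consistency control can be pictured simply: fit the expected response-time trend and flag runs that deviate from it. A sketch under invented data and an arbitrary tolerance, not the paper's actual control system:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: TPC-H scale factor vs. query response time (s)
scale = np.array([[1], [2], [4], [8], [16]], dtype=float)
resp  = np.array([1.1, 2.3, 4.2, 8.9, 17.5])

model = LinearRegression().fit(scale, resp)

def hardware_consistent(scale_factor, measured, tolerance=0.25):
    """Flag a run whose response time deviates from the fitted trend by more
    than `tolerance` (fractional) -- a simple stand-in for a regression-based
    hardware-consistency check."""
    expected = model.predict([[scale_factor]])[0]
    return abs(measured - expected) / expected <= tolerance

print(hardware_consistent(8, 9.4))    # True: within the fitted trend
print(hardware_consistent(8, 14.0))   # False: degraded hardware or contention
```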

  9. Structural Verification of the First Orbital Wonder of the World - The Structural Testing and Analysis of the International Space Station (ISS)

    Science.gov (United States)

    Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond

    2012-01-01

    The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas towers and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification and assembly effort is a truly global success story. With its on-orbit life projected to be from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been totally summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be summarily comprehensive, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA) and the Japanese Aerospace Exploration Agency (JAXA). Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or

  10. Guidelines for Formal Verification Systems

    Science.gov (United States)

    1989-04-01

    This document explains the requirements for formal verification systems that are candidates for the NCSC's Endorsed Tools List (ETL). It is primarily intended for developers of verification systems to use in the development of production-quality formal verification systems. It explains the requirements and the process used to evaluate formal verification systems submitted to the NCSC for endorsement.

  11. NON-PERFORMING LOANS: ANALYSIS AND REGULATION

    OpenAIRE

    Lobozynska, Sophia

    2014-01-01

    The analysis of the level of non-performing loans and the volumes of insurance reserves formed for them, with a time lag of 6 years (2004–2009), in developing countries (China, Poland, Russia, Ukraine) was made. The estimation of the efficiency of the regulatory processes to combat non-performing loans in the banking sector that were applied by these countries was provided. The correlation-regression model of establishing cause-effect relationship between effective index – the share of non-perfor...

  12. Performance Analysis Using Coloured Petri Nets

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2002-01-01

    This paper provides an overview of improved facilities for performance analysis using coloured Petri nets. Coloured Petri nets are a formal method that is well suited for modeling and analyzing large and complex systems. The paper describes steps that have been taken to make a distinction between modeling the behavior of a system and observing the behavior of a model. Performance-related facilities are discussed, including facilities for collecting data, running multiple simulations, generating statistically reliable simulation output, and comparing alternative system configurations.

  13. Retinal Verification Using a Feature Points-Based Biometric Pattern

    Directory of Open Access Journals (Sweden)

    M. Ortega

    2009-01-01

    Biometrics refers to identity verification of individuals based on some physiological or behavioural characteristics. The typical authentication process consists of extracting a biometric pattern from a person and matching it against the stored pattern for the authorised user, obtaining a similarity value between the patterns. In this work an efficient method for person authentication is presented. The biometric pattern of the system is a set of feature points representing landmarks in the retinal vessel tree. The pattern extraction and matching are described. Also, an in-depth analysis of the performance of similarity metrics is presented for the biometric system. A database with samples of retina images from users taken at different moments in time is used, thus simulating a hard and realistic verification environment. Even in this scenario, the system makes it possible to establish a wide confidence band for the metric threshold where no errors are obtained for the training and test sets.
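
    A feature-point similarity metric of the kind analyzed in the paper can be sketched very compactly. The snippet below is an illustrative greedy matcher, assuming the two landmark sets have already been registered (aligned); the coordinates, noise level and distance tolerance are invented, not the authors' exact metric.

```python
# Illustrative feature-point similarity for two retinal landmark sets.
import numpy as np

def similarity(pattern_a, pattern_b, tol=5.0):
    """Fraction of greedily matched points within tol pixels."""
    b_free = list(range(len(pattern_b)))
    matched = 0
    for p in pattern_a:
        if not b_free:
            break
        d = np.linalg.norm(pattern_b[b_free] - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= tol:
            matched += 1
            b_free.pop(j)       # each stored point can match only once
    return matched / max(len(pattern_a), len(pattern_b))

stored = np.array([[10, 12], [40, 80], [95, 33], [60, 61]], dtype=float)
probe  = stored + np.random.default_rng(1).normal(0, 1.5, stored.shape)
print('similarity = %.2f' % similarity(stored, probe))  # near 1.0 for same eye
```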

  14. Performance Analysis Based on Timing Simulation

    DEFF Research Database (Denmark)

    Nielsen, Christian Dalsgaard; Kishinevsky, Michael

    1994-01-01

    Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has clear semantics and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits.
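
    The quantity computed here — the cycle time, i.e. the worst ratio of delay to tokens over all cycles — can be illustrated with Karp's classical maximum-mean-cycle algorithm in the special case where each cycle-closing arc carries one token. This is a generic sketch of that reduction, not the authors' simulation-based O(b²m) algorithm, and the example graph is invented.

```python
# Karp's algorithm for the maximum mean cycle of a weighted digraph.
NEG = float('-inf')

def max_mean_cycle(n, edges):
    """edges: list of (u, v, w). Returns max over cycles of mean edge weight."""
    D = [[NEG] * n for _ in range(n + 1)]
    for v in range(n):
        D[0][v] = 0.0                        # virtual super-source
    for k in range(1, n + 1):
        for u, v, w in edges:
            if D[k - 1][u] > NEG:
                D[k][v] = max(D[k][v], D[k - 1][u] + w)
    best = NEG
    for v in range(n):
        if D[n][v] == NEG:
            continue
        worst = min((D[n][v] - D[k][v]) / (n - k)
                    for k in range(n) if D[k][v] > NEG)
        best = max(best, worst)
    return best

# Two coupled cycles: 0->1->0 (mean 3.0) and 1->2->1 (mean 2.5).
print(max_mean_cycle(3, [(0, 1, 4), (1, 0, 2), (1, 2, 3), (2, 1, 2)]))  # 3.0
```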

  15. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  16. Z-2 Architecture Description and Requirements Verification Results

    Science.gov (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard

    2016-01-01

    , partial pressure relief valve, purge valve, donning stand and ISS Body Restraint Tether (BRT). Examples of manned requirements include verification of anthropometric range, suit self-don/doff, secondary suit exit method, donning stand self-ingress/egress and manned mobility covering eight functional tasks. The eight functional tasks include kneeling with object pick-up, standing toe touch, cross-body reach, walking, reach to the SIP and helmet visor. This paper will provide an overview of the Z-2 design. Z-2 requirements verification testing was performed with NASA at the ILC Houston test facility. This paper will also discuss pre-delivery manned and unmanned test results as well as analysis performed in support of requirements verification.

  17. Principal Component Analysis of Students Academic Performance

    Directory of Open Access Journals (Sweden)

    Frank B. K. Twenefour

    2015-02-01

    The purpose of this study was to identify a metric for measuring students' performance in mathematics and statistics in the Department of Mathematics and Statistics of a public university in Ghana. Some of the students of the department are of the view that the current grading system used by the Department does not do a good job of distinguishing between the performances of students, as at times students of different academic performance could end up with the same Grade Point Average (GPA), a performance measure. Data for the research, which covers the 2012/2013 third-year students of the Department, were obtained from the university's student records unit. Principal Component Analysis (PCA) was used to analyze the data. Three principal components were retained as rules or indices for the classification of students' performance. A derivative of the first principal component, RSI, could serve as a new performance measure for the Department as it takes into consideration differences in the raw scores of the students.
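
    The kind of PCA-based index described here is straightforward to reproduce. The sketch below uses a fabricated course-score matrix and standard scikit-learn tooling; the first-component index at the end is only in the spirit of the paper's RSI, not its exact definition.

```python
# Minimal sketch of a PCA-based student performance index.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: students; columns: raw scores in hypothetical courses.
scores = np.array([[78, 82, 91, 70],
                   [65, 70, 68, 72],
                   [88, 90, 93, 85],
                   [55, 60, 58, 62],
                   [72, 75, 80, 70]], dtype=float)

X = StandardScaler().fit_transform(scores)
pca = PCA(n_components=3).fit(X)
print('explained variance ratios:', pca.explained_variance_ratio_.round(2))

# A first-principal-component index: a weighted sum of standardized scores
# that preserves differences in the students' raw scores.
index = X @ pca.components_[0]
print('PC1 index per student:', index.round(2))
```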

  18. Testing Equation Method Modification for Demanding Energy Measurements Verification

    Directory of Open Access Journals (Sweden)

    Elena Kochneva

    2016-01-01

    The paper is devoted to mathematical approaches to the verification of measurements received from Automatic Meter Reading systems. Reliability of metering data can be improved by application of the new formulation named the Energy Flow Problem. The paper considers a demanding energy measurements verification method based on the analysis of groups of verification expressions. Bad-data detection and calculation of estimate accuracy are presented using Automatic Meter Reading system data from a fragment of the Russian power system.

  19. Experimental verification of the asymptotic modal analysis method as applied to a rectangular acoustic cavity excited by structural vibration

    Science.gov (United States)

    Peretti, L. F.; Dowell, E. H.

    1992-01-01

    An experiment was performed on a rigid-wall rectangular acoustic cavity driven by a flexible plate mounted in a quarter of one end wall and excited by white noise. The experiment was designed so that the assumptions of Asymptotic Modal Analysis (AMA) were satisfied for certain bandwidths and center frequencies. Measurements of sound pressure levels at points along the boundaries and incrementally into the interior were taken. These were compared with the theoretical results predicted with AMA and found to be in good agreement, particularly for moderate (1/3 octave) bandwidths and sufficiently high center frequencies. Sound pressure level measurements were also taken well into the cavity interior at various points along the five totally rigid walls. The AMA theory, including boundary intensification effects, was shown to be accurate provided the assumption of a large number of acoustic modes is satisfied and variables such as the power spectrum of the wall acceleration, frequency, and damping are slowly varying within the bandwidth.

  20. Performance analysis of hybrid district heating system

    DEFF Research Database (Denmark)

    Mikulandric, Robert; Krajačić, Goran; Khavin, Gennadii

    2013-01-01

    … could reach up to 20% with utilisation of solar energy as a supplementary energy source in traditional fossil-fuel-based district heating systems. In this work, the performance of a hybrid district energy system for a particular location will be analysed. For performance analysis, a mathematical model … more extensively used in district heating systems, either separately or as a supplement to traditional fossil fuels, in order to achieve national energy policy objectives. However, they are still facing problems such as high intermittence, high energy production costs and low load factors, as well … sources that can complement each other on a daily and yearly basis and reduce negative aspects of particular energy source utilisation. In district heating systems, hybridisation could be performed through utilisation of renewable and non-renewable energy sources. Potential of fuel and emission reduction …

  1. [Analysis of MGMT methylation with the therascreen(®) MGMT Pyro(®) Kit (Qiagen). A method verification].

    Science.gov (United States)

    Luquain, Alexandra; Magnin, Sandrine; Guenat, David; Prétet, Jean-Luc; Viennet, Gabriel; Valmary-Degano, Séverine; Mougin, Christiane

    2015-01-01

    Promoter methylation of the MGMT gene, encoding the enzyme O6-methylguanine-DNA methyltransferase, is a theranostic marker of good prognosis in glioblastomas treated with alkylating chemotherapy (temozolomide, Temodal®). Among the methylation analysis techniques, pyrosequencing is a reproducible and sensitive quantitative method. As part of the accreditation of the hospital platform of molecular genetics of cancer, Besançon, our objective was to verify the performance of the commercial pyrosequencing kit therascreen® MGMT Pyro® (Qiagen) in terms of repeatability, reproducibility, limit of blank (LOB), limit of detection (LOD), linearity and contamination, following the guide SH GTA 04 issued by Cofrac. The repeatability tests show an average methylation of 3.22% [standard deviation (SD) = 0.41, coefficient of variation (CV) = 12.75%] for the unmethylated control and 70.16% (SD = 2.20, CV = 3.14%) for the methylated control. Reproducibility demonstrates an average methylation of 1.39% (SD = 0.25, CV = 18.25%) for the unmethylated control and of 94.03% (SD = 2.56, CV = 2.73%) for the methylated control. The LOB and LOD are 3.43% and 6.22% methylation, respectively. The regression coefficient of 0.983 confirms the linearity of the assay from 0% to 100% methylation. No contamination was observed. Over 40% of the glioblastomas studied in 2013 in our laboratory showed a methylated MGMT gene. Our results confirm that the therascreen® MGMT Pyro® kit (Qiagen) performs in compliance with the quality requirements of NF EN ISO 15189 for the routine analysis of the methylation status of MGMT in glioblastomas.
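
    The verification statistics reported above are easy to compute from replicate runs. The sketch below uses hypothetical replicate measurements; the LOB/LOD formulas follow one common convention (CLSI EP17: LOB = mean_blank + 1.645·SD_blank, LOD = LOB + 1.645·SD_low), which may differ from the rule actually used in the study.

```python
# Repeatability/limit statistics from replicate % methylation measurements.
import numpy as np

def cv_percent(x):
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()    # CV = SD / mean

unmethylated = [3.0, 3.5, 2.8, 3.6, 3.2]       # repeatability runs (invented)
methylated   = [68.5, 72.1, 70.0, 71.3, 68.9]
print('CV unmethylated: %.1f%%' % cv_percent(unmethylated))
print('CV methylated:   %.1f%%' % cv_percent(methylated))

blanks = np.array([1.2, 1.5, 0.9, 1.4, 1.1])   # blank replicates (invented)
low    = np.array([5.0, 6.1, 5.5, 6.4, 5.8])   # low-level replicates (invented)
lob = blanks.mean() + 1.645 * blanks.std(ddof=1)
lod = lob + 1.645 * low.std(ddof=1)
print('LOB = %.2f%%, LOD = %.2f%%' % (lob, lod))
```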

  2. Integrated Advance Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: EOS AMSU-A1 and AMSU-A2 Receivers Assemblies

    Science.gov (United States)

    2000-01-01

    This test report presents the test data for the EOS AMSU-A Flight Model No. 1 (FM-1) receiver subsystem. The tests are performed per the Acceptance Test Procedure for the AMSU-A Receiver Subsystem, AE-26002/6A. The functional performance tests are conducted either at the component or subsystem level. While the component-level tests are performed over the entire operating temperature range predicted by thermal analysis, the subsystem-level tests are conducted at ambient temperature only.

  3. Voltage verification unit

    Science.gov (United States)

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  4. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  5. Environmental Technology Verification Report - Electric Power and Heat Production Using Renewable Biogas at Patterson Farms

    Science.gov (United States)

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT--BAGHOUSE FILTRATION PRODUCTS, DONALDSON COMPANY, INC., 6282 FILTRATION MEDIA

    Science.gov (United States)

    The Environmental Technology Verification (ETV) Program, established by the U.S. EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance. The Air Pollution Control Technology...

  7. Environmental Technology Verification: Baghouse Filtration Products--Donaldson Co., Inc., Tetratec #6255-3 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS--DONALDSON COMPANY, INC., TETRATEC #6255 FILTRATION MEDIA

    Science.gov (United States)

    The Environmental Technology Verification (ETV) Program, established by the U.S. EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance. The Air Pollution Control Technolog...

  9. Environmental Technology Verification: Baghouse Filtration Products--TDC Filter Manufacturing, Inc., SB025 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  10. Environmental Technology Verification: Baghouse Filtration Products--Sinoma Science & Technology Co. Ltd FT-806 Filtration Media

    Science.gov (United States)

    EPA created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. It seeks to achieve this goal by providing high-quality, peer r...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, GROUNDWATER SAMPLING TECHNOLOGIES, GEOLOG, INC., MICRO-FLO BLADDER PUMP MODEL 57400

    Science.gov (United States)

    The U.S. Environmental Protection Agency has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ETV Program...

  12. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  13. Performance Analysis of the Romanian Administration

    Directory of Open Access Journals (Sweden)

    Marius Constantin PROFIROIU

    2013-10-01

    The performance of public administration is one of the top priorities of national governments worldwide, not only for Romania. The role of a high-performing management system at the level of public administration is to ensure high quality and efficiency of the adopted policies and strategies, of the provided public services and of the administrative act itself, and to guarantee the advantage of a competitive and efficient administration both in relation to its own citizens and in competition with other cities and countries throughout Europe and around the world. Following these considerations, and based upon an empirical research survey regarding 'The analysis of the performance level of the Romanian public administration', the article aims to (1) identify modern management tools that determine and influence the performance of Romanian public institutions, (2) analyze the effects of using project management as an organizational capacity development instrument by public administration in Romania, and (3) determine the influence and effects of external factors on the performance and development of Romanian public administration.

  14. [Analysis of gene effects on performance characteristics].

    Science.gov (United States)

    Geldermann, H

    1996-10-01

    In farm animals, associations between individually identified genotypes and the values of performance traits have been investigated for more than 30 years. The topic of research was largely shaped by the Veterinary Institute of the University of Göttingen. For the experimental analysis of gene loci at which allelic variants are connected with alterations of trait values, new techniques of DNA diagnostics were of crucial significance. Two approaches to the analysis of quantitative trait loci (QTL) can be distinguished. By considering informative groups of animals, the first approach uses marker loci to trace the inheritance of their alleles and thus, simultaneously, the transfer of specific chromosome sections to individuals of the offspring generation. In this manner, the associations between the marked chromosome regions and trait values are calculated. Results are shown for examples from experiments on milk performance in cattle and on fattening and carcass traits in pigs. In the second approach, genetic effects on trait values are assigned to distinct genes or gene clusters. For this purpose, variants of the gene structure are identified and then analysed for their effects on the formation of specific trait values. As an example of such a functional analysis of single gene positions, the milk-protein-coding genes in cattle are given. From the data we see that DNA techniques allow direct access to genotypic information and thus far-reaching potential for tracing effects on trait values back to single nucleotide differences. However, such a functional analysis needs specific test systems that are able to consider the complex network between single gene effects and the multifactorially caused values of performance traits. This will be possible with procedures that identify gene effects in vivo and the balancing forces of haplotype combinations in populations. Genetic parameters of such investigations are needed for farm animal populations.

  15. Performance management in healthcare: a critical analysis.

    Science.gov (United States)

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals.

  16. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.

  17. Semi-automated repair verification of aerial images

    Science.gov (United States)

    Poortinga, Eric; Schereubl, Thomas; Richter, Rigo

    2009-04-01

    Using aerial image metrology to qualify repairs of defects on photomasks is an industry standard. Aerial image metrology provides reasonable matching of lithographic imaging performance without the need for wafer prints. Utilization of this capability by photomask manufacturers has risen due to the increased complexity of layouts incorporating RET and phase-shift technologies. Tighter specifications by end users have pushed aerial image metrology activities to now include CD performance results in addition to the traditional intensity performance results. Discussed here is the computer-implemented, semi-automated analysis of aerial images for repair verification activities. Newly designed user interfaces and algorithms could guide users through predefined analysis routines so as to minimize errors. There are two main routines discussed here, one allowing multiple reference sites along with a test/defect site on a single image of repeating features. The second routine compares a test/defect measurement image with a reference measurement image. This paper highlights new functionality desirable for aerial image analysis as well as describing possible ways of realizing it. Using structured analysis processes and innovative analysis tools could lead to highly efficient and more reliable result reporting of repair verification metrology.

  18. Financial Performance Analysis Of Financial Service Cooperative

    Directory of Open Access Journals (Sweden)

    Eyo Asro Sasmita

    2015-08-01

    This research is aimed to test and identify empirical evidence regarding the effect of capital structure and loans on the financial performance of cooperatives, where the relationship between loans and financial performance is moderated by non-performing loans. The population of this research is 257 Financial Service Cooperatives (hereinafter referred to as KJK, the abbreviation for Koperasi Jasa Keuangan) of the Urban Village Community Economic Empowerment program (hereinafter referred to as PEMK, the abbreviation for Pemberdayaan Ekonomi Masyarakat Kelurahan) in Jakarta, 2011 to 2013. The sample is determined by using the purposive sampling method. The data are secondary data obtained from the Revolving Fund Management Unit (hereinafter referred to as UPDB, the abbreviation for Unit Pengelola Dana Bergulir) Jakarta. Hypotheses are tested by using multiple linear regression analysis with SPSS 20.00. The number of samples used in this research is 120. Research findings explain that (1) capital structure (hereinafter referred to as SM, the abbreviation for Struktur Modal) has a positive and significant impact on financial performance (hereinafter referred to as KIN, the abbreviation for Kinerja Keuangan), because the probability value of 0.000 is smaller than α = 0.05. Calculation shows that if the capital structure rises by 1, assuming that the loan and non-performing loan variables remain the same, then the financial performance will increase by 0.017. (2) Loans given (hereinafter referred to as PIN, the abbreviation for Pinjaman) have a positive and significant impact on KIN, because the probability value of 0.001 is smaller than α = 0.05. If loans rise by 1, assuming that the capital structure and non-performing loan variables remain the same, then KIN will increase by 0.013. (3) Non-performing loans have a negative and significant effect on KIN, because the probability value of 0.000 is smaller than α = 0.05. If the PBR variable increases by 1, assuming that the loan and capital structure variables

  19. Performance Analysis of Photovoltaic Water Heating System

    Directory of Open Access Journals (Sweden)

    Tomas Matuska

    2017-01-01

    The performance of solar photovoltaic water heating systems with direct coupling of the PV array to DC resistive heating elements has been studied and compared with solar photothermal systems. An analysis of the optimum fixed load resistance for different climate conditions has been performed for simple PV heating systems. The optimum value of the fixed load resistance depends on the climate, especially on the annual solar irradiation level. Use of maximum power point tracking, compared to a fixed optimized load resistance, increases the annual yield by 20 to 35%. While the total annual efficiency of PV water heating systems in Europe ranges from 10% for PV systems without MPP tracking up to 15% for systems with advanced MPP trackers, the efficiency of a solar photothermal system for identical hot water load and climate conditions is more than 3 times higher.

  20. Performance measurement with fuzzy data envelopment analysis

    CERN Document Server

    Tavana, Madjid

    2014-01-01

    The intensity of global competition and ever-increasing economic uncertainties have led organizations to search for more efficient and effective ways to manage their business operations. Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of the DEA and fuzzy logic concepts.

  1. Diversity Performance Analysis on Multiple HAP Networks

    Directory of Open Access Journals (Sweden)

    Feihong Dong

    2015-06-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide diversity and multiplexing gain, which can improve network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). The diversity performance is investigated in a shadowed Rician fading channel. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI separately. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAP network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of V-MIMO techniques.
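
    An ASER of the kind derived in the paper can be cross-checked by Monte Carlo simulation. The sketch below estimates the BPSK error rate over a plain Rician fading channel with coherent detection; the K-factor and SNR grid are illustrative, and the shadowing component of the paper's channel model is omitted for brevity.

```python
# Monte Carlo estimate of BPSK symbol error rate over Rician fading.
import numpy as np

rng = np.random.default_rng(0)

def bpsk_aser_rician(snr_db, k_factor, n=200_000):
    # Rician channel gain: deterministic LOS part plus Rayleigh scatter.
    los = np.sqrt(k_factor / (k_factor + 1))
    nlos = np.sqrt(1 / (2 * (k_factor + 1)))
    h = los + nlos * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n)
    s = 2.0 * bits - 1.0                                  # BPSK symbols +/-1
    noise = rng.standard_normal(n) / np.sqrt(2 * snr)     # real part suffices
    r = np.abs(h) * s + noise                             # coherent detection
    return np.mean((r > 0).astype(int) != bits)

for snr_db in (0, 5, 10, 15):
    print(snr_db, 'dB ->', bpsk_aser_rician(snr_db, k_factor=3.0))
```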

  2. Performance Analysis of 3G Communication Network

    Directory of Open Access Journals (Sweden)

    Toni Anwar

    2013-09-01

    In this project, research on third-generation (3G) technologies was carried out to design and optimize conditions for a 3G network. The 3G wireless mobile communication networks are growing at an ever faster rate, and this is likely to continue in the foreseeable future. Services such as e-mail and web browsing allow the transition of the network from circuit-switched to packet-switched operation, resulting in increased overall network performance. Higher reliability, better coverage and services, higher capacity, mobility management, and wireless multimedia are all parts of network performance. Throughput and spectral efficiency are fundamental parameters in capacity planning for 3G cellular network deployments. This project also investigates the downlink (DL) and uplink (UL) throughput and spectral efficiency performance of the standard Universal Mobile Telecommunications System (UMTS) for different user scenarios and technologies. A power consumption comparison for different mobile technologies is also discussed. The analysis can significantly help system engineers obtain crucial performance characteristics of a 3G network. At the end of the paper, the 3G coverage area of one of the mobile networks in Malaysia is presented.

  3. Analysis, testing and verification of the behavior of composite pavements under Florida conditions using a heavy vehicle simulator

    Science.gov (United States)

    Tapia Gutierrez, Patricio Enrique

    Whitetopping (WT) is a rehabilitation method to resurface deteriorated asphalt pavements. While some of these composite pavements have performed very well carrying heavy loads, others have shown poor performance with early cracking. With the objective of analyzing the applicability of WT pavements under Florida conditions, a total of nine full-scale WT test sections were constructed and tested using a Heavy Vehicle Simulator (HVS) in the APT facility at the FDOT Material Research Park. The test sections were instrumented to monitor both strain and temperature. A 3-D finite element model was developed to analyze the WT test sections. The model was calibrated and verified using measured FWD deflections and HVS load-induced strains from the test sections. The model was then used to evaluate the potential performance of these test sections under critical temperature-load conditions in Florida. Six of the WT pavement test sections had a bonded concrete-asphalt interface, achieved by milling, cleaning and spraying the asphalt surface with water. This method produced excellent bonding at the interface, with shear strengths of 195 to 220 psi. Three of the test sections were intended to have an unbonded concrete-asphalt interface, achieved by applying a debonding agent to the asphalt surface. However, shear strengths between 119 and 135 psi and a careful analysis of the strain and temperature data indicated a partial bond condition. The computer model was able to satisfactorily model the behavior of the composite pavement mainly by considering material properties from standard laboratory tests and calibrating the spring elements used to model the interface. Reasonable matches between the measured and calculated strains were achieved when a temperature-dependent AC elastic modulus was included in the analytical model. The expected numbers of repetitions of the 24-kip single axle loads at the critical thermal condition were computed for the nine test sections based on maximum tensile stresses

  4. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  5. Idaho National Laboratory Quarterly Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th quarter of FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not-reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  6. Building America Performance Analysis Procedures: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Anderson, R.; Judkoff, R.; Christensen, C.; Eastment, M.; Norton, P.; Reeves, P.; Hancock, E.

    2004-06-01

    To measure progress toward multi-year Building America research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques that use test data to "calibrate" energy simulation models. This report summarizes the guidelines for reporting such analytical results using the Building America Research Benchmark (Version 3.1) in studies that also include consideration of current Regional and Builder Standard Practice. Version 3.1 of the Benchmark is generally consistent with the 1999 Home Energy Rating System (HERS) Reference Home, with additions that allow evaluation of all home energy uses.

  7. Performing data analysis using IBM SPSS

    CERN Document Server

    Meyers, Lawrence S; Guarino, A J

    2013-01-01

    This book is designed to be a user's guide for students and other interested readers to perform statistical data analysis with IBM SPSS, which is a major statistical software package used extensively in academic, government, and business settings. This book addresses the needs, level of sophistication, and interest in introductory statistical methodology on the part of undergraduate and graduate students in social and behavioral science, business, health-related, and education programs.  Each chapter covers a particular statistical procedure and has the following format: an example pr

  8. PERFORMANCE TESTING AND ANALYSIS OF CUPOLA FURNACE

    OpenAIRE

    PROF.HEMANT R. BHAGAT-PATIL; MEGHA S. LONDHEKAR

    2013-01-01

    In today's industrial scenario, huge losses and wastage occur on the manufacturing shop floor and in foundry industries. The efficiency of any foundry largely depends on the efficiency of the melting process, a multi-step operation where the metal is heated, treated, alloyed, and transported into die or mold cavities to form a casting. In this paper we present the performance testing and analysis of a cupola furnace and address the problems that occur, in order to give the best results. Our main focus in this work...

  9. Performance Analysis of Microfinance Institutions of India

    Directory of Open Access Journals (Sweden)

    Muhammad Azhar Ikram Ahmad

    2014-12-01

    This is a study of the microfinance institutions (MFIs) of India. It analyzes the performance of microfinance institutions in both financial and non-financial terms. Performance is measured using four parameters: sustainability/profitability, outreach, operational efficiency and financial efficiency. Data on 99 microfinance institutions of India were taken from the Microfinance Information Exchange for a period of 11 years. The variables of this study are expressed both in absolute and relative terms. The endogenous variables are Return on Assets and Return on Equity for sustainability, Number of Borrowers per Staff Member for operational efficiency, Cost per Borrower for financial efficiency, and Number of Active Borrowers for outreach. Panel data analysis is performed after checking the assumptions of the model. The Hausman test is applied to determine the suitability of the fixed- or random-effects model; both were found suitable for application. In addition, a descriptive analysis of the variables is presented. The results show that most of the variables used in the study are significant in the outreach model; other than rank, the financial revenue to assets ratio, portfolio at risk, deposits, and the capital to assets ratio, all other variables are significant in the case of sustainability using the ROA model, and the same variables are found insignificant in the ROE model except the financial expense to assets ratio; in the financial efficiency model both significant and insignificant variables are found; and in the case of operational efficiency all variables are found significant.

  10. Verification of Anderson Superexchange in MnO via Magnetic Pair Distribution Function Analysis and ab initio Theory.

    Science.gov (United States)

    Frandsen, Benjamin A; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J; Staunton, Julie B; Billinge, Simon J L

    2016-05-13

    We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ∼1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.

  11. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    Science.gov (United States)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air flows and which involves turns, fans, contractions, etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient, are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
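
    The loss-summation idea described here — each component's loss coefficient is referenced to its local dynamic pressure and then renormalized to the test section — can be shown in a toy example. The component names, dynamic-pressure ratios and coefficients below are invented for illustration and are not the program's actual values or equations.

```python
# Toy total-pressure-loss budget for a subsonic wind tunnel circuit.
components = [
    # (name, local-to-test-section dynamic pressure ratio q_i/q_ts, K_i)
    ('test section',   1.00, 0.010),
    ('first diffuser', 0.60, 0.030),
    ('corner 1',       0.25, 0.150),
    ('contraction',    0.04, 0.008),
]

# Each section's loss, non-dimensionalized by test-section dynamic pressure.
total = sum(q_ratio * k for _, q_ratio, k in components)
print('total pressure loss / q_ts = %.4f' % total)
for name, q_ratio, k in components:
    print('%-15s contributes %.4f' % (name, q_ratio * k))
```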

  12. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  13. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of the contributions of different exposure pathways to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  14. [Importance-Performance Analysis for services management].

    Science.gov (United States)

    Abalo Piñeiro, Javier; Varela Mallou, Jesús; Rial Boubeta, Antonio

    2006-11-01

    Importance-Performance Analysis (IPA) constitutes an indirect approach to measuring user satisfaction that makes it possible to represent, in an easy and functional way, the strong points and improvement areas of a specific product or service. Starting from the importance and performance judgements that users assign to each salient attribute of a service, it is possible to obtain a graph divided into four quadrants in which recommendations for the management of the organization's economic resources are included. Nevertheless, this tool has raised controversies since its origins, referring fundamentally to the placement of the axes that define the quadrants and to the conception and measurement of the importance of the attributes that compose the service. The primary goal of this article is to propose an alternative to the IPA representation that makes it possible to overcome the limitations and contradictions derived from the original technique, without rejecting the classical graph. The analysis is applied to data obtained in a survey about satisfaction with the primary health care services of Galicia. The results make it possible to advise primary health care managers on the planning of future strategic actions.
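
    The classical four-quadrant IPA graph is simple to draw. The sketch below uses invented attribute names and ratings, and places the crosshairs at the scale midpoint — one of the axis-placement choices the article discusses (means or medians are common alternatives).

```python
# A minimal IPA quadrant grid with matplotlib.
import matplotlib.pyplot as plt

attributes = ['waiting time', 'staff courtesy', 'facilities', 'information']
importance = [4.6, 4.2, 3.1, 3.8]    # 1-5 ratings (invented)
performance = [2.9, 4.3, 3.5, 2.7]

fig, ax = plt.subplots()
ax.scatter(performance, importance)
for name, x, y in zip(attributes, performance, importance):
    ax.annotate(name, (x, y))
ax.axvline(3.0, linestyle='--')      # scale midpoint on performance axis
ax.axhline(3.0, linestyle='--')      # scale midpoint on importance axis
ax.set_xlabel('Performance')
ax.set_ylabel('Importance')
ax.set_title('Importance-Performance grid (upper-left = concentrate here)')
plt.show()
```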

  15. Grip-pattern verification for a smart gun

    NARCIS (Netherlands)

    Shang, X.; Groenland, J.P.J.; Groenland, J.P.J.; Veldhuis, Raymond N.J.

    In the biometric verification system of a smart gun, the rightful user of the gun is recognized based on grip-pattern recognition. It was found that the verification performance of grip-pattern recognition degrades strongly when the data for training and testing the classifier, respectively, have

  16. Student-Teacher Linkage Verification: Model Process and Recommendations

    Science.gov (United States)

    Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.

    2012-01-01

    As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…

  17. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to evaluate and benchmark trade show performance. ...

  18. A high-resolution simulation of Supertyphoon Rammasun (2014)—Part I: Model verification and surface energetics analysis

    Science.gov (United States)

    Zhang, Xinghai; Duan, Yihong; Wang, Yuqing; Wei, Na; Hu, Hao

    2017-06-01

    A 72-h high-resolution simulation of Supertyphoon Rammasun (2014) is performed using the Advanced Research Weather Research and Forecasting model. The simulation covers an initial 18-h spin-up, the 36-h rapid intensification (RI) period in the northern South China Sea, and the 18-h period of weakening after landfall. The results show that the model reproduces the track, intensity, structure of the storm, and environmental circulations reasonably well. Analysis of the surface energetics under the storm indicates that the storm's intensification is closely related to the net energy gain rate (ε_g), defined as the difference between the energy production (P_D) due to surface entropy flux and the energy dissipation (D_S) due to surface friction near the radius of maximum wind (RMW). Before and during the RI stage, ε_g is high, indicating sufficient energy supply for the storm to intensify. However, ε_g decreases rapidly as the storm quickly intensifies, because D_S increases more rapidly than P_D near the RMW. By the time the storm reaches its peak intensity, D_S is about 20% larger than P_D near the RMW, leading to a local energetics deficit under the eyewall. During the mature stage, P_D and D_S can reach a balance within a radius of 86 km from the storm center (about 2.3 times the RMW). This implies that the local P_D under the eyewall is not large enough to balance D_S, and the radially inward energy transport from outside the eyewall must play an important role in maintaining the storm's intensity, as well as its intensification.
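
    The competition between P_D and D_S can be illustrated with standard bulk-aerodynamic forms (P_D ≈ C_k ρ |V| (k_s − k_a), D_S ≈ C_D ρ |V|³), which show D_S growing with the cube of wind speed while P_D grows only linearly. The coefficients and thermodynamic values below are rough textbook-style numbers, not quantities diagnosed from this simulation.

```python
# Back-of-envelope surface energetics near the radius of maximum wind.
RHO = 1.15            # near-surface air density, kg m^-3 (assumed)
CK, CD = 1.2e-3, 2.4e-3   # enthalpy and drag exchange coefficients (assumed)
DK = 1.2e4            # sea-air enthalpy disequilibrium k_s - k_a, J kg^-1

def energy_terms(wind):                 # wind speed |V| in m s^-1
    p_d = CK * RHO * wind * DK          # production by surface enthalpy flux
    d_s = CD * RHO * wind ** 3          # frictional dissipation
    return p_d, d_s, p_d - d_s          # last term is the net gain rate e_g

for v in (30, 50, 70):
    p_d, d_s, e_g = energy_terms(v)
    print(f'|V|={v} m/s: P_D={p_d:.0f}, D_S={d_s:.0f}, e_g={e_g:.0f} W m^-2')
```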

  19. Multiple imputation to correct for partial verification bias revisited

    NARCIS (Netherlands)

    de Groot, J. A. H.; Janssen, K. J. M.; Zwinderman, A. H.; Moons, K. G. M.; Reitsma, J. B.

    2008-01-01

    Partial verification refers to the situation where a subset of patients is not verified by the reference (gold) standard and is excluded from the analysis. If partial verification is present, the observed (naive) measures of accuracy such as sensitivity and specificity are most likely

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS—SOUTHERN FILTER MEDIA, LLC, PE-16/M-SPES FILTER SAMPLE

    Science.gov (United States)

    The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...

  1. AUTOMATED, HIGHLY ACCURATE VERIFICATION OF RELAP5-3D

    Energy Technology Data Exchange (ETDEWEB)

    George L Mesina; David Aumiller; Francis Buschman

    2014-07-01

    Computer programs that analyze light water reactor safety solve complex systems of governing, closure and special process equations to model the underlying physics. In addition, these programs incorporate many other features and are quite large. RELAP5-3D [1] has over 300,000 lines of coding for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. Verification ensures that a program is built right by checking that it meets its design specifications. Recently, increased emphasis has been placed on the development of automated verification processes that compare coding against its documented algorithms and equations, and that compare its calculations against analytical solutions and the method of manufactured solutions [2]. For the first time, the ability exists to ensure that the data transfer operations associated with timestep advancement/repeating and writing/reading a solution to a file have no unintended consequences. To ensure that the code performs as intended over its extensive list of applications, an automated and highly accurate verification method has been modified and applied to RELAP5-3D. Furthermore, a mathematical analysis of the adequacy of the checks used in the comparisons is provided.
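
    To make the method of manufactured solutions concrete, the toy sketch below applies it to a simple 1-D Poisson solver rather than to RELAP5-3D itself: an exact solution is chosen, the forcing term is derived analytically, and the observed convergence order is checked against the expected second order. Everything here is illustrative and unrelated to RELAP5-3D internals.

    import math

    def solve_poisson(n):
        # Solve -u'' = f on (0,1) with u(0) = u(1) = 0, where f is
        # manufactured from u_exact(x) = sin(pi x) => f(x) = pi^2 sin(pi x).
        h = 1.0 / n
        x = [i * h for i in range(n + 1)]
        f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]
        # Thomas algorithm for the tridiagonal stencil (-1, 2, -1) / h^2
        a, b, c = -1.0, 2.0, -1.0
        cp = [0.0] * (n + 1)
        dp = [0.0] * (n + 1)
        for i in range(1, n):
            m = b - a * cp[i - 1]
            cp[i] = c / m
            dp[i] = (h * h * f[i] - a * dp[i - 1]) / m
        u = [0.0] * (n + 1)
        for i in range(n - 1, 0, -1):
            u[i] = dp[i] - cp[i] * u[i + 1]
        # Return the max-norm error against the manufactured solution
        return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

    e1, e2 = solve_poisson(32), solve_poisson(64)
    print("observed order:", math.log2(e1 / e2))  # ~2.0 if the code is correct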

  2. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    Science.gov (United States)

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
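
    The full MSRLV protocol is not reproduced here, but its geometric core can be suggested with a toy check: a claimed position is accepted only if it lies in the region both parties can plausibly reach, modeled below as the intersection of two communication disks. All coordinates and ranges are hypothetical.

    import math

    def in_shared_region(claimed, claimant_anchor, verifier, radio_range):
        # Accept the claim only if it falls within radio range of both the
        # verifier and the claimant's last verified position.
        def dist(p, q):
            return math.hypot(p[0] - q[0], p[1] - q[1])
        return (dist(claimed, claimant_anchor) <= radio_range and
                dist(claimed, verifier) <= radio_range)

    verifier = (0.0, 0.0)
    claimant_anchor = (8.0, 0.0)  # last verified position of the claimant
    print(in_shared_region((4.0, 1.0), claimant_anchor, verifier, 10.0))   # True
    print(in_shared_region((25.0, 0.0), claimant_anchor, verifier, 10.0))  # False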

  3. Formal Verification at System Level

    Science.gov (United States)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained in an ESA/ESTEC study carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based techniques for system-level functional requirements.

  4. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  5. Weak lensing magnification in the Dark Energy Survey Science Verification data

    Science.gov (United States)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.

    2018-02-01

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification dataset. This analysis is carried out for galaxies that are selected only by their photometric redshift. An extensive analysis of the systematic effects is performed using new simulation-based methods, including a Monte Carlo sampling of the selection function of the survey.

  6. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  7. The bakery protocol: a comparative case-study in formal verification

    NARCIS (Netherlands)

    W.O.D. Griffioen; H.P. Korver

    1995-01-01

    Groote and the second author verified (a version of) the Bakery Protocol in µCRL. Their process-algebraic verification is rather complex compared to the protocol. Now the question is: how do other verification techniques perform on this
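
    For readers unfamiliar with the protocol under verification, the sketch below is a plain-Python rendering of Lamport's bakery algorithm for mutual exclusion; it illustrates the protocol family only, not the µCRL model or any of the verification techniques compared in the paper.

    import threading

    N = 2            # number of competing threads
    ITERS = 1000     # critical-section entries per thread
    choosing = [False] * N
    number = [0] * N
    counter = 0      # shared resource; ends at N * ITERS iff exclusion holds

    def lock(i):
        choosing[i] = True
        number[i] = 1 + max(number)   # take a ticket
        choosing[i] = False
        for j in range(N):
            if j == i:
                continue
            while choosing[j]:        # wait until j has its ticket
                pass
            # defer to j if its (ticket, id) pair is lexicographically smaller
            while number[j] != 0 and (number[j], j) < (number[i], i):
                pass

    def unlock(i):
        number[i] = 0

    def worker(i):
        global counter
        for _ in range(ITERS):
            lock(i)
            counter += 1              # critical section
            unlock(i)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)                    # expected: 2000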

  8. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Science.gov (United States)

    2010-04-01

    ..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in... and will assign a rating for each indicator as shown. If the HUD verification method for the indicator...

  9. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those of the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  10. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    Author's accepted version (post-print). The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to eval...

  11. Primary HPV testing verification: A retrospective ad-hoc analysis of screening algorithms on women doubly tested for cytology and HPV.

    Science.gov (United States)

    Tracht, Jessica; Wrenn, Allison; Eltoum, Isam-Eldin

    2017-07-01

    To evaluate human papillomavirus (HPV) testing as a primary screening tool, we retrospectively analyzed data comparing (1) HPV testing to the algorithms of the ATHENA Study: (2) cytology alone, (3) cytology with ASCUS triage in women 25-29, and (4) cotesting ≥ 30 or (5) cotesting ≥ 25. We retrospectively analyzed data from women tested with both cytology and HPV testing from 2010 to 2013. Cumulative risk (CR) for CIN3+ was calculated. Crude and verification bias adjusted (VBA) sensitivity, specificity, predictive values, likelihood ratios, colposcopy rate, and screening test numbers were compared. About 15,173 women (25-95, 7.1% testing. Nearly 1,184 (8.4%) had biopsies. About 19.4% had positive cytology, 14.5% had positive HPV. HPV testing unassociated with ASCUS was requested in 40% of women testing per CIN3+ diagnosed. While HPV-/NILM cotesting results are associated with low CIN3+ risk, HPV testing had similar screening performance to cotesting and to cytology alone. Additionally, HPV testing and cytology incur false negatives in nonoverlapping subsets of patients. Diagn. Cytopathol. 2017;45:580-586. © 2017 Wiley Periodicals, Inc.

  12. Safety Injection Tank Performance Analysis Using CFD

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Oan; Lee, Jeong Ik; Nietiadi Yohanes Setiawan [KAIST, Daejeon (Korea, Republic of); Addad Yacine [KUSTAR, Abu Dhabi (United Arab Emirates); Bang, Young Seok; Yoo, Seung Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    This may affect the core cooling capability and threaten the fuel integrity during LOCA situations. However, information on the nitrogen flow rate during discharge is very limited due to the associated experimental measurement difficulties, and these phenomena are hardly reflected in current 1D system codes. In the current study, a CFD analysis is presented which should allow a more realistic prediction of the SIT performance, which can then be reflected in 1D system codes to simulate various accident scenarios. Computational Fluid Dynamics (CFD) calculations to date have had limited success in predicting the fluid flow accurately. This study aims to find a better CFD prediction and more accurate modeling of the system performance during accident scenarios. The safety injection tank with fluidic device was analyzed using a commercial CFD code. A fine-resolution grid was used to capture the vortex of the fluidic device. The calculations so far have shown good consistency with the experiment; they should be complete by the conference date and will be thoroughly analyzed and discussed. Once the detailed CFD computation is finished, a small-scale experiment will be conducted for the given conditions. Using the experimental results and the CFD model, the physical models can be validated to give more reliable results. The data from CFD and experiments will provide a more accurate K-factor of the fluidic device, which can later be applied in system code inputs.

  13. Deep Space Optical Link ARQ Performance Analysis

    Science.gov (United States)

    Clare, Loren; Miles, Gregory

    2016-01-01

    Substantial advancements have been made toward the use of optical communications for deep space exploration missions, promising a much higher volume of data to be communicated in comparison with present-day Radio Frequency (RF) based systems. One or more ground-based optical terminals are assumed to communicate with the spacecraft. Both short-term and long-term link outages will arise due to weather at the ground station(s), space platform pointing stability, and other effects. To mitigate these outages, an Automatic Repeat Query (ARQ) retransmission method is assumed, together with a reliable back channel for acknowledgement traffic. Specifically, the Licklider Transmission Protocol (LTP) is used, which is a component of the Disruption-Tolerant Networking (DTN) protocol suite that is well suited for high bandwidth-delay product links subject to disruptions. We provide an analysis of envisioned deep space mission scenarios and quantify buffering, latency and throughput performance, using a simulation in which long-term weather effects are modeled with a Gilbert-Elliott Markov chain, short-term outages occur as a Bernoulli process, and scheduled outages arising from geometric visibility or operational constraints are represented. We find that both short- and long-term effects impact throughput, but long-term weather effects dominate buffer sizing and overflow losses as well as latency performance.
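
    The weather model named above is easy to sketch. The following toy simulation uses a two-state Gilbert-Elliott chain (GOOD/BAD) with per-step loss probabilities; all transition and loss parameters are hypothetical, not values from the study.

    import random

    def simulate_link(steps, p_gb=0.02, p_bg=0.10, loss_good=0.01, loss_bad=0.6):
        state, delivered = "GOOD", 0
        for _ in range(steps):
            # Markov transition between the two weather states
            if state == "GOOD" and random.random() < p_gb:
                state = "BAD"
            elif state == "BAD" and random.random() < p_bg:
                state = "GOOD"
            # per-step delivery subject to the current state's loss rate
            loss = loss_good if state == "GOOD" else loss_bad
            if random.random() >= loss:
                delivered += 1
        return delivered / steps

    random.seed(1)
    print(f"delivered fraction: {simulate_link(100_000):.3f}")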

  14. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga Reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed that the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.

  15. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise, among others: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.

  16. Sampling and Analysis Plan for Verification Sampling of LANL-Derived Residual Radionuclides in Soils within Tract A-18-2 for Land Conveyance

    Energy Technology Data Exchange (ETDEWEB)

    Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-30

    Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) describes a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sampling plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.

  17. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.
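
    The paper's PTrie structure is not reproduced here, but the memory argument can be suggested with a generic byte-trie: configurations that share a prefix share storage, so a visited-state set over many similar states is far smaller than a flat set of full encodings. The sketch below conveys only that generic idea.

    class TrieNode:
        __slots__ = ("children", "terminal")
        def __init__(self):
            self.children = {}
            self.terminal = False

    class ConfigStore:
        """Visited-state set storing byte-encoded configurations in a trie."""
        def __init__(self):
            self.root = TrieNode()
        def add(self, config: bytes) -> bool:
            # Insert a configuration; return True if it was not seen before.
            node = self.root
            for b in config:
                node = node.children.setdefault(b, TrieNode())
            is_new = not node.terminal
            node.terminal = True
            return is_new

    store = ConfigStore()
    print(store.add(b"\x01\x02\x03"))  # True  (new state)
    print(store.add(b"\x01\x02\x03"))  # False (already explored)
    print(store.add(b"\x01\x02\x07"))  # True  (shares a two-byte prefix)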

  18. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    Full Text Available On the basis of an analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the previously developed physical and mathematical model of the processes in a hybrid solid-propellant rocket engine for the quasi-steady-state flow regime was performed. Comparative analysis of the calculated and analytical data indicated satisfactory comparability of the simulation results.

  19. Towards Verification and Validation for Increased Autonomy

    Science.gov (United States)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation goes over the work we have performed over the last few years on verification and validation of the next generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for verification and validation. All work presented has already been published.

  20. AVU/BAM: software refurbishment (design and implementation) for the CU3 Gaia verification pipeline

    Science.gov (United States)

    Buzzi, Raffaella; Riva, Alberto; Pecoraro, Marco; Licata, Enrico; Messineo, Rosario; Gai, Mario; Drimmel, Ronald; Lattanzi, Mario G.

    2016-07-01

    AVU/BAM is the Gaia software for the Astrometric Verification Unit (AVU) devoted to the monitoring of the Basic Angle Monitoring (BAM) instrument, one of the metrology instruments onboard the Gaia payload. AVU/BAM has been integrated and operative at the Data Processing Center of Turin (DPCT) since the beginning of the Gaia mission. The DPCT infrastructure performs the ingestion of pre-elaborated data coming from the satellite and is responsible for running the code of the different Verification Packages. The new structure of the pipeline consists of three phases: the first is a pre-analysis in which a preliminary study of the data is performed, with the calculation of quantities needed for the analysis; the second processes the interferograms coming from the instrument; the third analyzes the data obtained from the previous processing. Part of the long-term analysis has also been changed, and a phase of calibration of the data obtained from the processing has been added.

  1. Performance analysis of memory hierachies in high performance systems

    Energy Technology Data Exchange (ETDEWEB)

    Yogesh, Agrawel [Iowa State Univ., Ames, IA (United States)

    1993-07-01

    This thesis studies memory bandwidth as a performance predictor of programs. The focus of this work is on computationally intensive programs. These programs are the most likely to access large amounts of data, stressing the memory system. Computationally intensive programs are also likely to use highly optimizing compilers to produce the fastest executables possible. Methods to reduce the amount of data traffic by increasing the average number of references to each item while it resides in the cache are explored. Increasing the average number of references to each cache item reduces the number of memory requests. Chapter 2 describes the DLX architecture. This is the architecture on which all the experiments were performed. Chapter 3 studies memory moves as a performance predictor for a group of application programs. Chapter 4 introduces a model to study the performance of programs in the presence of memory hierarchies. Chapter 5 explores some compiler optimizations that can help increase the references to each item while it resides in the cache.
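
    One concrete way to raise the number of references to each cache-resident item, in the spirit of the methods explored above, is loop blocking. The sketch below tiles a matrix multiplication so each cache-resident tile of B is reused across a whole block of rows before eviction; the sizes and block factor are arbitrary illustrations, not figures from the thesis.

    def matmul_tiled(A, B, n, block=32):
        # C = A @ B with blocked loops: each tile of B is reused `block`
        # times while it resides in the cache, instead of once per pass.
        C = [[0.0] * n for _ in range(n)]
        for ii in range(0, n, block):
            for kk in range(0, n, block):
                for jj in range(0, n, block):
                    for i in range(ii, min(ii + block, n)):
                        for k in range(kk, min(kk + block, n)):
                            a = A[i][k]
                            row_b, row_c = B[k], C[i]
                            for j in range(jj, min(jj + block, n)):
                                row_c[j] += a * row_b[j]
        return C

    n = 64
    A = [[1.0] * n for _ in range(n)]
    B = [[2.0] * n for _ in range(n)]
    assert matmul_tiled(A, B, n)[0][0] == 2.0 * n  # sanity check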

  2. Structural verification of an aged composite reflector

    Science.gov (United States)

    Lou, Michael C.; Tsuha, Walter S.

    1991-01-01

    A structural verification program applied to qualifying two heritage composite antenna reflectors for flight on the TOPEX satellite is outlined. The verification requirements and an integrated analyses/test approach employed to meet these requirements are described. Structural analysis results and qualification vibration test data are presented and discussed. It was determined that degradation of the composite and bonding materials caused by long-term exposure to an uncontrolled environment had not severely impaired the integrity of the reflector structures. The reflectors were assessed to be structurally adequate for the intended TOPEX application.

  3. 14 CFR 460.17 - Verification program.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... program. An operator must successfully verify the integrated performance of a vehicle's hardware and any...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    Science.gov (United States)

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  5. Sensitivity analysis for thermo-hydraulics model of a Westinghouse type PWR. Verification of the simulation results

    Energy Technology Data Exchange (ETDEWEB)

    Farahani, Aref Zarnooshe [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Nuclear Engineering, Science and Research Branch; Yousefpour, Faramarz [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Basic Sciences; Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Young Researchers and Elite Club

    2017-07-15

    Development of a steady-state model is the first step in nuclear safety analysis. The developed model should first be analyzed qualitatively; then a sensitivity analysis on the number of nodes is required for the models of different systems to ensure the reliability of the obtained results. This contribution aims to show, through sensitivity analysis, the independence of the modeling results from the number of nodes in a qualified MELCOR model for a Westinghouse-type pressurized water plant. For this purpose, and to minimize user error, the nuclear analysis software SNAP is employed. Different sensitivity cases were developed by modifying the existing model and refining the nodes for the simulated systems, including the steam generators, the reactor coolant system, and the reactor core and its connecting flow paths. Comparing the obtained results with those of the original model shows no significant difference, which is indicative of the independence of the model from finer nodalization.

  6. Writer identification and verification

    NARCIS (Netherlands)

    Schomaker, Lambert; Ratha, N; Govindaraju, V

    2008-01-01

    Writer identification and verification have gained increased interest recently, especially in the fields of forensic document examination and biometrics. Writer identification assigns a handwriting to one writer out of a set of writers. It determines whether or not a given handwritten text has in

  7. The space shuttle launch vehicle aerodynamic verification challenges

    Science.gov (United States)

    Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.

    1985-01-01

    The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance by flight measurements. Historically, launch vehicle flight test programs which faced these same challenges were unmanned instrumented flights of simple aerodynamically shaped vehicles. However, the manned SSV flight test program made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRBs). The analyses of flight data did not verify the preflight aerodynamic or performance predictions for the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community was also challenged to understand the discrepancy between the wind tunnel and flight defined aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analysis challenges which led to the SSV system performance verification, and which will lead to the verification of the operational ascent aerodynamics data base, are presented.

  8. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    Almost all the studies are based on either National Center for Environmental Prediction (NCEP), USA, final analysis fields (NCEP FNL) or the reanalysis data used as initial and lateral boundary conditions for driving the mesoscale model. Here we present a mesoscale model forecast verification and intercomparison study ...

  9. and application to autopilot performance analysis

    Directory of Open Access Journals (Sweden)

    Daniel E. Davison

    2000-01-01

    Full Text Available This paper deals with the notion of disturbance model uncertainty. The disturbance is modeled as the output of a first-order filter which is driven by white noise and whose bandwidth and gain are uncertain. An analytical expression for the steady-state output variance as a function of the uncertain bandwidth and gain is derived, and several properties of this variance function are analyzed. Two notions, those of disturbance bandwidth margin and disturbance gain margin, are also introduced. These tools are then applied to the analysis of a simple altitude-hold autopilot system in the presence of turbulence where the turbulence scale is treated as an uncertain parameter. It is shown that the autopilot, which is satisfactory for the nominal turbulence scale, may be inadequate when the uncertainty is taken into account. Moreover, it is proven that, in order to obtain a design that provides robust performance in the face of turbulence scale uncertainty, it is necessary to substantially increase the controller bandwidth, even if one is willing to sacrifice the autopilot's holding ability and stability robustness.
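
    For a setup of the kind described, the steady-state variance has a standard closed form. Writing the disturbance filter with uncertain gain g and bandwidth b as H(s) = gb/(s + b), driven by white noise of intensity W (notation assumed here; the paper's symbols may differ), a spectral integral gives

    \sigma_y^2 \;=\; \frac{1}{2\pi}\int_{-\infty}^{\infty} |H(j\omega)|^2\, W \, d\omega
    \;=\; \frac{W}{2\pi}\int_{-\infty}^{\infty} \frac{g^2 b^2}{\omega^2 + b^2}\, d\omega
    \;=\; \frac{g^2 b\, W}{2},

    which is monotonically increasing in both uncertain parameters, consistent with the margin notions introduced above.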

  10. Development of verification program for safety evaluation of KNGR on-site and off-site power system design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kem Joong; Ryu, Eun Sook; Choi, Jang Hong; Lee, Byung Il; Han, Hyun Kyu; Oh, Seong Kyun; Kim, Han Kee; Park, Chul Woo; Kim, Min Jeong [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-04-15

    In order to verify the adequacy of the design and analysis of the on-site and off-site power systems, we developed a regulatory analysis program. We established the methodology for the electric power system and constructed algorithms for steady-state load flow analysis, fault analysis, and transient stability analysis. The developed program takes advantage of a GUI and C++ programming techniques. The input design allows easy access to the commonly used PSS/E format, and the output design lets users work with Excel spreadsheets. The performance of the program was verified by comparison with PSS/E results. The case studies are as follows: verification of the load flow analysis of the KNGR on-site power system; evaluation of the load flow and transient stability analysis of the KNGR off-site power system; verification of the load flow and transient stability analysis; and frequency drop analysis for loss of generation.
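
    The steady-state load flow step mentioned above can be illustrated with a minimal Gauss-Seidel iteration on a two-bus network (slack bus plus one PQ load bus). The network data are hypothetical and far simpler than a plant electrical model; the point is only the fixed-point update used in load flow analysis.

    import cmath

    z_line = 0.01 + 0.05j          # line impedance (per unit)
    y = 1 / z_line                 # line admittance
    Y = [[y, -y], [-y, y]]         # bus admittance matrix
    V = [1.0 + 0.0j, 1.0 + 0.0j]   # bus 0 is the slack bus at 1.0 pu
    S_inj = -(0.8 + 0.4j)          # injected power at bus 1 (a load, hence < 0)

    for _ in range(50):            # Gauss-Seidel fixed-point iteration
        # From S = V * conj(I): Y11*V1 = conj(S)/conj(V1) - Y10*V0
        rhs = S_inj.conjugate() / V[1].conjugate() - Y[1][0] * V[0]
        V[1] = rhs / Y[1][1]

    print(f"load bus: |V| = {abs(V[1]):.4f} pu, "
          f"angle = {cmath.phase(V[1]):.4f} rad")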

  11. Assessing SRI fund performance research : best practices in empirical analysis

    NARCIS (Netherlands)

    Chegut, Andrea; Schenk, H.; Scholtens, B.

    2011-01-01

    We review the socially responsible investment (SRI) mutual fund performance literature to provide best practices in SRI performance attribution analysis. Based on meta-ethnography and content analysis, five themes in this literature require specific attention: data quality, social responsibility

  12. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    Science.gov (United States)

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-03

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of the toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for the benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C(18) stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm, and the (1)H NMR acquisition conditions. The protocol described herein allowed the detection of the analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. The reported standard deviation for quantification relates to the UV detector, which showed relative standard deviations (RSDs) for quantification within ±1.1%, while the lower limit of detection was up to 16 µg (absolute) for the NMR detector. Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.

  13. Formal verification of an oral messages algorithm for interactive consistency

    Science.gov (United States)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
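
    To make the algorithm under verification concrete, the sketch below is a compact executable rendering of the recursive Oral Messages algorithm OM(m); traitors here simply invert whatever they relay. It illustrates the algorithm itself, not the mechanized specification or proof discussed in the paper.

    def majority(values):
        return max(set(values), key=values.count)

    def om(m, commander, lieutenants, value, traitors):
        # Return the value each lieutenant decides on after OM(m).
        def sent(sender, v):
            return (not v) if sender in traitors else v  # traitors flip values

        received = {l: sent(commander, value) for l in lieutenants}
        if m == 0:
            return received
        decided = {}
        for l in lieutenants:
            others = [p for p in lieutenants if p != l]
            # Each other lieutenant p acts as commander in OM(m-1), relaying
            # the value it received; l records what reaches it in each round.
            relayed = [om(m - 1, p, [q for q in lieutenants if q != p],
                          received[p], traitors)[l] for p in others]
            decided[l] = majority([received[l]] + relayed)
        return decided

    # Four generals, one traitorous lieutenant: OM(1) suffices (n > 3m), and
    # the loyal lieutenants 1 and 2 agree on the commander's value True.
    print(om(1, 0, [1, 2, 3], True, traitors={3}))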

  14. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  15. Consistency analysis for the performance of planar detector systems used in advanced radiotherapy

    Directory of Open Access Journals (Sweden)

    Kanan Jassal

    2015-03-01

    Full Text Available Purpose: To evaluate the performance, in terms of consistency, of a-Si EPID and ion-chamber array detectors for dose verification in advanced radiotherapy. Methods: Planar measurements were made for 250 patients using an array of ion chambers and an a-Si EPID. For pre-treatment verification, the plans were generated on the phantom for re-calculation of doses. The γ-evaluation method with the criteria dose-difference (DD) ≤ 3% and distance-to-agreement (DTA) ≤ 3 mm was used for the comparison of measurements. Also, the central axis (CAX) doses were measured using a 0.125 cc ion chamber and were compared with the central chamber of the array and the central-pixel-correlated dose value from the EPID image. Two types of statistical approaches were applied for the analysis. Conventional statistics used analysis of variance (ANOVA) and the unpaired t-test to evaluate the performance of the detectors, and statistical process control (SPC) was utilized to study the statistical variation of the measured data. Control charts (CCs) based on the average, standard deviation, and exponentially weighted moving averages (EWMA) were prepared. The capability index (Cpm) was determined as an indicator of the performance consistency of the two systems. Results: Array and EPID measurements had average gamma pass rates of 99.9% ± 0.15% and 98.9% ± 1.06%, respectively. For the point doses, the 0.125 cc chamber results were within 2.1% ± 0.5% of the central chamber of the array. Similarly, CAX doses from the EPID and chamber matched within 1.5% ± 0.3%. The control charts showed that both detectors were performing optimally and all the data points were within ± 5%. EWMA charts revealed that both detectors had a slow drift around the process mean, but it was well within ± 3%. Further, higher Cpm values for the EPID demonstrate its higher efficiency for radiotherapy techniques. Conclusion: The performance of both detectors was seen to be of high quality irrespective of the
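
    The γ-evaluation named above combines a dose-difference and a distance-to-agreement criterion into a single pass/fail metric. The 1-D sketch below computes a global-normalization gamma pass rate for synthetic profiles; it is a simplified illustration of the published method, not the clinical software used in the study.

    import math

    def gamma_pass_rate(ref, meas, spacing_mm, dd_pct=3.0, dta_mm=3.0):
        # Fraction of measured points with gamma <= 1 (global DD criterion,
        # normalized to the reference maximum).
        d_max = max(ref)
        passed = 0
        for i, dm in enumerate(meas):
            best = math.inf
            for j, dr in enumerate(ref):
                dist = abs(i - j) * spacing_mm
                dose = 100.0 * (dm - dr) / d_max
                best = min(best, math.sqrt((dist / dta_mm) ** 2 +
                                           (dose / dd_pct) ** 2))
            passed += best <= 1.0
        return passed / len(meas)

    ref = [math.exp(-((x - 25) / 8.0) ** 2) for x in range(50)]  # reference
    meas = [d * 1.01 for d in ref]                               # 1% rescaled
    print(f"gamma pass rate: {100 * gamma_pass_rate(ref, meas, 1.0):.1f}%")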

  16. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  17. Analysis, Design, and Experimental Verification of A Synchronous Reference Frame Voltage Control for Single-Phase Inverters

    DEFF Research Database (Denmark)

    Monfared, Mohammad; Golestan, Saeed; Guerrero, Josep M.

    2014-01-01

    -loop control strategy for single-phase inverter-based islanded distributed generation (DG) systems. The proposed controller uses a synchronous reference frame PI (SRFPI) controller to regulate the instantaneous output voltage, a capacitor current shaping loop in the stationary reference frame to provide active damping and improve both transient and steady-state performance, a voltage decoupling feedforward to improve the system robustness, and a multi-resonant harmonic compensator to prevent low-order load current harmonics from distorting the inverter output voltage. Since the voltage loop works...

  18. Operational Modal Analysis and the Performance Assessment of Vehicle Suspension Systems

    Directory of Open Access Journals (Sweden)

    L. Soria

    2012-01-01

    Full Text Available Comfort, road holding and safety of passenger cars are mainly influenced by an appropriate design of suspension systems. Improvements of the dynamic behaviour can be achieved by implementing semi-active or active suspension systems. In these cases, the correct design of a well-performing suspension control strategy is of fundamental importance to obtaining satisfying results. Operational Modal Analysis allows experimental structural identification under real operating conditions: moving from output-only data, it leads to modal models linearised around the more interesting working points and, in the case of controlled systems, provides the information needed for the optimal design and verification of the controller performance. All these characteristics are needed for the experimental assessment of vehicle suspension systems. In the paper two suspension architectures equipping the same car type are considered. The former is a semi-active commercial system, the latter a novel prototypic active system. For the assessment of suspension performance, two different kinds of tests have been considered: proving ground tests on different road profiles and laboratory four-poster rig tests. By OMA-processing the signals acquired in the different testing conditions and by comparing the results, it is shown how this tool can be effectively utilised to verify the operation and the performance of those systems, by only carrying out a simple, cost-effective road test.

  19. Verification of an acoustic transmission matrix analysis of sound propagation in a variable area duct without flow

    Science.gov (United States)

    Miles, J. H.

    1981-01-01

    A predicted standing wave pressure and phase angle profile for a hard wall rectangular duct with a region of converging-diverging area variation is compared to published experimental measurements in a study of sound propagation without flow. The factor-of-1/2 area variation used is of sufficient magnitude to produce large reflections. The prediction is based on a transmission matrix approach developed for the analysis of sound propagation in a variable area duct with and without flow. The agreement between the measured and predicted results is shown to be excellent.
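
    The transmission matrix idea can be sketched by approximating the variable-area region as a cascade of short uniform segments, each contributing a 2x2 plane-wave transfer matrix acting on the (pressure, volume velocity) state. The geometry, segment count, and frequency below are hypothetical, not the published test case.

    import cmath, math

    RHO_C = 415.0  # characteristic impedance of air, rho*c (Pa·s/m)

    def segment_matrix(k, length, area):
        # Plane-wave transfer matrix of a uniform duct segment (no flow).
        kl = k * length
        zc = RHO_C / area
        return [[cmath.cos(kl),           1j * zc * cmath.sin(kl)],
                [1j * cmath.sin(kl) / zc, cmath.cos(kl)]]

    def matmul2(a, b):
        return [[a[0][0]*b[0][0] + a[0][1]*b[1][0],
                 a[0][0]*b[0][1] + a[0][1]*b[1][1]],
                [a[1][0]*b[0][0] + a[1][1]*b[1][0],
                 a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

    # Converging-diverging profile with a factor-of-1/2 throat: 1.0 -> 0.5 -> 1.0
    areas = [1.0, 0.8, 0.6, 0.5, 0.6, 0.8, 1.0]    # segment areas (m^2)
    k = 2 * math.pi * 500 / 343.0                  # wavenumber at 500 Hz
    T = [[1, 0], [0, 1]]
    for S in areas:
        T = matmul2(T, segment_matrix(k, 0.05, S))  # 5 cm segments

    print("overall transfer matrix:", T)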

  20. Financial Performance Analysis of Selected Commercial Banks in ...

    African Journals Online (AJOL)

    user

    viability of the banking industry. Keywords: ratio analysis, financial performance, bank performance, Ethiopia ... The main advantage of FRA is its ability and effectiveness in distinguishing high-performance firms from others and the ...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM; BAGHOUSE FILTRATION PRODUCTS

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  2. Development of a multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3 and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of the MARS 1.3 development is to provide a realistic analysis capability for the transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs), where multi-dimensional phenomena dominate the transients. The MARS code is a unified version of the USNRC-developed COBRA-TF, a three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. The developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to provide enhanced capability in code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. For the maintenance of the existing codes' capability and the enhancement of code maintenance capability, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to be operable under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module has been completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves the code maintenance capability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other

  3. Analysis, Control and Experimental Verification of a Single-Phase Capacitive-Coupling Grid-Connected Inverter

    DEFF Research Database (Denmark)

    Dai, Ning-Yi; Zhang, Wen-Chen; Wong, Man-Chung

    2015-01-01

    This study proposes a capacitive-coupling grid-connected inverter (CGCI), which consists of a full-bridge single-phase inverter coupled to a power grid via one capacitor in series with an inductor. The fundamental-frequency impedance of the coupling branch is capacitive. In contrast with the conventional inductive-coupling grid-connected inverter (IGCI), this structure provides an alternative interface for use between a low-voltage DC microgrid and an AC grid. A comparison between the CGCI and the IGCI is performed. It is concluded that the CGCI is able to transfer active power and provide lagging reactive power at an operational voltage much lower than that of the IGCI. This reduces the system's initial cost and operational losses, as well as the energy stored in the DC-link capacitor. The CGCI has been analysed and a DC voltage selection method is proposed. Using this method, the DC-link voltage...

  4. Environmental Technology Verification: Baghouse filtration products--W.L. Gore & Associates L3650 filtration media (tested November--December 2009)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  5. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6277 Filtration Media (Tested March 2011)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  6. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6262 Filtration Media (Tested March 2011)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  7. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6282 Filtration Media (Tested March - April 2011)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  8. Performance Analysis of Cone Detection Algorithms

    CERN Document Server

    Mariotti, Letizia

    2015-01-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of two popular cone detection algorithms and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the three algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimat...

  9. Analysis approaches and interventions with occupational performance

    OpenAIRE

    Ahn, Sinae

    2016-01-01

    [Purpose] The purpose of this study was to analyze approaches and interventions with occupational performance in patients with stroke. [Subjects and Methods] In this study, articles published in the past 10 years were searched. The key terms used were "occupational performance AND stroke" and "occupational performance AND CVA". A total of 252 articles were identified, and 79 articles were selected. All interventions were classified according to their approaches according to 6 theories. All inter...

  10. Failure analysis of high performance ballistic fibers

    OpenAIRE

    Spatola, Jennifer S

    2015-01-01

    High performance fibers have a high tensile strength and modulus, good wear resistance, and a low density, making them ideal for applications in ballistic impact resistance, such as body armor. However, the observed ballistic performance of these fibers is much lower than the predicted values. Since the predictions assume only tensile stress failure, it is safe to assume that the stress state is affecting fiber performance. The purpose of this research was to determine if there are failure mo...

  11. Analysis and Experimental Verification of New Power Flow Control for Grid-Connected Inverter with LCL Filter in Microgrid

    Directory of Open Access Journals (Sweden)

    Herong Gu

    2014-01-01

    Full Text Available Microgrid is an effective way to integrate the distributed energy resources into the utility networks. One of the most important issues is the power flow control of grid-connected voltage-source inverter in microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from the poor damping and slow transient response. While the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verify the small signal model analysis and effectiveness of the proposed method.

  12. Analysis and experimental verification of new power flow control for grid-connected inverter with LCL filter in microgrid.

    Science.gov (United States)

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    Microgrid is an effective way to integrate the distributed energy resources into the utility networks. One of the most important issues is the power flow control of grid-connected voltage-source inverter in microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from the poor damping and slow transient response. While the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verify the small signal model analysis and effectiveness of the proposed method.

  13. Multileaf collimator leaf position verification and analysis for adaptive radiation therapy using a video-optical method

    Science.gov (United States)

    Sethna, Sohrab B.

    External beam radiation therapy is commonly used to eliminate and control cancerous tumors. High-energy beams are shaped to match the patient's specific tumor volume, thereby maximizing radiation dose to malignant cells and limiting dose to normal tissue. A multileaf collimator (MLC) consisting of multiple pairs of tungsten leaves is used to conform the radiation beam to the desired treatment field. Advanced treatment methods utilize dynamic MLC settings to conform to multiple treatment fields and provide intensity modulated radiation therapy (IMRT). Future methods would further increase conformity by actively tracking tumor motion caused by patient cardiac and respiratory motion. Leaf position quality assurance for a dynamic MLC is critical, as variation between the planned and actual leaf positions could induce significant errors in radiation dose. The goal of this research project is to prototype a video-optical quality assurance system for MLC leaf positions. The system captures light-field images of MLC leaf sequences during dynamic therapy. Image acquisition and analysis software was developed to determine leaf edge positions. The mean absolute difference between QA prototype predicted and caliper measured leaf positions was found to be 0.6 mm with an uncertainty of +/- 0.3 mm. Maximum errors in predicted positions were below 1.0 mm for static fields. The prototype served as a proof of concept for quality assurance of future tumor tracking methods. Specifically, a lung tumor phantom was created to mimic a lung tumor's motion from respiration. The lung tumor video images were superimposed on MLC field video images for visualization and analysis. The toolbox is capable of displaying leaf position, leaf velocity, and tumor position, and of determining errors between planned and actual treatment fields for dynamic radiation therapy.

  14. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    Energy Technology Data Exchange (ETDEWEB)

    Hautamaeki, J.; Tiitta, A. [VTT Chemical Technology, Espoo (Finland)

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial considerations are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  15. GRAVITY Science Verification

    Science.gov (United States)

    Mérand, A.; Berger, J.-P.; de Wit, W.-J.; Eisenhauer, F.; Haubois, X.; Paumard, T.; Schoeller, M.; Wittkowski, M.; Woillez, J.; Wolff, B.

    2017-12-01

    In the time between successfully commissioning an instrument and before offering it in the Call for Proposals for the first time, ESO gives the community at large an opportunity to apply for short Science Verification (SV) programmes. In 2016, ESO offered SV time for the second-generation Very Large Telescope Interferometer instrument GRAVITY. In this article we describe the selection process, outline the range of science cases covered by the approved SV programmes, and highlight some of the early scientific results.

  16. Simplifying EPID dosimetry for IMRT treatment verification

    Energy Technology Data Exchange (ETDEWEB)

    Pecharroman-Gallego, R.; Mans, Anton; Sonke, Jan-Jakob; Stroom, Joep C.; Olaciregui-Ruiz, Igor; Herk, Marcel van; Mijnheer, Ben J. [Department of Radiation Oncology, The Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Plesmanlaan 121, 1066 CX Amsterdam (Netherlands)

    2011-02-15

    Purpose: Electronic portal imaging devices (EPIDs) are increasingly used for IMRT dose verification, both pretreatment and in vivo. In this study, an earlier developed backprojection model has been modified to avoid the need for patient-specific transmission measurements and, consequently, leads to a faster procedure. Methods: Currently, the transmission, an essential ingredient of the backprojection model, is estimated from the ratio of EPID measurements with and without a phantom/patient in the beam. Thus, an additional irradiation to obtain "open images" under the same conditions as the actual phantom/patient irradiation is required. However, by calculating the transmission of the phantom/patient in the direction of the beam instead of using open images, this extra measurement can be avoided. This was achieved by using a model that includes the effect of beam hardening and the off-axis dependence of the EPID response on photon beam spectral changes. The parameters in the model were empirically obtained by performing EPID measurements using polystyrene slab phantoms of different thickness in 6, 10, and 18 MV photon beams. A theoretical analysis to verify the sensitivity of the model to patient thickness changes was performed. The new model was finally applied for the analysis of EPID dose verification measurements of step-and-shoot IMRT treatments of head and neck, lung, breast, cervix, prostate, and rectum patients. All measurements were carried out using Elekta SL20i linear accelerators equipped with a hydrogenated amorphous silicon EPID, and the IMRT plans were made using PINNACLE software (Philips Medical Systems). Results: The results showed generally good agreement with the dose determined using the old model applying the measured transmission. The average differences between the EPID-based in vivo dose at the isocenter determined using either the new transmission model or its measured value were 2.6±3.1%, 0.2±3.1%, and 2

  17. Survey of SNMP performance analysis studies

    NARCIS (Netherlands)

    Andrey, Laurent; Festor, Olivier; Lahmadi, Abdelkader; Pras, Aiko; Schönwälder, Jürgen

    This paper provides a survey of Simple Network Management Protocol (SNMP)-related performance studies. Over the last 10 years, a variety of such studies have been published. Performance benchmarking of SNMP, like all benchmarking studies, is a non-trivial task that requires substantial effort to be

  18. Classification model and analysis on students' performance ...

    African Journals Online (AJOL)

    The purpose of this paper is to propose a classification model for classifying students' performance in Sijil Pelajaran ... along with the examination data. This research shows that first semester results can be used to identify students' performance. Keywords: educational data mining; classification model; feature selection ...

  19. NOKIA PERFORMANCE AND CASH FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Moscviciov Andrei

    2011-12-01

    Full Text Available In this paper the author presents ways to analyze the performance of the company Nokia. Based on a system of indicators, the key aspects that underpin performance are highlighted, namely: operational activity, financial balance, and cash flows.

  20. Personality and team performance: a meta analysis

    NARCIS (Netherlands)

    Peeters, Miranda A.G.; van Tuijl, Harrie F.J.M.; Rutte, Christel G.; Reymen, Isabelle

    2006-01-01

    Using a meta-analytical procedure, the relationship between team composition in terms of the Big-Five personality traits (trait elevation and variability) and team performance were researched. The number of teams upon which analyses were performed ranged from 106 to 527. For the total sample,

  1. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    Science.gov (United States)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  2. Human performance variation analysis: A process for human performance problem solving

    Directory of Open Access Journals (Sweden)

    Anerie Rademeyer

    2009-04-01

    Full Text Available Problem-solving ability is a much sought-after trait in executives, especially if it includes the ability to solve human performance problems. This paper proposes a systematic root cause analysis process that effectively and consistently uncovers the root causes of human performance problems and controls the causes in a way that prevents the problems from recurring. Applying action research, the study brings into being a Human Performance Variation Analysis (HPVA) process, which consists of three phases: (1) performance variation assessment, (2) performance variation analysis, and (3) performance variation resolution. The HPVA provides much-needed capability in solving human performance problems in organisations.

  3. NES++: number system for encryption based privacy preserving speaker verification

    Science.gov (United States)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (e.g., smartphones, Google Glass), privacy preserving speaker verification receives much attention nowadays. Privacy preserving speaker verification can be achieved in many different ways, such as fuzzy vault and encryption. Encryption based solutions are promising, as cryptography is based on solid mathematical foundations and the security properties can be easily analyzed in a well established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, which greatly reduces the computation overhead. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system together with the packing technique. Our findings show that the proposed solution fills the gap between speaker verification and the encryption scheme very well, and that the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption based privacy preserving speaker verification; the privacy protection and accuracy rate are not affected.
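
    The two key ingredients named in this abstract, Paillier's additive homomorphism and plaintext packing, can be illustrated in a few lines of Python. The sketch below uses toy textbook parameters and an assumed 8-bit slot width; it is not the paper's number system, and a real deployment would need cryptographically sized primes and secure randomness.

```python
import math

# Toy Paillier keypair (demo-sized primes; real keys need >= 2048 bits).
p, q = 1000003, 1000033                    # small primes, for illustration
n, n2, g = p * q, (p * q) ** 2, p * q + 1
lam = math.lcm(p - 1, q - 1)               # Python 3.9+
mu = pow(lam, -1, n)                       # modular inverse (Python 3.8+)

def encrypt(m, r):                         # c = g^m * r^n mod n^2
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):                            # m = L(c^lam mod n^2) * mu mod n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Packing: several small quantised scores share one plaintext (8-bit slots,
# an assumed width; slot-wise sums must stay below 256 to avoid overflow).
SLOT = 8
def pack(vals):
    return sum(v << (SLOT * i) for i, v in enumerate(vals))

def unpack(m, k):
    return [(m >> (SLOT * i)) & 0xFF for i in range(k)]

a, b = [3, 17, 40], [5, 20, 60]            # e.g. quantised match scores
ca, cb = encrypt(pack(a), 17), encrypt(pack(b), 23)
csum = (ca * cb) % n2                      # ciphertext product = plaintext sum
print(unpack(decrypt(csum), 3))            # -> [8, 37, 100]
```

    Multiplying the two ciphertexts adds all three packed scores with a single asymmetric operation, which is why packing amortises the cost of Paillier encryption.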

  4. Performance analysis of photovoltaic thermal air heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sopian, K.; Yigit, K.S.; Liu, H.T.; Kakac, S.; Veziroglu, T.N. [Miami Univ., Coral Gables, FL (United States). Dept. of Mechanical Engineering

    1996-11-01

    The performance of single-pass and double-pass combined photovoltaic thermal collectors is analyzed with steady-state models. The working fluid is air, and the models are based on energy conservation at various nodes of the collector. Closed-form solutions have been obtained for the differential equations of both the single-pass and double-pass collectors. Comparisons are made between the performances of the two types of combined photovoltaic thermal collectors. The results show that the new design, the double-pass photovoltaic thermal collector, has superior performance. Important parameters for both types of collector are identified, and their effects on the performances of the two types of collectors are presented in detail. (author)

  5. Endogeneity in Strategy-Performance Analysis

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; B. Folta, Timothy

    2018-01-01

    Managers engage in a variety of strategies, not randomly, but having in mind their performance implications. Therefore, strategic choices are endogenous in performance equations. Despite increasing efforts by various scholars in solving endogeneity bias, prior attempts have almost exclusively ..., such as employees, strategic partners, customers, or investors, whose choices and preferences also affect the final decision. We discuss how endogeneity can plague the measurement of the performance effects of these two-sided strategic decisions, which are more complex, but more realistic, than prior representations ... of organizational decision making. We provide an empirical demonstration of possible methods to deal with three different sources of bias, by analyzing the performance effects of two human capital choices made by founders at startup: the size and average quality of the initial workforce ...

  6. Analysis Of Employee Engagement And Company Performance

    OpenAIRE

    Mekel, Peggy A.; Saerang, David P.E; Silalahi, Immanuel Maradopan

    2014-01-01

    Employees can be a competitive advantage for a company if the company manages them well. The success of a company can be seen from how it manages and engages its employees. Most big companies give their employees top priority in order to maintain top performance; they manage and engage their employees so that the employees generate high performance. In this study, employee engagement is the factor to examine ...

  7. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: METSAT (S/N) AMSU-A1 Receiver Assemblies P/N 1356429-1 S/N F06 and P/N 1356409-1 S/N F06

    Science.gov (United States)

    1999-01-01

    This is the Performance Verification Report, METSAT (S/N 109) AMSU-A1 Receiver Assemblies, P/N 1356429-1 S/N F06 and P/N 1356409 S/N F06, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).

  8. Style Analysis and Performance Evaluation of Dutch Mutual Funds

    NARCIS (Netherlands)

    Ter Horst, J.R.; Nijman, T.E.; de Roon, F.A.

    1998-01-01

    In this paper we show how style analysis of mutual funds can be used to circumvent the problem of self-reported investment styles, and to improve relative performance evaluation. Subsequently, we relate style analysis to performance evaluation and present results on the performance of Dutch mutual

  9. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  10. Single Crystal Diamond RF-FET Uniformity Performance Analysis

    Science.gov (United States)

    2017-03-01

    Single Crystal Diamond RF-FET Uniformity Performance Analysis. Pankaj B Shah, James Weil, Glen Birdwell, and Tony Ivanov, US Army Research ... dopant, non-uniform hydrogenation and subsurface polish damage. Transient switching analysis along with capacitance/conductance based interface ... hydrogenation, and then AFM and Raman analysis were performed to investigate the chemical and structural quality of the surface. Then the polished

  11. Performance Analysis using Coloured Petri Nets

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    an explicit separation between modelling the behaviour of a system and monitoring the behaviour of the model. As a result, cleaner and more understandable models can be created. The third paper presents a novel method for adding auxiliary information to coloured Petri net models. Coloured Petri net models ... in a very limited and predictable manner, and it is easy to enable and disable the auxiliary information. The fourth paper is a case study in which the performance of a web server was analysed using coloured Petri nets. This case study has shown that it is relatively easy to analyse the performance...

  12. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  13. Energetic performance analysis of drying agricultural products ...

    African Journals Online (AJOL)

    Renewable energy sources such as solar energy for drying purposes in a more effective and efficient way is inevitable for preservation of agricultural products in developing nations with inadequate access to electricity. This study investigates the effects of using a solar tracking device on the energy performance of drying ...

  14. Performance analysis of railway infrastructure and operations

    NARCIS (Netherlands)

    Hansen, I.A.; Wiggenraad, P.B.L.; Wolff, J.W.

    2013-01-01

    Research on performance assessment of railway networks and companies has been stimulated by the European policy of deregulation of transport markets, the opening of national railway networks and markets to new entrants and separation of infrastructure and train operation. Recent international

  15. COMPARATIVE ANALYSIS OF THE PERFORMANCE OF ...

    African Journals Online (AJOL)

    The growth in the good number of real-time and non-real-time applications has sparked a renewed interest in exploring resource allocation schemes that can be efficient and fair to all the applications in overloaded scenarios. In this paper, the performance of six scheduling algorithms for Long Term Evolution (LTE) downlink ...

  16. Method comparison of automated matching software-assisted cone-beam CT and stereoscopic kilovoltage x-ray positional verification image-guided radiation therapy for head and neck cancer: a prospective analysis.

    Science.gov (United States)

    Fuller, Clifton D; Scarbrough, Todd J; Sonke, Jan-Jakob; Rasch, Coen R N; Choi, Mehee; Ting, Joe Y; Wang, Samuel J; Papanikolaou, Niko; Rosenthal, David I

    2009-12-21

    We sought to characterize interchangeability and agreement between cone-beam computed tomography (CBCT) and digital stereoscopic kV x-ray (KVX) acquisition, two methods of isocenter positional verification currently used for IGRT of head and neck cancers (HNC). A cohort of 33 patients was near-simultaneously imaged by in-room KVX and CBCT. KVX and CBCT shifts were suggested using manufacturer software for the lateral (X), vertical (Y) and longitudinal (Z) dimensions. Intra-method repeatability, systematic and random error components were calculated for each imaging modality, as were recipe-based PTV expansion margins. Inter-method agreement in each axis was compared using limits of agreement (LOA) methodology, concordance analysis and orthogonal regression. One hundred daily positional assessments were performed before daily therapy in 33 patients with head and neck cancer. Systematic error was greater for CBCT in all axes, with larger random error components in the Y- and Z-axes. Repeatability ranged from 9 to 14 mm for all axes, with CBCT showing greater repeatability in two of three axes. LOA showed paired shifts to agree 95% of the time within +/-11.3 mm in the X-axis, +/-9.4 mm in the Y-axis and +/-5.5 mm in the Z-axis. Concordance ranged from 'mediocre' to 'satisfactory'. Proportional bias was noted between paired X- and Z-axis measures, with a constant bias component in the Z-axis. Our data suggest non-negligible differences in software-derived CBCT and KVX image-guided directional shifts using formal method comparison statistics.
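
    The limits-of-agreement computation referred to above is straightforward to reproduce. The following sketch uses made-up paired shifts rather than the study's data; the 1.96 multiplier assumes normally distributed differences.

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman 95% limits of agreement for paired measurements."""
    d = np.asarray(a) - np.asarray(b)        # paired differences
    bias = d.mean()                           # systematic offset between methods
    half_width = 1.96 * d.std(ddof=1)         # 95% interval half-width
    return bias, bias - half_width, bias + half_width

# e.g. paired X-axis shifts (mm) from two verification methods (invented data)
cbct = np.array([1.2, -0.5, 3.1, 0.8, -2.0])
kvx  = np.array([0.9, -0.1, 2.2, 1.5, -1.1])
print(limits_of_agreement(cbct, kvx))
```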

  17. Conscientiousness and Academic Performance: A Mediational Analysis

    Science.gov (United States)

    Conrad, Nicole; Patry, Marc W.

    2012-01-01

    Previous research has established that a relationship exists between the personality trait of conscientiousness and academic achievement. The current study extends prior research by using a path analysis model to explore various proximal traits that may mediate this relationship in a sample of two hundred and twenty three undergraduate university…

  18. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  19. Multi-canister overpack project -- verification and validation, MCNP 4A

    Energy Technology Data Exchange (ETDEWEB)

    Goldmann, L.H.

    1997-11-10

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of the new output files against the old output files. Any difference between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
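
    The install-time check described here, run the sample problems and compare new output files against stored references, is a generic regression pattern. The sketch below assumes a hypothetical directory layout and .out naming and skips timestamp-like lines, since, as the record notes, not every textual difference indicates a real problem.

```python
import difflib
import pathlib

def verify_outputs(new_dir, ref_dir, ignore=("date", "time")):
    """Compare each new output file against its stored reference.

    Lines containing any 'ignore' token (e.g. timestamps) are skipped,
    since such differences are not physics discrepancies.
    Returns a dict of file name -> first differing lines; empty means pass.
    """
    failures = {}
    for ref in pathlib.Path(ref_dir).glob("*.out"):
        new = pathlib.Path(new_dir) / ref.name
        if not new.exists():
            failures[ref.name] = ["missing output file"]
            continue
        keep = lambda lines: [l for l in lines
                              if not any(t in l.lower() for t in ignore)]
        diff = list(difflib.unified_diff(
            keep(ref.read_text().splitlines()),
            keep(new.read_text().splitlines()),
            lineterm=""))
        if diff:
            failures[ref.name] = diff[:10]   # keep only the first few lines
    return failures
```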

  20. Optical secure image verification system based on ghost imaging

    Science.gov (United States)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian

    2017-09-01

    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this principle, with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. Correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.

  1. Performance analysis for sparse support recovery

    CERN Document Server

    Tang, Gongguo

    2009-01-01

    In this paper, the performance of estimating the common support for jointly sparse signals based on their projections onto lower-dimensional space is analyzed. Support recovery is formulated as a multiple-hypothesis testing problem and both upper and lower bounds on the probability of error are derived for general measurement matrices, by using the Chernoff bound and Fano's inequality, respectively. The form of the upper bound shows that the performance is determined by a single quantity that is a measure of the incoherence of the measurement matrix, while the lower bound reveals the importance of the total measurement gain. To demonstrate its immediate applicability, the lower bound is applied to derive the minimal number of samples needed for accurate direction of arrival (DOA) estimation for an algorithm based on sparse representation. When applied to Gaussian measurement ensembles, these bounds give necessary and sufficient conditions to guarantee a vanishing probability of error for majority realizations...

  2. Performance Analysis of Tyler's Covariance Estimator

    Science.gov (United States)

    Soloveychik, Ilya; Wiesel, Ami

    2015-01-01

    This paper analyzes the performance of Tyler's M-estimator of the scatter matrix in elliptical populations. We focus on the non-asymptotic setting and derive the estimation error bounds depending on the number of samples n and the dimension p. We show that under quite mild conditions the squared Frobenius norm of the error of the inverse estimator decays like p^2/n with high probability.
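
    Tyler's M-estimator itself is computed by a simple fixed-point iteration. The sketch below is an illustrative implementation under the usual trace normalization (the overall scale of the scatter matrix is not identifiable); the iteration limit and tolerance are arbitrary choices, not values from the paper.

```python
import numpy as np

def tyler_scatter(X, max_iter=200, tol=1e-9):
    """Tyler's M-estimator of scatter via fixed-point iteration.

    X: (n, p) centred samples. The estimator solves
       Sigma = (p/n) * sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i),
    normalised here so that trace(Sigma) = p.
    """
    n, p = X.shape
    sigma = np.eye(p)
    for _ in range(max_iter):
        # Mahalanobis-type weights q_i = x_i^T Sigma^{-1} x_i
        q = np.einsum("ij,jk,ik->i", X, np.linalg.inv(sigma), X)
        new = (p / n) * (X / q[:, None]).T @ X
        new *= p / np.trace(new)             # fix the unidentifiable scale
        if np.linalg.norm(new - sigma, "fro") < tol:
            return new
        sigma = new
    return sigma
```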

  3. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  4. Metamaterial polarization converter analysis: limits of performance

    DEFF Research Database (Denmark)

    Markovich, Dmitry L.; Andryieuski, Andrei; Zalkovskij, Maksim

    2013-01-01

    In this paper, we analyze the theoretical limits of a metamaterial-based converter with orthogonal linear eigenpolarizations that allow linear-to-elliptical polarization transformation with any desired ellipticity and ellipse orientation. We employ the transmission line approach, providing a needed ... level of design generalization. Our analysis reveals that the maximal conversion efficiency for transmission through a single metamaterial layer is 50%, while the realistic reflection configuration can give a conversion efficiency of up to 90%. We show that a double layer transmission converter ... and a single layer with a ground plane can have 100% polarization conversion efficiency. We tested our conclusions numerically, reaching the designated limits of efficiency using a simple metamaterial design. Our general analysis provides useful guidelines for metamaterial polarization converter design...

  5. Fusion of PCA-Based and LDA-Based Similarity Measures for Face Verification

    Directory of Open Access Journals (Sweden)

    Kittler Josef

    2010-01-01

    Full Text Available The problem of fusing similarity measure-based classifiers is considered in the context of face verification. The performance of face verification systems using different similarity measures in two well-known appearance-based representation spaces, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), is experimentally studied. The study is performed for both manually and automatically registered face images. The experimental results confirm that our optimised Gradient Direction (GD) metric within the LDA feature space outperforms the other adopted metrics. Different methods of selection and fusion of the similarity measure-based classifiers are then examined. The experimental results demonstrate that the combined classifiers outperform any individual verification algorithm. In our studies, Support Vector Machines (SVMs) and Weighted Averaging of similarity measures appear to be the best fusion rules. Another interesting achievement of the work is that although features derived from the LDA approach lead to better results than those of the PCA algorithm for all the adopted scoring functions, fusing the PCA- and LDA-based scores improves the performance of the system.
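
    The fusion-by-weighted-averaging idea can be sketched with scikit-learn. The example below uses synthetic stand-in features, an equal fusion weight and an arbitrary acceptance threshold, all of which are assumptions; the paper's optimised Gradient Direction metric is replaced here by plain cosine similarity for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))              # stand-in face feature vectors
y = rng.integers(0, 10, size=200)           # subject labels (10 subjects)

pca = PCA(n_components=20).fit(X)           # unsupervised subspace
lda = LDA().fit(X, y)                       # supervised (discriminant) subspace

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def fused_score(probe, gallery, w=0.5):
    """Weighted average of PCA- and LDA-space similarity scores.
    (Score normalisation is omitted here for brevity; in practice the
    two score streams are normalised before fusion.)"""
    s_pca = cosine(pca.transform(probe[None])[0], pca.transform(gallery[None])[0])
    s_lda = cosine(lda.transform(probe[None])[0], lda.transform(gallery[None])[0])
    return w * s_pca + (1 - w) * s_lda

# Accept the identity claim if the fused score clears a chosen threshold.
print(fused_score(X[0], X[1]) > 0.2)        # threshold is an assumption
```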

  6. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... efficient for finding bugs in the railway interlocking designs. Additionally, benchmarking results comparing the performance of our approach with alternative verification techniques on the interlocking models are presented....

  7. Cost and Training Effectiveness Analysis Performance Guide

    Science.gov (United States)

    1980-07-23

    ... the CTEA assesses the likelihood of each plan producing soldiers trained to criterion level, or that level at which they perform the tasks to ...

  8. Performance analysis of ATM/DQDB interworking

    DEFF Research Database (Denmark)

    Christiansen, Henning; Kvols, Kenn

    1992-01-01

    The cell loss ratio and cell delay variation of a distributed-queue dual-bus (DQDB) network receiving traffic from a number of asynchronous transfer mode (ATM) connections are considered. Every connection carries either connection oriented or connectionless traffic. In the analysis of the access ...... to the bus, it is shown that consecutive service times of the local access queue are correlated. Two models, one of which includes the correlation, are presented. The correlation effect is illustrated and the models are evaluated by means of a number of simulation cases...

  9. Performance Analysis of Digital loudspeaker Arrays

    DEFF Research Database (Denmark)

    Pedersen, Bo Rohde; Kontomichos, Fotios; Mourjopoulos, John

    2008-01-01

    An analysis of digital loudspeaker arrays shows that the ways in which bits are mapped to the drivers influence the quality of the audio result. Specifically, a "bit-summed" rather than the traditional "bit-mapped" strategy greatly reduces the number of times drivers make binary transitions per...... period of the input frequency. Detailed simulations compare the results for a 32-loudspeaker array with a similar configuration with analog excitation of the drivers. Ideally, drivers in digital arrays should be very small and span a small area, but that sets limits on the low-frequency response...

  10. Hydrogen engine performance analysis. First annual report

    Energy Technology Data Exchange (ETDEWEB)

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1978-08-01

    Many problems associated with the design and development of hydrogen-air breathing internal combustion engines for automotive applications have been identified by various domestic and foreign researchers. This project addresses the problems identified in the literature, seeks to evaluate potential solutions to these problems, and will obtain and document a design database covering the performance, operational and emissions characteristics essential for making rational decisions regarding the selection and design of prototype hydrogen-fueled, airbreathing engines suitable for manufacture for general automotive use. Information is included on the operation, safety, emission, and cost characteristics of hydrogen engines, the selection of a test engine and testing facilities, and experimental results. Baseline data for throttled and unthrottled, carburetted, hydrogen engine configurations with and without exhaust gas recirculation and water injection are presented. In addition to basic data gathering concerning performance and emissions, the test program conducted was formulated to address in detail the two major problems that must be overcome if hydrogen-fueled engines are to become viable: flashback and comparatively high NOx emissions at high loads. In addition, the results of other hydrogen engine investigators were adjusted, using accepted methods, in order to make comparisons with the results of the present study. The comparisons revealed no major conflicts. In fact, with a few exceptions, there was found to be very good agreement between the results of the various studies.

  11. Biometric verification by cross-correlation analysis of 12-lead ECG patterns: Ranking of the most reliable peripheral and chest leads.

    Science.gov (United States)

    Krasteva, Vessela; Jekova, Irena; Abächerli, Roger

    Electrocardiogram (ECG)-based biometrics relies on the most stable and unique beat patterns, i.e. those with maximal intra-subject and minimal inter-subject waveform differences seen from different leads. We investigated methodology to evaluate those differences, aiming to rank the most prominent single and multi-lead ECG sets for biometric verification across a large population. A clinical standard 12-lead resting ECG database, including 460 pairs of remote recordings (recorded 1 year apart), was used. Inter-subject beat waveform differences were studied by cross-correlation and amplitude relations of average PQRST (500 ms) and QRS (100 ms) patterns, using 8 features per lead in 12 leads. Biometric verification models based on a stepwise linear discriminant classifier were trained on the first half of the records. The true verification rate (TVR) on the remaining test data was further reported as the common mean of the correctly verified equal subjects (true acceptance rate) and correctly rejected different subjects (true rejection rate). In single-lead ECG human identity applications, we found maximal TVR (87-89%) for the frontal plane leads (I, -aVR, II) within the (0-60°) sector. Other leads were ranked: inferior (85%), lateral to septal (82-81%), with an intermittent V3 drop (77.6%), suggesting anatomical landmark displacements. The ECG pattern view from multi-lead sets improved TVR: chest (91.3%), limb (94.6%), 12 leads (96.3%). Copyright © 2017 Elsevier Inc. All rights reserved.
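
    The cross-correlation comparison of averaged beat patterns at the heart of this approach can be sketched compactly. The templates and the 0.85 acceptance threshold below are assumptions for illustration, not the paper's operating point.

```python
import numpy as np

def ncc(a, b):
    """Peak normalised cross-correlation between two beat templates."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full").max()

def verify(enrolled, probe, threshold=0.85):
    """Accept the claimed identity if the probe's averaged 500 ms PQRST
    template correlates strongly with the enrolled template.
    The 0.85 threshold is an assumed operating point."""
    return ncc(enrolled, probe) >= threshold

# e.g. two synthetic templates from the same simulated subject
t = np.linspace(0, 1, 250)
enrolled = np.sin(12 * t) + 0.1 * np.random.default_rng(1).normal(size=250)
probe    = np.sin(12 * t) + 0.1 * np.random.default_rng(2).normal(size=250)
print(verify(enrolled, probe))
```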

  12. ATLAS Distributed Data Analysis: performance and challenges

    CERN Document Server

    Fassi, Farida; The ATLAS collaboration

    2015-01-01

    In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...

  13. ATLAS Distributed Data Analysis: challenges and performance

    CERN Document Server

    Fassi, Farida; The ATLAS collaboration

    2015-01-01

    In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...

  14. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  15. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  16. Formal Verification, Engineering and Business Value

    Directory of Open Access Journals (Sweden)

    Ralf Huuck

    2012-12-01

    Full Text Available How to apply automated verification technology such as model checking and static program analysis to millions of lines of embedded C/C++ code? How to package this technology in a way that it can be used by software developers and engineers who might have no background in formal verification? And how to convince business managers to actually pay for such software? This work addresses a number of those questions. Based on our own experience of developing and distributing the Goanna source code analyzer for detecting software bugs and security vulnerabilities in C/C++ code, we explain the underlying technology of model checking, static analysis and SMT solving, and the steps involved in creating industrial-strength tools.

  17. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
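
    At its core, acquisition path analysis enumerates routes through a directed graph of acquisition steps and ranks them so that verification effort can be focused on the most plausible paths. The toy sketch below illustrates only that ranking idea; the graph, step names and "difficulty" weights are all invented.

```python
# Minimal acquisition-path enumeration: nodes are notional acquisition
# steps, edge weights are invented "difficulty" scores.
GRAPH = {
    "start":            [("divert_material", 5), ("clandestine_prod", 8)],
    "divert_material":  [("weaponize", 6)],
    "clandestine_prod": [("weaponize", 4)],
    "weaponize":        [],                  # terminal step
}

def paths(node="start", cost=0, trail=("start",)):
    """Depth-first enumeration of all acquisition paths with total cost."""
    if not GRAPH[node]:                      # reached a terminal step
        yield cost, trail
    for nxt, weight in GRAPH[node]:
        yield from paths(nxt, cost + weight, trail + (nxt,))

# Rank paths: the cheapest path is where verification effort should focus.
for total, trail in sorted(paths()):
    print(total, " -> ".join(trail))
```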

  18. TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  19. Verification of the databases EXFOR and ENDF

    Directory of Open Access Journals (Sweden)

    Berton Gottfried

    2017-01-01

    Full Text Available The objective of this work is the verification of the large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL, …). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions and data in the resonance region of both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.

  20. Pedestrian flow simulation validation and verification techniques

    OpenAIRE

    Dridi, Mohamed H.

    2015-01-01

    For the verification and validation of microscopic simulation models of pedestrian flow, we have performed experiments for different kinds of facilities and sites where most conflicts and congestion happen, e.g. corridors, narrow passages, and crosswalks. To assess the validity of the model, the experimental conditions and simulation results should be compared with video recordings carried out under the same conditions as in real life, e.g. pedestrian flux and density distributions. The strategy in this techniqu...

  1. Data Packages for the Hanford Immobilized Low Activity Tank Waste Performance Assessment 2001 Version [SEC 1 THRU 5

    Energy Technology Data Exchange (ETDEWEB)

    MANN, F.M.

    2000-03-02

    Data packages supporting the 2001 Immobilized Low-Activity Waste Performance Assessment. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigations are provided. Verification and benchmarking packages for selected software codes are also provided.

  2. Analysis and performance prediction of Stirling cryogenerator

    Science.gov (United States)

    Ghosh, R.; Atrey, M. D.; Narayankhedkar, K. G.

    2002-05-01

    The ratio of the swept volume of the compression space to the swept volume of the expansion space is an important design parameter for the Stirling cryogenerator. The swept volume ratio can be varied by changing the diameter of the piston, the displacer, or both. In this paper, a cyclic simulation of the Stirling cycle has been carried out to predict the performance of a Stirling cryogenerator for varying swept volume ratios. The dimensions of the other components, such as the cooler, regenerator and condenser, are kept constant. For a given diameter of the displacer, the piston diameter has been optimized based on COP. An attempt has also been made to find the optimum combination of piston and displacer diameters for optimum COP of the cryogenerator. The results can be extended to find the best combination of the piston and displacer diameters for a given refrigerating load.

  3. Performance analysis of microphone array methods

    Science.gov (United States)

    Herold, Gert; Sarradj, Ennes

    2017-08-01

    Microphone array methods aim at the characterization of multiple simultaneously operating sound sources. However, existing data processing algorithms have been shown to yield different results when applied to the same input data. The present paper introduces a method for estimating the reliability of such algorithms. Using Monte Carlo simulations, data sets with random variation of selected parameters are generated. Four different microphone array methods are applied to analyze the simulated data sets. The calculated results are compared with the expected outcome, and the dependency of the reliability on several parameters is quantified. It is shown not only that the performance of a method depends on the given source distribution, but also that the methods differ in terms of their sensitivity to imperfect input data.
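
    The reliability-estimation scheme described here, simulate data sets with randomly varied parameters, run each method, and compare against the known ground truth, can be expressed generically. The interface below (a simulate callable returning data plus truth, and an RMS-error reliability metric) is a hypothetical sketch, not the authors' code.

```python
import numpy as np

def monte_carlo_reliability(methods, simulate, n_trials=1000, seed=0):
    """Monte Carlo comparison of estimation methods (illustrative sketch).

    simulate(rng) -> (data, truth): synthesises one data set with randomly
    varied source parameters and returns it with the known ground truth.
    methods: dict of name -> callable(data) returning an estimate.
    Reliability is summarised here as RMS error against the truth; any
    other comparison metric could be substituted.
    """
    rng = np.random.default_rng(seed)
    errors = {name: [] for name in methods}
    for _ in range(n_trials):
        data, truth = simulate(rng)
        for name, method in methods.items():
            errors[name].append(np.linalg.norm(method(data) - truth))
    return {name: float(np.sqrt(np.mean(np.square(e))))
            for name, e in errors.items()}
```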

  4. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  5. Nanomechanical analysis of high performance materials

    CERN Document Server

    2014-01-01

    This book is intended for researchers who are interested in investigating the nanomechanical properties of materials using advanced instrumentation techniques. The chapters of the book are written in an easy-to-follow format, just like solved examples. The book comprehensively covers a broad range of materials such as polymers, ceramics, hybrids, biomaterials, metal oxides, nanoparticles, minerals, carbon nanotubes and welded joints. Each chapter describes the application of techniques on the selected material and also mentions the methodology adopted for the extraction of information from the raw data. This is a unique book in which both equipment manufacturers and equipment users have contributed chapters. Novices will learn the techniques directly from the inventors and senior researchers will gain in-depth information on the new technologies that are suitable for advanced analysis. On the one hand, fundamental concepts that are needed to understand the nanomechanical behavior of materials is included in t...

  6. Actual target coverage after setup verification using surgical clips compared with external skin markers in postoperative breast cancer radiation therapy.

    Science.gov (United States)

    van der Salm, Anke; Murrer, Lars; Steenbakkers, Inge; Houben, Ruud; Boersma, Liesbeth J

    After changing from offline setup verification to online setup verification using external skin markers in breast cancer patients, we noticed an increase in localized acute skin toxicity beneath the markers. Also, in vivo 3-dimensional dose measurements showed deviations between the delivered and the planned dose distributions; therefore, we investigated the accuracy of setup verification using surgical clips in the tumor bed, with a focus on target coverage of the whole breast and tumor bed. Orthogonal kilovoltage images were acquired before every fraction in 35 breast cancer patients, deriving an online 3-dimensional setup error by matching on external skin markers. In retrospect, a rematch was performed using surgical clips. For 155 fractions (i.e., 5-6 fractions/patient), a cone beam computed tomography (CT) scan was available. The analysis concerned: (1) visibility of the clips, (2) migration of the clips, (3) comparison of setup errors according to both match methods, and (4) comparison of target coverage by recalculating the dose on the online setup-corrected cone beam CT scan with the patient setup according to both match methods. External validation of the surgical clip-based online setup verification was performed in 23 patients by analyzing kilovoltage images of 100 fractions obtained after treatment. All types of surgical clips could be visualized. The clip to center-of-mass distance decreased on average by 2 mm (standard deviation, 1) over the course of treatment. Setup differences between match methods were on average ... External validation in 23 patients showed reassuring setup errors ... Verification using surgical clips results in comparable setup corrections and target volume coverage as verification using skin markers. By omitting skin markers, acute skin toxicity beneath the markers is prevented. Copyright © 2017 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  7. Compendium of Arms Control Verification Proposals.

    Science.gov (United States)

    1982-03-01

    ... their military forces with a view to reducing the risks of the outbreak of war. As the benefits of such an agreement for each country ... complicate verification. Bank Credit: To check on the possibility that clandestine military expenditures might be channeled through the banking system in the guise of extensions of credit, it would be necessary to use similar methods to those employed concerning budget expenditures (i.e. trend analysis

  8. Long term energy performance analysis of Egbin thermal power ...

    African Journals Online (AJOL)

    This study is aimed at providing an energy performance analysis of the Egbin thermal power plant. The plant operates on a regenerative Rankine cycle with steam as its working fluid. The model equations were formulated based on some performance parameters used in power plant analysis. The considered criteria were plant ...

  9. Battery Technology Life Verification Test Manual Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Jon P. Christophersen

    2012-12-01

    The purpose of this Technology Life Verification Test (TLVT) Manual is to help guide developers in their effort to successfully commercialize advanced energy storage devices such as battery and ultracapacitor technologies. The experimental design and data analysis discussed herein are focused on automotive applications based on the United States Advanced Battery Consortium (USABC) electric vehicle, hybrid electric vehicle, and plug-in hybrid electric vehicle (EV, HEV, and PHEV, respectively) performance targets. However, the methodology can be equally applied to other applications as well. This manual supersedes the February 2005 version of the TLVT Manual (Reference 1). It includes criteria for statistically-based life test matrix designs as well as requirements for test data analysis and reporting. Calendar life modeling and estimation techniques, including a user's guide to the corresponding software tool, are now provided in the Battery Life Estimator (BLE) Manual (Reference 2).

  10. Finite Countermodel Based Verification for Program Transformation (A Case Study

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for the verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers the question of how the finite-countermodel method for safety verification might be used in Turchin's supercompilation. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.

  11. Optothermistor as a breakthrough in the quantification of lycopene content of thermally processed tomato-based foods: verification versus absorption spectrophotometry and high-performance liquid chromatography

    NARCIS (Netherlands)

    Bicanic, D.D.; Swarts, J.J.; Luterotti, S.; Helander, P.; Fogliano, V.; Enese, M.

    2005-01-01

    This study reports on the first use of the "optothermistor" as a novel, precise, fast, and low-cost detector of lycopene in a wide range of commercially available processed-tomato products. The quantitative performance of the new device was evaluated by comparing the data obtained to those acquired by absorption spectrophotometry and high-performance liquid chromatography.

  12. Verification: an enabler for model based data preparation

    Science.gov (United States)

    Schiavone, Patrick; Chagoya, Alexandre; Martin, Luc; Annezo, Vincent; Blanchemain, Alexis

    2013-06-01

    With technology node progress, the requirements on mask data preparation become more and more stringent. Standard long-range dose modulation is beginning to have difficulty meeting the specifications in terms of correction accuracy, and the so-called Model Based Data Preparation (MBDP) is gaining more and more interest as a way to maintain the required pattern fidelity. This type of correction, which often includes a geometry change on top of the dose modulation, cannot be checked conventionally using standard Mask Rule Check software tools. A new methodology and software tool to perform verification after model-based e-beam proximity correction is presented to overcome this issue. A basic functionality is verification at the shot level, taking into account the possible movement of the edges as well as the dose assignment. A second building block goes one step further: a model-based verification is performed over all the edges of the design, checking by simulation the deviation of the printed pattern from the target after correction. The verification tool is capable of identifying hot spots as well as deviations from the targeted design that occur with a very low frequency, making them almost impossible to spot without the systematic use of a verification tool. The verification can be inserted either in the mask-shop flow or at the semiconductor manufacturer, as a help for improving the OPC flow or as a complementary check to be run with the OPC check.
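
    A minimal sketch of the edge-deviation idea described above (a toy stand-in, not the authors' tool; the sample points, tolerance, and units are hypothetical): for each sampled target edge point, find the nearest simulated printed-contour point and flag sites whose deviation exceeds the tolerance.

        import numpy as np

        def flag_edge_deviations(target_pts, printed_pts, tol_nm=2.0):
            """Toy model-based check: for each sampled target edge point,
            compute the distance to the nearest simulated printed-contour
            point and report the site as a hot spot if it exceeds tol_nm."""
            target = np.asarray(target_pts, dtype=float)
            printed = np.asarray(printed_pts, dtype=float)
            # pairwise distances, shape (n_target, n_printed)
            d = np.linalg.norm(target[:, None, :] - printed[None, :, :], axis=2)
            dev = d.min(axis=1)  # deviation at each target sample
            return [(tuple(p), float(e)) for p, e in zip(target, dev) if e > tol_nm]

        # Hypothetical edge samples (nm)
        target = [(0, 0), (10, 0), (20, 0), (30, 0)]
        printed = [(0, 0.5), (10, 1.0), (20, 3.5), (30, 0.8)]
        print(flag_edge_deviations(target, printed))  # [((20.0, 0.0), 3.5)]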

  13. Overview of Code Verification

    Science.gov (United States)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast system and to the packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to, and without consideration of, any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and the special-purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three-way voting, and error-reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five-way voting, clock synchronization, interactive consistency, low-level broadcasting, and program loading, initialization, and schedule construction.

  14. Comparative analysis of the growth performance and haemolymph

    African Journals Online (AJOL)

    Dr Osondu

    Vol. 4 No. 2, 2011. COMPARATIVE ANALYSIS OF THE GROWTH PERFORMANCE AND HAEMOLYMPH .... Statistical analysis: data collected from the experiments were analyzed by one-way analysis of variance (ANOVA), and the Student-Newman-Keuls test was used for the ... bigger and grow faster than albino snails. The lack of.
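
    For readers unfamiliar with the analysis step named in this record, a minimal sketch of a one-way ANOVA in Python using scipy.stats.f_oneway follows; all measurements are hypothetical and the record's own data are not reproduced here.

        from scipy import stats

        # Hypothetical growth-performance measurements (e.g., weight gain in g)
        # for three snail groups; values are illustrative only.
        albino = [12.1, 13.4, 11.8, 12.9, 13.0]
        black = [14.2, 15.1, 14.8, 15.6, 14.9]
        hybrid = [13.0, 13.8, 13.5, 14.1, 13.2]

        f_stat, p_value = stats.f_oneway(albino, black, hybrid)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
        # A p-value below 0.05 indicates at least one group mean differs; a
        # post-hoc test (e.g., Student-Newman-Keuls) then locates the differences.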

  15. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice for ensuring the correctness of safety-critical software. However, these techniques are still not widely applied in industry, due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g., CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modeling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
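
    As a toy illustration of the model-checking step (not the CERN toolchain or its intermediate model; the states and events below are hypothetical), a safety property of the form AG !bad can be checked by explicit-state reachability:

        from collections import deque

        def check_safety(initial, transitions, is_bad):
            """Toy explicit-state model checking of a safety property
            (AG !bad): breadth-first search for a reachable bad state."""
            seen, queue = {initial}, deque([(initial, [initial])])
            while queue:
                state, path = queue.popleft()
                if is_bad(state):
                    return False, path  # counterexample trace
                for nxt in transitions(state):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, path + [nxt]))
            return True, None  # property holds on all reachable states

        # Hypothetical 2-variable PLC abstraction: state = (motor_on, guard_open)
        def transitions(s):
            motor, guard = s
            return {(not motor, guard), (motor, not guard)}  # toy events

        ok, trace = check_safety((False, False), transitions,
                                 is_bad=lambda s: s[0] and s[1])  # motor on, guard open
        print(ok, trace)  # False plus a trace reaching the unsafe state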

  16. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
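
    As a rough illustration of the fusion idea (not the actual near-minimal-probability-of-error algorithm; the scores, weights, and threshold are hypothetical), score-level fusion can be sketched as a weighted average of per-modality match scores:

        import numpy as np

        def fused_verification(scores, weights, threshold=0.5):
            """Toy score-level fusion: weighted average of per-modality
            match scores in [0, 1]; accept if the fused score passes the
            threshold. This only illustrates the fusion step, not the
            decision algorithm described in the record."""
            scores = np.asarray(scores, dtype=float)
            weights = np.asarray(weights, dtype=float)
            fused = np.dot(scores, weights) / weights.sum()
            return fused, fused >= threshold

        # Hypothetical match scores for hand, face, ear, and voice
        fused, accepted = fused_verification([0.72, 0.61, 0.55, 0.80],
                                             weights=[1.0, 2.0, 1.0, 1.5])
        print(f"fused score = {fused:.3f}, accept = {accepted}")  # ~0.671, True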

  17. Verification of precipitation forecasts from two numerical weather prediction models in the Middle Atlantic Region of the USA: A precursory analysis to hydrologic forecasting

    Science.gov (United States)

    Siddique, Ridwan; Mejia, Alfonso; Brown, James; Reed, Seann; Ahnert, Peter

    2015-10-01

    Accurate precipitation forecasts are required for accurate flood forecasting. The structures of different precipitation forecasting systems are constantly evolving, with improvements in forecasting techniques, increases in spatial and temporal resolution, improvements in model physics and numerical techniques, and better understanding of, and accounting for, predictive uncertainty. Hence, routine verification is necessary to understand the quality of forecasts as inputs to hydrologic modeling. In this study, we verify precipitation forecasts from the National Centers for Environmental Prediction (NCEP) 11-member Global Ensemble Forecast System Reforecast version 2 (GEFSRv2), as well as the 21-member Short Range Ensemble Forecast (SREF) system. Specifically, basin averaged precipitation forecasts are verified for different basin sizes (spatial scales) in the operating domain of the Middle Atlantic River Forecast Center (MARFC), using multi-sensor precipitation estimates (MPEs) as the observed data. The quality of the ensemble forecasts is evaluated conditionally upon precipitation amounts, forecast lead times, accumulation periods, and seasonality using different verification metrics. Overall, both GEFSRv2 and SREF tend to overforecast light to moderate precipitation and underforecast heavy precipitation. In addition, precipitation forecasts from both systems become increasingly reliable with increasing basin size and decreasing precipitation threshold, and the 24-hourly forecasts show slightly better skill than the 6-hourly forecasts. Both systems show a strong seasonal trend, characterized by better skill during the cool season than the warm season. Ultimately, the verification results lead to guidance on the expected quality of the precipitation forecasts, together with an assessment of their relative quality and unique information content, which is useful and necessary for their application in hydrologic forecasting.
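
    One widely used metric for this kind of ensemble verification is the Brier score for threshold exceedance; a minimal sketch follows (the forecasts, observations, and threshold are hypothetical, and the study's own metric suite is not reproduced here):

        import numpy as np

        def brier_score(ensemble_fcsts, observations, threshold):
            """Brier score for the event 'precipitation exceeds threshold':
            mean squared difference between the ensemble exceedance
            probability and the binary observed outcome (0 = perfect)."""
            fcst = np.asarray(ensemble_fcsts, dtype=float)  # (n_cases, n_members), mm
            obs = np.asarray(observations, dtype=float)     # (n_cases,), mm
            prob = (fcst > threshold).mean(axis=1)          # forecast probability per case
            outcome = (obs > threshold).astype(float)       # observed binary event
            return float(np.mean((prob - outcome) ** 2))

        # Hypothetical 4-member ensemble over 3 forecast cases (mm / 6 h)
        fcsts = [[2.0, 5.0, 7.0, 3.0],
                 [0.0, 1.0, 0.5, 2.0],
                 [12.0, 9.0, 15.0, 11.0]]
        obs = [6.0, 0.2, 14.0]
        print(brier_score(fcsts, obs, threshold=5.0))  # 0.1875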

  18. Automatic analysis of intrinsic positional verification films in brachytherapy using MATLAB; Analisis automatico de peliculas de verificacion posicional intrinsica en braqueterapia mediante MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.

    2011-07-01

    One of the essential tests in the quality control of brachytherapy afterloading equipment is the verification of the intrinsic positioning of the radioactive source. A classic method of evaluation is the use of x-ray film, measuring the distance between the marks left by autoradiography of the source and a reference. Our center has developed an automated measurement method based on scanning the radiochromic film and running a macro developed in MATLAB, in order to optimize time and reduce measurement uncertainty. The purpose of this paper is to describe the method developed, assess its uncertainty, and quantify its advantages over the manual method. (Author)
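
    A rough Python analogue of the film-analysis step (the original macro is in MATLAB; the threshold, pixel size, and synthetic image below are hypothetical) locates the marks as connected dark regions and measures the distance between their centres of mass:

        import numpy as np
        from scipy import ndimage

        def mark_positions(image, threshold, pixel_mm=0.1):
            """Locate dark autoradiograph marks on a scanned film image and
            return their centres of mass in mm. 'image' is a 2-D grayscale
            array; marks are assumed darker than 'threshold'."""
            mask = image < threshold
            labels, n = ndimage.label(mask)  # connected dark regions
            coms = ndimage.center_of_mass(mask, labels, range(1, n + 1))
            return np.asarray(coms) * pixel_mm

        # Synthetic film: white background with two dark square marks
        film = np.full((100, 100), 255.0)
        film[20:24, 30:34] = 10.0
        film[60:64, 30:34] = 10.0
        marks = mark_positions(film, threshold=100)
        print(np.linalg.norm(marks[1] - marks[0]))  # distance between marks in mm (~4.0)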

  19. 40 CFR 1065.341 - CVS and batch sampler verification (propane check).

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Flow-Related... engineering judgment and safe practices, this check may be performed using a gas other than propane, such as... components. (3) Poor mixing. Perform the verification as described in this section while traversing a...

  20. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian

    2009-01-01

    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e