WorldWideScience

Sample records for operation cross-platform validation

  1. Wide Angle Imaging Lidar (WAIL): Theory of Operation and Results from Cross-Platform Validation at the ARM Southern Great Plains Site

    Science.gov (United States)

    Polonsky, I. N.; Davis, A. B.; Love, S. P.

    2004-05-01

    WAIL was designed to determine physical and geometrical characteristics of optically thick clouds using the off-beam component of the lidar return that can be accurately modeled within the 3D photon diffusion approximation. The theory shows that the WAIL signal depends not only on the cloud optical characteristics (phase function, extinction and scattering coefficients) but also on the outer thickness of the cloud layer. This makes it possible to estimate the mean optical and geometrical thicknesses of the cloud. The comparison with Monte Carlo simulation demonstrates the high accuracy of the diffusion approximation for moderately to very dense clouds. During operation, WAIL is able to collect a complete data set from a cloud every few minutes, with averaging over a horizontal scale of a kilometer or so. In order to validate WAIL's ability to deliver cloud properties, the LANL instrument was deployed as part of the THickness from Off-beam Returns (THOR) validation IOP. The goal was to probe clouds above the SGP CART site at night in March 2002 from below (WAIL and ARM instruments) and from NASA's P3 aircraft (carrying THOR, the GSFC counterpart of WAIL) flying above the clouds. The permanent cloud instruments we used to compare with the results obtained from WAIL were ARM's laser ceilometer, micro-pulse lidar (MPL), millimeter-wavelength cloud radar (MMCR), and microwave radiometer (MWR). The comparison shows that, in spite of an unusually low cloud ceiling, an unfavorable observation condition for WAIL's present configuration, cloud properties obtained from the new instrument are in good agreement with their counterparts obtained by other instruments. So WAIL can duplicate, at least for single-layer clouds, the cloud products of the MWR and MMCR together. But WAIL does this with green laser light, which is far more representative than microwaves of photon transport processes at work in the climate system.
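
    As a point of reference, the photon diffusion approximation invoked above is commonly written in the following generic textbook form (not necessarily the authors' exact formulation):

    \[
    \frac{1}{c}\,\frac{\partial \Phi}{\partial t} \;-\; \nabla \cdot \big(D\,\nabla \Phi\big) \;+\; \sigma_a \Phi \;=\; S,
    \qquad
    D \;=\; \frac{1}{3\,\big[\sigma_a + (1-g)\,\sigma_s\big]},
    \]

    where \(\Phi(\mathbf{r},t)\) is the photon fluence rate, \(c\) the speed of light in the medium, \(\sigma_a\) and \(\sigma_s\) the absorption and scattering coefficients, \(g\) the asymmetry parameter of the phase function, and \(S\) the source term. The joint dependence of the off-beam return on the scattering coefficient and on the layer's geometrical thickness is what allows the mean optical and geometrical thicknesses to be retrieved together.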

  2. Cross-platform validation and analysis environment for particle physics

    Science.gov (United States)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    2017-11-01

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
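
    To make the description above concrete, the sketch below shows the kind of analysis-object validation such a framework automates: building four-vectors, reconstructing an invariant mass, and checking it against a reference value. It is plain illustrative Python with hypothetical names (LorentzVector, validate_sample), not the framework's actual API.

```python
# Illustrative sketch only: a toy validation pass, not the framework's API.
import math
import random

class LorentzVector:
    """Minimal four-vector with components (px, py, pz, E)."""
    def __init__(self, px, py, pz, e):
        self.px, self.py, self.pz, self.e = px, py, pz, e

    def __add__(self, other):
        return LorentzVector(self.px + other.px, self.py + other.py,
                             self.pz + other.pz, self.e + other.e)

    def mass(self):
        m2 = self.e**2 - (self.px**2 + self.py**2 + self.pz**2)
        return math.sqrt(max(m2, 0.0))

def validate_sample(events, expected_mean, tolerance):
    """Compare the mean invariant mass of lepton pairs with a reference value."""
    masses = [(p1 + p2).mass() for p1, p2 in events]
    mean = sum(masses) / len(masses)
    return abs(mean - expected_mean) <= tolerance

# Toy usage: back-to-back 45 GeV leptons should reconstruct to roughly 90 GeV.
events = []
for _ in range(1000):
    e = 45.0 + random.gauss(0, 0.5)
    events.append((LorentzVector(e, 0, 0, e), LorentzVector(-e, 0, 0, e)))
print(validate_sample(events, expected_mean=90.0, tolerance=1.0))
```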

  3. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Full Text Available Cross-platform is a concept that has become increasingly common in recent years, especially in the development of mobile apps, but it has also been used consistently over time in the development of conventional desktop applications. The notion of cross-platform software (multi-platform or platform-independent) refers to a software application that can run on more than one operating system or computing architecture. Thus, a cross-platform application can operate independently of the software or hardware platform on which it is executed. Since this generic definition admits a wide range of meanings, for the purposes of this paper we narrow it and use the following working definition: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) identically or in a similar way.

  4. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

    Full Text Available This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy, and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.
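
    As a rough illustration of the underlying idea, the sketch below hash-chains license records and checks both chain integrity and license membership. It is a toy in plain Python; the block fields and validation rule are assumptions, not the ReSOLV protocol.

```python
# Toy hash-chained license ledger; field names and rules are illustrative only.
import hashlib, json, time

def _hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class LicenseLedger:
    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64, "license": None, "ts": time.time()}]

    def issue(self, product_id, licensee_key):
        prev = self.chain[-1]
        block = {"index": prev["index"] + 1, "prev": _hash(prev),
                 "license": {"product": product_id, "key": licensee_key},
                 "ts": time.time()}
        self.chain.append(block)
        return _hash(block)

    def is_valid(self, product_id, licensee_key):
        # A license validates only if every link is intact and the record exists.
        for prev, block in zip(self.chain, self.chain[1:]):
            if block["prev"] != _hash(prev):
                return False
        return any(b["license"] == {"product": product_id, "key": licensee_key}
                   for b in self.chain[1:])

ledger = LicenseLedger()
ledger.issue("app-1.0", "customer-public-key")
print(ledger.is_valid("app-1.0", "customer-public-key"))  # True
```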

  5. Xamarin cross-platform application development

    CERN Document Server

    Peppers, Jonathan

    2015-01-01

    If you are a developer with experience in C# and are just getting into mobile development, this is the book for you. If you have experience with desktop applications or the Web, this book will give you a head start on cross-platform development.

  6. Cross platform development using Delphi and Kylix

    International Nuclear Information System (INIS)

    McDonald, J.L.; Nishimura, H.; Timossi, C.

    2002-01-01

    A cross-platform component for EPICS Simple Channel Access (SCA) has been developed for use with Delphi on Windows and Kylix on Linux. An EPICS controls GUI application developed on Windows runs on Linux by simply rebuilding it, and vice versa. This paper describes the technical details of the component.

  7. PR-PR: cross-platform laboratory automation system.

    Science.gov (United States)

    Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J

    2014-08-15

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features is available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  8. NMRFx Processor: a cross-platform NMR data processing program

    International Nuclear Information System (INIS)

    Norris, Michael; Fetler, Bayard; Marchant, Jan; Johnson, Bruce A.

    2016-01-01

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.
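
    For orientation only, the sketch below mimics the shape of such a processing script in plain Python/NumPy: read a FID, apodize, zero-fill, and Fourier transform. The function names and parameters are illustrative placeholders, not NMRFx Processor's actual script commands.

```python
# Generic NMR-style processing pipeline; names are placeholders, not NMRFx commands.
import numpy as np

def read_fid(path, n_points=4096):
    # Stand-in for loading a free induction decay; here we synthesize one.
    t = np.arange(n_points) / 10_000.0
    return np.exp(2j * np.pi * 440.0 * t) * np.exp(-t / 0.05)

def apodize(fid, lb=1.0, sw=10_000.0):
    t = np.arange(fid.size) / sw
    return fid * np.exp(-np.pi * lb * t)          # exponential line broadening

def zero_fill(fid, factor=2):
    return np.pad(fid, (0, fid.size * (factor - 1)))

def fourier_transform(fid):
    return np.fft.fftshift(np.fft.fft(fid))

spectrum = fourier_transform(zero_fill(apodize(read_fid("example.fid"))))
print(spectrum.shape)
```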

  9. NMRFx Processor: a cross-platform NMR data processing program

    Energy Technology Data Exchange (ETDEWEB)

    Norris, Michael; Fetler, Bayard [One Moon Scientific, Inc. (United States); Marchant, Jan [University of Maryland Baltimore County, Howard Hughes Medical Institute (United States); Johnson, Bruce A., E-mail: bruce.johnson@asrc.cuny.edu [One Moon Scientific, Inc. (United States)

    2016-08-15

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.

  10. Cross platform SCA component using C++ builder and KYLIX

    International Nuclear Information System (INIS)

    Nishimura, Hiroshi; Timossi, Chris; McDonald, James L.

    2003-01-01

    A cross-platform component for EPICS Simple Channel Access (SCA) has been developed. EPICS client programs with GUIs become portable at the C++ source-code level on both Windows and Linux by using Borland C++ Builder 6 and Kylix 3 on these platforms, respectively.

  11. Analysis of the development of cross-platform mobile applications

    OpenAIRE

    Pinedo Escribano, Diego

    2012-01-01

    The development of mobile phone applications is a huge market nowadays. There are many companies investing a lot of money to develop successful and profitable applications. The problem emerges when trying to develop an application to be used by every user independently of the platform they are using (Android, iOS, BlackBerry OS, Windows Phone, etc.). For this reason, in recent years many different technologies have appeared that make the development of cross-platform applications easier. In...

  12. Researching intimacy through social media: A cross-platform approach

    OpenAIRE

    Miguel, C

    2016-01-01

    This paper aims to contribute to the understanding of how to study the way people build intimacy and manage privacy through social media interaction. It explores the research design and methodology of a research project based on a multi-sited case study composed of three different social media platforms: Badoo, CouchSurfing, and Facebook. This cross-platform approach is useful to observe how intimacy is often negotiated across different platforms. The research project focuses on the cities of...

  13. A Cross-Platform Tactile Capabilities Interface for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Jie Ma

    2016-04-01

    Full Text Available This article presents the core elements of a cross-platform tactile capabilities interface (TCI) for humanoid arms. The aim of the interface is to reduce the cost of developing humanoid robot capabilities by supporting reuse through cross-platform deployment. The article presents a comparative analysis of existing robot middleware frameworks, as well as the technical details of the TCI framework that builds on the existing YARP platform. The TCI framework currently includes robot arm actuators with robot skin sensors. It presents such hardware in a platform-independent manner, making it possible to write robot control software that can be executed on different robots through the TCI framework. The TCI framework supports multiple humanoid platforms, and this article also presents a case study of a cross-platform implementation of a set of tactile protective withdrawal reflexes that have been realised on both the Nao and iCub humanoid robot platforms using the same high-level source code.

  14. Cross-platform digital assessment forms for evaluating surgical skills

    Directory of Open Access Journals (Sweden)

    Steven Arild Wuyts Andersen

    2015-04-01

    Full Text Available A variety of structured assessment tools for use in surgical training have been reported, but extant assessment tools often employ paper-based rating forms. Digital assessment forms for evaluating surgical skills could potentially offer advantages over paper-based forms, especially in complex assessment situations. In this paper, we report on the development of cross-platform digital assessment forms for use with multiple raters in order to facilitate the automatic processing of surgical skills assessments that include structured ratings. The FileMaker 13 platform was used to create a database containing the digital assessment forms, because this software has cross-platform functionality on both desktop computers and handheld devices. The database is hosted online, and the rating forms can therefore also be accessed through most modern web browsers. Cross-platform digital assessment forms were developed for the rating of surgical skills. The database platform used in this study was reasonably priced, intuitive for the user, and flexible. The forms have been provided online as free downloads that may serve as the basis for further development or as inspiration for future efforts. In conclusion, digital assessment forms can be used for the structured rating of surgical skills and have the potential to be especially useful in complex assessment situations with multiple raters, repeated assessments at various times and locations, and situations requiring substantial subsequent data processing or complex score calculations.

  15. Professional Cross-Platform Mobile Development in C#

    CERN Document Server

    Olson, Scott; Horgen, Ben; Goers, Kenny

    2012-01-01

    Develop mobile enterprise applications in a language you already know! With employees, rather than the IT department, now driving the decision of which devices to use on the job, many companies are scrambling to integrate enterprise applications. Fortunately, enterprise developers can now create apps for all major mobile devices using C#/.NET and Mono, languages most already know. A team of authors draws on their vast experience to teach you how to create cross-platform mobile applications, while delivering the same functionality to PCs, laptops and the web from a single technology platform.

  16. Researching intimacy through social media: A cross-platform approach

    Directory of Open Access Journals (Sweden)

    Cristina Miguel

    2016-06-01

    Full Text Available This paper aims to contribute to the understanding of how to study the way people build intimacy and manage privacy through social media interaction. It explores the research design and methodology of a research project based on a multi-sited case study composed of three different social media platforms: Badoo, CouchSurfing, and Facebook. This cross-platform approach is useful to observe how intimacy is often negotiated across different platforms. The research project focuses on the cities of Leeds (UK) and Barcelona (Spain). In particular, this article discusses the methods used to recruit participants and collect data for that study - namely, participant observation, semi-structured interviews, and user profile analysis. This cross-platform approach and multi-method research design is helpful to investigate the nature of intimacy practices facilitated by social media at several levels: online/offline, across different platforms, among different types of relationships, within both new and existing relationships, and in different locations.

  17. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  18. Cross-platform digital assessment forms for evaluating surgical skills

    DEFF Research Database (Denmark)

    Andersen, Steven Arild Wuyts

    2015-01-01

    A variety of structured assessment tools for use in surgical training have been reported, but extant assessment tools often employ paper-based rating forms. Digital assessment forms for evaluating surgical skills could potentially offer advantages over paper-based forms, especially in complex... assessment situations. In this paper, we report on the development of cross-platform digital assessment forms for use with multiple raters in order to facilitate the automatic processing of surgical skills assessments that include structured ratings. The FileMaker 13 platform was used to create a database... Cross-platform digital assessment forms were developed for the rating of surgical skills. The database platform used in this study was reasonably priced, intuitive for the user, and flexible. The forms have been provided online as free downloads that may serve as the basis for further development or as inspiration for future efforts. In conclusion...

  19. Cross-platform analysis of cancer microarray data improves gene expression based classification of phenotypes

    Directory of Open Access Journals (Sweden)

    Eils Roland

    2005-11-01

    Full Text Available Abstract Background The extensive use of DNA microarray technology in the characterization of the cell transcriptome is leading to an ever increasing amount of microarray data from cancer studies. Although similar questions for the same type of cancer are addressed in these different studies, a comparative analysis of their results is hampered by the use of heterogeneous microarray platforms and analysis methods. Results In contrast to a meta-analysis approach where results of different studies are combined on an interpretative level, we investigate here how to directly integrate raw microarray data from different studies for the purpose of supervised classification analysis. We use median rank scores and quantile discretization to derive numerically comparable measures of gene expression from different platforms. These transformed data are then used for training of classifiers based on support vector machines. We apply this approach to six publicly available cancer microarray gene expression data sets, which consist of three pairs of studies, each examining the same type of cancer, i.e. breast cancer, prostate cancer or acute myeloid leukemia. For each pair, one study was performed by means of cDNA microarrays and the other by means of oligonucleotide microarrays. In each pair, high classification accuracies (>85%) were achieved with training and testing on data instances randomly chosen from both data sets in a cross-validation analysis. To exemplify the potential of this cross-platform classification analysis, we use two leukemia microarray data sets to show that important genes with regard to the biology of leukemia are selected in an integrated analysis, which are missed in either single-set analysis. Conclusion Cross-platform classification of multiple cancer microarray data sets yields discriminative gene expression signatures that are found and validated on a large number of microarray samples, generated by different laboratories and
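
    As a toy illustration of the cross-platform idea described above (a per-sample rank/quantile transformation followed by joint SVM training), the following sketch uses NumPy and scikit-learn on synthetic data; it is not the authors' original pipeline, and every dataset detail is made up.

```python
# Toy cross-platform classification: quantile-discretize each platform, then
# train and cross-validate a single SVM on the pooled, transformed data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def quantile_discretize(expr, n_bins=8):
    """Replace each sample's values by per-sample quantile bins so that
    measurements from different platforms become numerically comparable."""
    ranks = expr.argsort(axis=1).argsort(axis=1)
    return np.floor(ranks * n_bins / expr.shape[1]).astype(int)

rng = np.random.default_rng(0)
# Two synthetic "platforms" measuring the same 200 genes on different scales.
x_cdna = rng.normal(0, 1, size=(40, 200))
x_oligo = rng.normal(5, 3, size=(40, 200))
y_cdna = np.repeat([0, 1], 20)
y_oligo = np.repeat([0, 1], 20)
# Shift the first 10 genes in class 1 to create a signal on both platforms.
x_cdna[y_cdna == 1, :10] += 2.0
x_oligo[y_oligo == 1, :10] += 6.0

x = np.vstack([quantile_discretize(x_cdna), quantile_discretize(x_oligo)])
y = np.concatenate([y_cdna, y_oligo])
print(cross_val_score(SVC(kernel="linear"), x, y, cv=5).mean())
```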

  20. Cross-platform comparison of microarray data using order restricted inference

    Science.gov (United States)

    Klinglmueller, Florian; Tuechler, Thomas; Posch, Martin

    2013-01-01

    Motivation Titration experiments measuring the gene expression from two different tissues, along with total RNA mixtures of the pure samples, are frequently used for quality evaluation of microarray technologies. Such a design implies that the true mRNA expression of each gene is either constant or follows a monotonic trend between the mixtures, lending itself to the use of order restricted inference procedures. Exploiting only the postulated monotonicity of titration designs, we propose three statistical analysis methods for the validation of high-throughput genetic data and corresponding preprocessing techniques. Results Our methods allow for inference of accuracy, repeatability and cross-platform agreement, with minimal required assumptions regarding the underlying data generating process. Therefore, they are readily applicable to all sorts of genetic high-throughput data independent of the degree of preprocessing. An application to the EMERALD dataset was used to demonstrate how our methods provide a rich spectrum of easily interpretable quality metrics and allow the comparison of different microarray technologies and normalization methods. The results are on par with previous work, but provide additional new insights that cast doubt on the utility of popular preprocessing techniques, specifically concerning the EMERALD project's dataset. Availability All datasets are available on EBI's ArrayExpress web site (http://www.ebi.ac.uk/microarray-as/ae/) under accession numbers E-TABM-536, E-TABM-554 and E-TABM-555. Source code implemented in C and R is available at: http://statistics.msi.meduniwien.ac.at/float/cross_platform/. Methods for testing and variance decomposition have been made available in the R-package orQA, which can be downloaded and installed from CRAN http://cran.r-project.org. PMID:21317143
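
    The monotonicity assumption at the heart of the titration design can be pictured with an isotonic fit, as in the sketch below (scikit-learn, synthetic numbers); the actual test statistics and variance decomposition live in the orQA package mentioned above, not here.

```python
# Sketch of the monotonicity idea behind titration designs using isotonic regression.
import numpy as np
from sklearn.isotonic import IsotonicRegression

def monotone_fit_error(mixture_fraction, expression):
    """Residual sum of squares of the best monotone (isotonic) fit.
    A small value is consistent with the assumption that expression varies
    monotonically across the RNA mixtures."""
    iso = IsotonicRegression(increasing="auto")
    fitted = iso.fit_transform(mixture_fraction, expression)
    return float(np.sum((expression - fitted) ** 2))

x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])          # mixture proportions
good_gene = np.array([1.0, 1.4, 2.1, 2.6, 3.0])    # roughly monotone
noisy_gene = np.array([1.0, 2.8, 0.9, 2.5, 1.2])   # clearly not monotone
print(monotone_fit_error(x, good_gene), monotone_fit_error(x, noisy_gene))
```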

  1. Validation of pig operations through pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Tolmasquim, Sueli Tiomno [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil); Nieckele, Angela O. [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Mecanica

    2005-07-01

    In the oil industry, pigging operations in pipelines have been largely applied for different purposes: pipe cleaning, inspection, liquid removal and product separation, among others. An efficient and safe pigging operation requires that a number of operational parameters, such as maximum and minimum pressures in the pipeline and pig velocity, be well evaluated during the planning stage and maintained within stipulated limits while the operation is carried out. With the objective of providing an efficient tool to assist in the control and design of pig operations through pipelines, a numerical code was developed, based on a finite difference scheme, which allows the simulation of two-fluid transient flow (liquid-liquid, gas-gas or liquid-gas) in the pipeline. Modules to automatically control process variables were included to employ different strategies to reach an efficient operation. Different test cases were investigated to corroborate the robustness of the methodology. To validate the methodology, the results obtained with the code were compared with a real liquid displacement operation of a section of the OSPAR oil pipeline, belonging to PETROBRAS, with 30'' diameter and 60 km length, presenting good agreement. (author)

  2. Task Characterisation and Cross-Platform Programming Through System Identification

    Directory of Open Access Journals (Sweden)

    Theocharis Kyriacou

    2005-12-01

    Full Text Available Developing robust and reliable control code for autonomous mobile robots is difficult, because the interaction between a physical robot and the environment is highly complex, it is subject to noise and variation, and therefore partly unpredictable. This means that to date it is not possible to predict robot behaviour, based on theoretical models. Instead, current methods to develop robot control code still require a substantial trial-and-error component to the software design process. Such iterative refinement could be reduced, we argue, if a more profound theoretical understanding of robot-environment interaction existed. In this paper, we therefore present a modelling method that generates a faithful model of a robot's interaction with its environment, based on data logged while observing a physical robot's behaviour. Because this modelling method — nonlinear modelling using polynomials — is commonly used in the engineering discipline of system identification, we refer to it here as “robot identification”. We show in this paper that using robot identification to obtain a computer model of robot-environment interaction offers several distinct advantages: (i) very compact representations (one-line programs) of the robot control program are generated; (ii) the model can be analysed, for example through sensitivity analysis, leading to a better understanding of the essential parameters underlying the robot's behaviour; and (iii) the generated, compact robot code can be used for cross-platform robot programming, allowing fast transfer of robot code from one type of robot to another. We demonstrate these points through experiments with a Magellan Pro and a Nomad 200 mobile robot.
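
    A much-simplified flavour of this "robot identification" idea, fitting a compact polynomial from logged sensor readings to the resulting motor command, is sketched below using scikit-learn on synthetic logs; the paper's actual polynomial system-identification procedure and the robots' real data are not reproduced here.

```python
# Toy "robot identification": fit a degree-2 polynomial mapping sensor readings
# to the turn rate a hand-written controller produced, from logged data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
# Synthetic stand-in for real robot logs: left/right sonar distances and the
# turn rate produced by a wall-following controller.
sonar = rng.uniform(0.2, 2.0, size=(500, 2))
diff = sonar[:, 0] - sonar[:, 1]
turn_rate = 0.8 * diff - 0.3 * diff ** 2

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(sonar, turn_rate)

# The fitted polynomial plays the role of the compact, transferable "one-line
# program": the same coefficients could drive another robot with equivalent sensors.
print(model.predict([[1.5, 0.5]]))
```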

  3. A cross-platform GUI to control instruments compliant with SCPI through VISA

    Science.gov (United States)

    Roach, Eric; Liu, Jing

    2015-10-01

    In nuclear physics experiments, it is necessary and important to control instruments from a PC, which automates many tasks that would otherwise require human operation. Not only does this make long-term measurements possible, but it also makes repetitive operations less error-prone. We created a graphical user interface (GUI) to control instruments connected to a PC through RS232, USB, LAN, etc. The GUI is developed using Qt Creator, a cross-platform integrated development environment, which makes it portable to various operating systems, including those commonly used in mobile devices. The NI-VISA library is used in the back end so that the GUI can be used to control instruments connected through various I/O interfaces without any modification. Commonly used SCPI commands can be sent to different instruments using buttons, sliders, knobs, and various other widgets provided by Qt Creator. As an example, we demonstrate how to set and fetch parameters and how to retrieve and display data from an Agilent Digital Storage Oscilloscope X3034A with the GUI. Our GUI can be easily used for other instruments compliant with SCPI and VISA with little or no modification.
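
    The command-level workflow that the GUI wraps can be approximated in a few lines with PyVISA, as sketched below; the resource address and the instrument-specific SCPI commands are assumptions for illustration (only *IDN? is a standard IEEE 488.2 query), and this is not the paper's Qt code.

```python
# Minimal command-line analogue of the GUI workflow, built on PyVISA.
import pyvisa

rm = pyvisa.ResourceManager()                             # NI-VISA (or pyvisa-py) backend
scope = rm.open_resource("TCPIP0::192.168.0.10::INSTR")   # hypothetical address
scope.timeout = 5000                                      # milliseconds

print(scope.query("*IDN?"))                   # identify the instrument (standard query)
scope.write(":TIMebase:SCALe 1e-3")           # set a parameter (instrument-specific command)
print(scope.query(":TIMebase:SCALe?"))        # read the parameter back

scope.close()
rm.close()
```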

  4. Cross-Platform Mobile Application Development: A Pattern-Based Approach

    Science.gov (United States)

    2012-03-01

    Master's Thesis. Cross-Platform Mobile Application Development: A Pattern-Based Approach, by Christian G. Acord. Approved for public release; distribution is unlimited. ...occurring design problems. We then discuss common approaches to mobile development, including common aspects of mobile application development, including...

  5. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation.

    Directory of Open Access Journals (Sweden)

    Wei Shen

    Full Text Available FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q files include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools only implement some of these manipulations, and not particularly efficiently, and some are only available for certain operating systems. Furthermore, the complicated installation process of required packages and running environments can render these programs less user-friendly. This paper describes a cross-platform ultrafast comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OS X, and can be directly used without any dependencies or pre-configurations. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on GitHub at https://github.com/shenwei356/seqkit.

  6. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
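
    The decoupled pattern described above can be caricatured as in the sketch below: a client turns a call into a code string, and a mock "server" that owns the engine executes it and returns the result. All names are hypothetical and this is not arc4nix's actual API; a real deployment would add transport, sandboxing, and job queuing.

```python
# Toy client/server split with code generated on the client and run remotely.
import json

def build_remote_call(function_name, **kwargs):
    """Meta-programming step: turn a local call into a code string."""
    args = ", ".join(f"{k}={v!r}" for k, v in kwargs.items())
    return f"result = {function_name}({args})"

def mock_server_execute(code, environment):
    """Stand-in for the remote server that owns the geospatial engine."""
    scope = dict(environment)
    exec(code, scope)            # a real system would sandbox and queue this
    return json.dumps(scope["result"])

# Pretend this function only exists on the server side.
server_env = {"buffer_raster": lambda path, distance: f"buffered({path}, {distance})"}

code = build_remote_call("buffer_raster", path="storm_rain.tif", distance=5000)
print(mock_server_execute(code, server_env))
```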

  7. Cross-Platform Learning Media Development of Software Installation on Computer Engineering and Networking Expertise Package

    Directory of Open Access Journals (Sweden)

    Afis Pratama

    2018-03-01

    Full Text Available Software installation is one of the important lessons that must be mastered by students of the computer and network engineering expertise package. However, students often lack attention and concentration while following the teaching and learning process in the software installation subject, a problem that needs an immediate solution. This research builds on continually advancing technology, which can be used as a tool to support learning activities. Currently, all grade 10 students in the public vocational high school (SMK) 8 Semarang, Indonesia, already have a gadget, either a smartphone or a laptop, and the intensity of usage is high. Based on this phenomenon, this research aims to create cross-platform learning media for software installation, which is practical and can easily be carried on a smartphone or a laptop with a different operating system. This media is thus expected to improve the learning outcomes, understanding and enthusiasm of the students in the software installation lesson.

  8. Virtual network computing: cross-platform remote display and collaboration software.

    Science.gov (United States)

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  9. Building cross-platform apps using Titanium, Alloy, and Appcelerator cloud services

    CERN Document Server

    Saunders, Aaron

    2014-01-01

    Skip Objective-C and Java to get your app to market faster, using the skills you already have. Building Cross-Platform Apps using Titanium, Alloy, and Appcelerator Cloud Services shows you how to build cross-platform iOS and Android apps without learning Objective-C or Java. With detailed guidance given toward using the Titanium Mobile Platform and Appcelerator Cloud Services, you will quickly develop the skills to build real, native apps, not web apps, using existing HTML, CSS, and JavaScript know-how. This guide takes you step-by-step through the creation of a photo-sharing app that leverages

  10. Query Health: standards-based, cross-platform population health surveillance.

    Science.gov (United States)

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites.

  11. KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery

    Science.gov (United States)

    Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan

    2013-05-01

    KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high throughput wide format video also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
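
    A toy version of the spatiotemporal tile-caching idea mentioned above is sketched below in Python: tiles keyed by (time, pyramid level, column, row) with least-recently-used eviction. The class and loader are hypothetical stand-ins, not KOLAM's actual data structure.

```python
# Toy spatiotemporal tile cache with least-recently-used eviction.
from collections import OrderedDict

class TileCache:
    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader           # function that reads a tile from disk/network
        self._tiles = OrderedDict()

    def get(self, t, level, col, row):
        key = (t, level, col, row)
        if key in self._tiles:
            self._tiles.move_to_end(key)          # mark as recently used
        else:
            self._tiles[key] = self.loader(*key)
            if len(self._tiles) > self.capacity:
                self._tiles.popitem(last=False)   # evict least-recently-used tile
        return self._tiles[key]

cache = TileCache(capacity=256, loader=lambda t, l, c, r: f"tile@{t}/{l}/{c}/{r}")
print(cache.get(0, 3, 10, 7))
```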

  12. MONO FOR CROSS-PLATFORM CONTROL SYSTEM ENVIRONMENT

    International Nuclear Information System (INIS)

    Nishimura, Hiroshi; Timossi, Chris

    2006-01-01

    Mono is an independent implementation of the .NET Framework by Novell that runs on multiple operating systems (including Windows, Linux and Macintosh) and allows any .NET-compatible application to run unmodified. For instance, Mono can run programs with graphical user interfaces (GUIs) developed with the C# language on Windows with Visual Studio (a full port of WinForm for Mono is in progress). We present the results of tests we performed to evaluate the portability of our control-system .NET applications from MS Windows to Linux.

  13. HipMatch: an object-oriented cross-platform program for accurate determination of cup orientation using 2D-3D registration of single standard X-ray radiograph and a CT volume.

    Science.gov (United States)

    Zheng, Guoyan; Zhang, Xuan; Steppacher, Simon D; Murphy, Stephen B; Siebenrock, Klaus A; Tannast, Moritz

    2009-09-01

    The widely used procedure of evaluating cup orientation following total hip arthroplasty using a single standard anteroposterior (AP) radiograph is known to be inaccurate, largely due to the wide variability in individual pelvic orientation relative to the X-ray plate. 2D-3D image registration methods have been introduced for an accurate determination of the post-operative cup alignment with respect to an anatomical reference extracted from the CT data. Although encouraging results have been reported, their widespread use in clinical routine is still limited. This may be explained by their requirement of a CAD model of the prosthesis, which is often difficult to obtain from the manufacturer due to proprietary issues, and by their requirement of either multiple radiographs or a radiograph-specific calibration, neither of which is available for most retrospective studies. To address these issues, we developed and validated an object-oriented cross-platform program called "HipMatch" where a hybrid 2D-3D registration scheme combining an iterative landmark-to-ray registration with a 2D-3D intensity-based registration was implemented to estimate a rigid transformation between a pre-operative CT volume and the post-operative X-ray radiograph for a precise estimation of cup alignment. No CAD model of the prosthesis is required. Quantitative and qualitative results evaluated on cadaveric and clinical datasets are given, which indicate the robustness and the accuracy of the program. HipMatch is written in the object-oriented programming language C++ using the cross-platform software Qt (TrollTech, Oslo, Norway), VTK, and Coin3D, and is transportable to any platform.

  14. Validation of MIPAS HNO3 operational data

    Directory of Open Access Journals (Sweden)

    C. D. Boone

    2007-09-01

    Full Text Available Nitric acid (HNO3) is one of the key products that are operationally retrieved by the European Space Agency (ESA) from the emission spectra measured by the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) onboard ENVISAT. The product version 4.61/4.62 for the observation period between July 2002 and March 2004 is validated by comparisons with a number of independent observations from ground-based stations, aircraft/balloon campaigns, and satellites. Individual HNO3 profiles of the ESA MIPAS level-2 product show good agreement with those of MIPAS-B and MIPAS-STR (the balloon and aircraft versions of MIPAS, respectively) and the balloon-borne infrared spectrometers MkIV and SPIRALE, mostly matching the reference data within the combined instrument error bars. In most cases differences between the correlative measurement pairs are less than 1 ppbv (5–10%) throughout the entire altitude range up to about 38 km (~6 hPa), and below 0.5 ppbv (15–20% or more) above 30 km (~17 hPa). However, differences up to 4 ppbv compared to MkIV have been found at high latitudes in December 2002 in the presence of polar stratospheric clouds. The degree of consistency is further largely affected by the temporal and spatial coincidence, and differences of 2 ppbv may be observed between 22 and 26 km (~50 and 30 hPa) at high latitudes near the vortex boundary, due to large horizontal inhomogeneity of HNO3. Similar features are also observed in the mean differences of the MIPAS ESA HNO3 VMRs with respect to the ground-based FTIR measurements at five stations, aircraft-based SAFIRE-A and ASUR, and the balloon campaign IBEX. The mean relative differences between the MIPAS and FTIR HNO3 partial columns are within ±2%, comparable to the MIPAS systematic error of ~2%. For the vertical profiles, the biases between the MIPAS and FTIR data are generally below 10% at altitudes of 10 to 30 km. The MIPAS and SAFIRE HNO3 data generally match within their total error

  15. A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises

    Science.gov (United States)

    O'Brien, Myles

    2012-01-01

    The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely-available Adobe Air has been installed on the computer. The exercises which the programs generate are…

  16. CROPPER: a metagene creator resource for cross-platform and cross-species compendium studies.

    Science.gov (United States)

    Paananen, Jussi; Storvik, Markus; Wong, Garry

    2006-09-22

    Current genomic research methods provide researchers with enormous amounts of data. Combining data from different high-throughput research technologies commonly available in biological databases can lead to novel findings and increase research efficiency. However, combining data from different heterogeneous sources is often a very arduous task. These sources can be different microarray technology platforms, genomic databases, or experiments performed on various species. Our aim was to develop a software program that could facilitate the combining of data from heterogeneous sources, and thus allow researchers to perform genomic cross-platform/cross-species studies and to use existing experimental data for compendium studies. We have developed a web-based software resource, called CROPPER that uses the latest genomic information concerning different data identifiers and orthologous genes from the Ensembl database. CROPPER can be used to combine genomic data from different heterogeneous sources, allowing researchers to perform cross-platform/cross-species compendium studies without the need for complex computational tools or the requirement of setting up one's own in-house database. We also present an example of a simple cross-platform/cross-species compendium study based on publicly available Parkinson's disease data derived from different sources. CROPPER is a user-friendly and freely available web-based software resource that can be successfully used for cross-species/cross-platform compendium studies.
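
    The core bookkeeping step that such a tool performs, joining platform-specific identifiers through an ortholog table so that measurements from different species or chips line up, can be pictured in a few lines of pandas, as below; the column names and the tiny mapping table are illustrative only and unrelated to CROPPER's actual implementation.

```python
# Toy cross-species identifier mapping via an ortholog table, using pandas.
import pandas as pd

human = pd.DataFrame({"probe": ["p1", "p2"], "human_gene": ["TP53", "SNCA"],
                      "expr_human": [2.3, 0.7]})
mouse = pd.DataFrame({"mouse_gene": ["Trp53", "Snca"], "expr_mouse": [1.9, 0.5]})
orthologs = pd.DataFrame({"human_gene": ["TP53", "SNCA"],
                          "mouse_gene": ["Trp53", "Snca"]})

# Join human probes to mouse measurements through the ortholog mapping.
merged = human.merge(orthologs, on="human_gene").merge(mouse, on="mouse_gene")
print(merged[["human_gene", "mouse_gene", "expr_human", "expr_mouse"]])
```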

  17. Automated validation of a computer operating system

    Science.gov (United States)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.

  18. Multi-task learning for cross-platform siRNA efficacy prediction: an in-silico study.

    Science.gov (United States)

    Liu, Qi; Xu, Qian; Zheng, Vincent W; Xue, Hong; Cao, Zhiwei; Yang, Qiang

    2010-04-10

    Gene silencing using exogenous small interfering RNAs (siRNAs) is now a widespread molecular tool for gene functional study and new-drug target identification. The key mechanism in this technique is to design efficient siRNAs that are incorporated into the RNA-induced silencing complexes (RISC) to bind and interact with the mRNA targets to repress their translation to proteins. Although considerable progress has been made in the computational analysis of siRNA binding efficacy, little joint analysis of different RNAi experiments conducted under different experimental scenarios has been done so far, although such joint analysis is an important issue in cross-platform siRNA efficacy prediction. A collective analysis of RNAi mechanisms for different datasets and experimental conditions can often provide new clues on the design of potent siRNAs. An elegant multi-task learning paradigm for cross-platform siRNA efficacy prediction is proposed. Experimental studies were performed on a large dataset of siRNA sequences which encompass several RNAi experiments recently conducted by different research groups. By using our multi-task learning method, the synergy among different experiments is exploited and an efficient multi-task predictor for siRNA efficacy prediction is obtained. The 19 most popular biological features for siRNA were ranked according to their joint importance in multi-task learning. Furthermore, the hypothesis is validated that the siRNA binding efficacies on different messenger RNAs (mRNAs) have different conditional distributions; thus, the multi-task learning can be conducted by viewing tasks at an "mRNA"-level rather than at the "experiment"-level. Such distribution diversity derived from siRNAs bound to different mRNAs helps indicate that the properties of the target mRNA have important implications on the siRNA binding efficacy. The knowledge gained from our study provides useful insights on how to analyze various cross-platform RNAi data for uncovering
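
    A bare-bones way to see what multi-task learning can buy over pooling or purely per-task fitting is the feature-augmentation trick sketched below (one shared feature block plus one block per task), here with scikit-learn's Ridge on synthetic siRNA-like features; it is a didactic stand-in, not the paper's method.

```python
# Toy multi-task learning by feature augmentation: shared weights plus
# per-task deviations, with one task per (hypothetical) target mRNA.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
n_tasks, n_per_task, n_feat = 4, 50, 19           # 19 features, as in the abstract
X, y, task = [], [], []
for t in range(n_tasks):
    Xt = rng.normal(size=(n_per_task, n_feat))
    w_shared = np.linspace(1, 0, n_feat)          # effect common to all tasks
    w_task = rng.normal(scale=0.3, size=n_feat)   # task-specific deviation
    y.append(Xt @ (w_shared + w_task) + rng.normal(scale=0.1, size=n_per_task))
    X.append(Xt)
    task.append(np.full(n_per_task, t))
X, y, task = np.vstack(X), np.concatenate(y), np.concatenate(task)

# Augmented design: one shared block of features plus one block per task.
aug = np.zeros((X.shape[0], n_feat * (n_tasks + 1)))
aug[:, :n_feat] = X
for t in range(n_tasks):
    rows = task == t
    aug[rows, n_feat * (t + 1):n_feat * (t + 2)] = X[rows]

print(Ridge(alpha=1.0).fit(aug, y).score(aug, y))
```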

  19. Multi-task learning for cross-platform siRNA efficacy prediction: an in-silico study

    Directory of Open Access Journals (Sweden)

    Xue Hong

    2010-04-01

    Full Text Available Abstract Background Gene silencing using exogenous small interfering RNAs (siRNAs) is now a widespread molecular tool for gene functional study and new-drug target identification. The key mechanism in this technique is to design efficient siRNAs that are incorporated into the RNA-induced silencing complexes (RISC) to bind and interact with the mRNA targets to repress their translation to proteins. Although considerable progress has been made in the computational analysis of siRNA binding efficacy, little joint analysis of different RNAi experiments conducted under different experimental scenarios has been done so far, although such joint analysis is an important issue in cross-platform siRNA efficacy prediction. A collective analysis of RNAi mechanisms for different datasets and experimental conditions can often provide new clues on the design of potent siRNAs. Results An elegant multi-task learning paradigm for cross-platform siRNA efficacy prediction is proposed. Experimental studies were performed on a large dataset of siRNA sequences which encompass several RNAi experiments recently conducted by different research groups. By using our multi-task learning method, the synergy among different experiments is exploited and an efficient multi-task predictor for siRNA efficacy prediction is obtained. The 19 most popular biological features for siRNA were ranked according to their joint importance in multi-task learning. Furthermore, the hypothesis is validated that the siRNA binding efficacies on different messenger RNAs (mRNAs) have different conditional distributions; thus, the multi-task learning can be conducted by viewing tasks at an "mRNA"-level rather than at the "experiment"-level. Such distribution diversity derived from siRNAs bound to different mRNAs helps indicate that the properties of the target mRNA have important implications on the siRNA binding efficacy. Conclusions The knowledge gained from our study provides useful insights on how to

  20. Cross-platform learning: on the nature of children's learning from multiple media platforms.

    Science.gov (United States)

    Fisch, Shalom M

    2013-01-01

    It is increasingly common for an educational media project to span several media platforms (e.g., TV, Web, hands-on materials), assuming that the benefits of learning from multiple media extend beyond those gained from one medium alone. Yet research typically has investigated learning from a single medium in isolation. This paper reviews several recent studies to explore cross-platform learning (i.e., learning from combined use of multiple media platforms) and how such learning compares to learning from one medium. The paper discusses unique benefits of cross-platform learning, a theoretical mechanism to explain how these benefits might arise, and questions for future research in this emerging field. Copyright © 2013 Wiley Periodicals, Inc., A Wiley Company.

  1. Learning by Doing: How to Develop a Cross-Platform Web App

    Directory of Open Access Journals (Sweden)

    Minh Q. Huynh

    2015-06-01

    Full Text Available As mobile devices become prevalent, there is always a need for apps. How hard is it to develop an app, especially a cross-platform one? The paper shares experience from a project involving the development of a student services web app that can run on mobile devices across platforms. The paper first describes the background of the project, the clients, and the proposed solution. Then, it focuses on the step-by-step development process and illustrates the code written and the techniques used. The goal is for readers to gain an understanding of how to develop a mobile-friendly web app. The paper concludes with teaching implications and offers thoughts for further development.

  2. Cross-Platform JavaScript Coding: Shifting Sand Dunes and Shimmering Mirages.

    Science.gov (United States)

    Merchant, David

    1999-01-01

    Most libraries don't have the resources to cross-platform and cross-version test all of their JavaScript coding. Many turn to WYSIWYG; however, WYSIWYG editors don't generally produce optimized coding. Web developers should: test their coding on at least one 3.0 browser, code by hand using tools to help speed that process up, and include a simple…

  3. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    Directory of Open Access Journals (Sweden)

    Anton Kos

    2016-04-01

    Full Text Available Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models, and the development of a cross-platform mobile application might be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help for cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters of more than 60 different smartphone models of different platforms. It is a modest but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants who would be able to check and compare their smartphone sensors against a large number of similar or identical models.
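
    The accelerometer bias and noise parameters mentioned above can be estimated from a short stationary recording roughly as in the NumPy sketch below; the gravity convention, axis layout, and the synthetic recording are assumptions for illustration, not the application's actual processing.

```python
# Estimate accelerometer bias and noise from a stationary recording.
import numpy as np

def bias_and_noise(samples, expected=np.array([0.0, 0.0, 9.81])):
    """samples: (N, 3) accelerometer readings in m/s^2 taken while the phone
    lies flat and still. Bias is the mean offset from the expected gravity
    vector; noise is the per-axis sample standard deviation."""
    samples = np.asarray(samples, dtype=float)
    bias = samples.mean(axis=0) - expected
    noise = samples.std(axis=0, ddof=1)
    return bias, noise

rng = np.random.default_rng(42)
fake_recording = rng.normal([0.05, -0.02, 9.83], 0.03, size=(2000, 3))
b, n = bias_and_noise(fake_recording)
print("bias:", b, "noise:", n)
```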

  4. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    Science.gov (United States)

    Kos, Anton; Tomažič, Sašo; Umek, Anton

    2016-01-01

    Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models, and the development of a cross-platform mobile application might be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help for cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters of more than 60 different smartphone models of different platforms. It is a modest but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants who would be able to check and compare their smartphone sensors against a large number of similar or identical models. PMID:27049391

  5. Operational validation - current status and opportunities for improvement

    International Nuclear Information System (INIS)

    Davey, E.

    2002-01-01

    The design of nuclear plant systems and operational practices is based on the application of multiple defenses to minimize the risk of occurrence of safety and production challenges and upsets. With such an approach, the effectiveness of individual or combinations of design and operational features in preventing upset challenges should be known. A longstanding industry concern is the adverse impact errors in human performance can have on plant safety and production. To minimize the risk of error occurrence, designers and operations staff routinely employ multiple design and operational defenses. However, the effectiveness of individual or combinations of defensive features in minimizing error occurrence are generally only known in a qualitative sense. More importantly, the margins to error or upset occurrence provided by combinations of design or operational features are generally not characterized during design or operational validation. This paper provides some observations and comments on current validation practice as it relates to operational human performance concerns. The paper also discusses opportunities for future improvement in validation practice in terms of the resilience of validation results to operating changes and characterization of margins to safety or production challenge. (author)

  6. Intent inferencing by an intelligent operator's associate - A validation study

    Science.gov (United States)

    Jones, Patricia M.

    1988-01-01

    In the supervisory control of a complex, dynamic system, one potential form of aiding for the human operator is a computer-based operator's associate. The design philosophy of the operator's associate is that of 'amplifying' rather than automating human skills. In particular, the associate possesses understanding and control properties. Understanding allows it to infer operator intentions and thus form the basis for context-dependent advice and reminders; control properties allow the human operator to dynamically delegate individual tasks or subfunctions to the associate. This paper focuses on the design, implementation, and validation of the intent inferencing function. Two validation studies are described which empirically demonstrate the viability of the proposed approach to intent inferencing.

  7. XML as a cross-platform representation for medical imaging with fuzzy algorithms.

    Science.gov (United States)

    Gal, Norbert; Stoicu-Tivadar, Vasile

    2011-01-01

    Machines that perform linguistic medical image interpretation are based on fuzzy algorithms. There are several frameworks that can edit and simulate fuzzy algorithms, but they are not compatible with most of the implemented applications. This paper proposes an XML representation for fuzzy algorithms, using the XML file as a cross-platform bridge between the simulation framework and the software applications. The paper presents a parsing algorithm that dynamically converts files created by the simulation framework into an XML file while keeping the original logical structure of the files.
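
    The paper does not reproduce its XML schema here, so the following Python sketch only illustrates the general idea of serializing a fuzzy rule base to a cross-platform XML file; the element and attribute names are invented for the example.

      import xml.etree.ElementTree as ET

      # A toy fuzzy rule base, roughly as it might come out of a simulation framework.
      # Structure, variable and term names are invented for the example.
      rules = [
          {"if": {"intensity": "high", "contrast": "low"}, "then": {"lesion": "likely"}},
          {"if": {"intensity": "low"}, "then": {"lesion": "unlikely"}},
      ]

      def rules_to_xml(rules):
          """Serialize the rule base to an XML string, keeping its logical structure."""
          root = ET.Element("fuzzyAlgorithm")
          for rule in rules:
              node = ET.SubElement(root, "rule")
              antecedent = ET.SubElement(node, "if")
              for variable, term in rule["if"].items():
                  ET.SubElement(antecedent, "condition", variable=variable, term=term)
              consequent = ET.SubElement(node, "then")
              for variable, term in rule["then"].items():
                  ET.SubElement(consequent, "conclusion", variable=variable, term=term)
          return ET.tostring(root, encoding="unicode")

      print(rules_to_xml(rules))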

  8. Pro Smartphone Cross-Platform Development IPhone, Blackberry, Windows Mobile, and Android Development and Distribution

    CERN Document Server

    Allen, Sarah; Lundrigan, Lee

    2010-01-01

    Learn the theory behind cross-platform development, and put the theory into practice with code using the invaluable information presented in this book. With in-depth coverage of development and distribution techniques for iPhone, BlackBerry, Windows Mobile, and Android, you'll learn the native approach to working with each of these platforms. With detailed coverage of emerging frameworks like PhoneGap and Rhomobile, you'll learn the art of creating applications that will run across all devices. You'll also be introduced to the code-signing process and the distribution of applications through t

  9. Following User Pathways: Cross Platform and Mixed Methods Analysis in Social Media Studies

    DEFF Research Database (Denmark)

    Hall, Margeret; Mazarakis, Athanasios; Peters, Isabella

    2016-01-01

    Social media and the resulting tidal wave of available data have changed the ways and methods researchers analyze communities at scale. But the full potential for social scientists (and others) is not yet achieved. Despite the popularity of social media analysis in the past decade, few researchers … is the mixed method approach (e.g. qualitative and quantitative methods) in order to better understand how users and society interact online. The workshop 'Following User Pathways' brings together a community of researchers and professionals to address methodological, analytical, conceptual, and technological challenges and opportunities of cross-platform, mixed method analysis in social media ecosystems.

  10. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    Full Text Available As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of the simulation practitioners. By analyzing the current scientific literature, it is observed that the operational validation description presented in many papers does not agree on the importance designated to this process and about its applied techniques, subjective or objective. With the expectation of orienting professionals, researchers and students in simulation, this article aims to elaborate a practical guide through the compilation of statistical techniques in the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated by using two study objects, which represent two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions

  11. JS-MS: a cross-platform, modular javascript viewer for mass spectrometry signals.

    Science.gov (United States)

    Rosen, Jebediah; Handy, Kyle; Gillan, André; Smith, Rob

    2017-11-06

    Despite the ubiquity of mass spectrometry (MS), data processing tools can be surprisingly limited. To date, there is no stand-alone, cross-platform 3-D visualizer for MS data. Available visualization toolkits require large libraries with multiple dependencies and are not well suited for custom MS data processing modules, such as MS storage systems or data processing algorithms. We present JS-MS, a 3-D, modular JavaScript client application for viewing MS data. JS-MS provides several advantages over existing MS viewers, such as a dependency-free, browser-based, one-click, cross-platform install and better navigation interfaces. The client includes a modular Java backend with a novel streaming .mzML parser to demonstrate the API-based serving of MS data to the viewer. JS-MS enables custom MS data processing and evaluation by providing fast, 3-D visualization with improved navigation and without dependencies. JS-MS is publicly available with a GPLv2 license at github.com/optimusmoose/jsms.
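
    JS-MS's parser is written in Java, but the streaming idea behind it can be sketched with Python's incremental XML parser, since mzML is an XML format whose spectra are stored in <spectrum> elements. Treat this as a conceptual outline, not the project's code.

      import xml.etree.ElementTree as ET

      def iter_spectra(path):
          """Yield (id, number of points) for each <spectrum> in an mzML file,
          parsing incrementally so the whole file is never held in memory."""
          for event, elem in ET.iterparse(path, events=("end",)):
              # mzML tags are namespace-qualified, so match on the local name only.
              if elem.tag.rsplit("}", 1)[-1] == "spectrum":
                  yield elem.get("id"), int(elem.get("defaultArrayLength", 0))
                  elem.clear()  # release the element once it has been handled

      # for spectrum_id, n_points in iter_spectra("example.mzML"):
      #     print(spectrum_id, n_points)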

  12. Cross-Platform Android/iOS-Based Smart Switch Control Middleware in a Digital Home

    Directory of Open Access Journals (Sweden)

    Guo Jie

    2015-01-01

    Full Text Available With technological and economic development, people's lives have improved substantially, especially their home environments. One of the key aspects of these improvements is home intellectualization, whose core is the smart home control system. Furthermore, as smartphones have become increasingly popular, we can use them to control the home system through Wi-Fi, Bluetooth, and GSM, so control with phones is more convenient and fast, and the phone has become the primary terminal controller in the smart home. In this paper, we propose middleware for developing a cross-platform Android/iOS-based solution for smart switch control software. We focus on the Wi-Fi-based communication protocols between the cellphone and the smart switch, implement a plugin-based smart switch function, define and implement the JavaScript interface, and then implement the cross-platform Android/iOS-based smart switch control software; the usage scenarios are also illustrated. Finally, tests were performed after the completed realization of the smart switch control system.

  13. The scheme and research of TV series multidimensional comprehensive evaluation on cross-platform

    Science.gov (United States)

    Chai, Jianping; Bai, Xuesong; Zhou, Hongjun; Yin, Fulian

    2016-10-01

    To address shortcomings of the traditional comprehensive evaluation system for TV programs, such as reliance on a single data source, neglect of new media, and the high time cost and difficulty of conducting surveys, this paper proposes a new evaluation of TV series from the perspective of cross-platform, multidimensional evaluation after broadcasting. The scheme takes the data collected directly from cable television and the Internet as its research objects. Based on the TOPSIS principle, the data are preprocessed and calculated into primary indicators that reflect different profiles of the viewing of TV series. After appropriate weighting and summation by six methods (PCA, AHP, etc.), the primary indicators form composite indices for different channels or websites. The scheme avoids the inefficiency and difficulty of surveys and manual scoring; at the same time, it not only reflects different dimensions of viewing, but also combines TV media and new media, completing the multidimensional comprehensive evaluation of TV series across platforms.
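
    The TOPSIS step itself is standard: normalize the indicator matrix, weight it, and score each alternative by its relative closeness to the ideal and anti-ideal points. A minimal sketch with generic benefit-type indicators and weights (the numbers are illustrative, not the paper's data):

      import numpy as np

      def topsis(matrix, weights):
          """Score alternatives (rows) on benefit-type criteria (columns)."""
          m = np.asarray(matrix, dtype=float)
          w = np.asarray(weights, dtype=float) / np.sum(weights)
          norm = m / np.linalg.norm(m, axis=0)         # vector-normalize each criterion
          v = norm * w                                 # weighted normalized matrix
          ideal, anti = v.max(axis=0), v.min(axis=0)   # ideal / anti-ideal points
          d_plus = np.linalg.norm(v - ideal, axis=1)
          d_minus = np.linalg.norm(v - anti, axis=1)
          return d_minus / (d_plus + d_minus)          # closeness coefficient in [0, 1]

      # Three TV series scored on three viewing indicators (made-up numbers).
      scores = topsis([[0.62, 1200, 0.45],
                       [0.48,  900, 0.61],
                       [0.71,  400, 0.38]],
                      weights=[0.5, 0.3, 0.2])
      print(scores, scores.argsort()[::-1])            # scores and ranking, best first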

  14. Validation of BN Reactor Plant Long-Term Operation

    International Nuclear Information System (INIS)

    Vilensky, O.; Vasilyev, B.; Kaidalov, V.

    2013-01-01

    The BN RP operation life time is mainly determined by resource of non-replaceable equipment. The new standard (RD) “Procedure of strength analysis for main components of sodium cooled fast neutron reactor plants” was developed to validate structure strength in view of radiation effects and degradation of material properties within the time period up to 300000 hours and under irradiation, as well as development of postulated crack-like defects. Using this RD, the extension of operation life of BN-600 reactor non-replaceable components from 30 to 45 years, as well as strength and durability of the most loaded non-replaceable components of BN-800 RP under construction were validated for the specified 45-year operation life. Wider application of steel 16Cr-11Ni-3Mo refers to new decisions in BN-1200 RP design that allow increasing of operation life of the most loaded non-replaceable components up to 60 years. High-chromium steel 12Cr-Ni-Mo-V-Nb is a new material, which was proposed for SG design to increase the operation life up to 30 years. In addition, the austenitic steels 18Cr-9Ni and 16Cr-11Ni-3Mo are now under upgrading for future application of them in commercial BN-1200 RP. To provide additional long-term reliable and safe operation of BN-1200 RP equipment and pipelines, it is planned to develop and implement the lifetime operational monitoring system

  15. A Human Proximity Operations System test case validation approach

    Science.gov (United States)

    Huber, Justin; Straub, Jeremy

    A Human Proximity Operations System (HPOS) poses numerous risks in a real-world environment. These risks range from mundane tasks such as avoiding walls and fixed obstacles to the critical need to keep people and processes safe in the context of the HPOS's situation-specific decision making. Validating the performance of an HPOS, which must operate in a real-world environment, is an ill-posed problem due to the complexity that is introduced by erratic (non-computer) actors. In order to prove the HPOS's usefulness, test cases must be generated to simulate possible actions of these actors, so the HPOS can be shown to be able to perform safely in the environments where it will be operated. The HPOS must demonstrate its ability to be as safe as a human across a wide range of foreseeable circumstances. This paper evaluates the use of test cases to validate HPOS performance and utility. It considers an HPOS's safe performance in the context of a common human activity, moving through a crowded corridor, and extrapolates (based on this) to the suitability of using test cases for AI validation in other areas of prospective application.

  16. A cross-platform solution for light field based 3D telemedicine.

    Science.gov (United States)

    Wang, Gengkun; Xiang, Wei; Pickering, Mark

    2016-03-01

    Current telehealth services are dominated by conventional 2D video conferencing systems, which are limited in their capabilities in providing a satisfactory communication experience due to the lack of realism. The "immersiveness" provided by 3D technologies has the potential to promote telehealth services to a wider range of applications. However, conventional stereoscopic 3D technologies are deficient in many aspects, including low resolution and the requirement for complicated multi-camera setup and calibration, and special glasses. The advent of light field (LF) photography enables us to record light rays in a single shot and provide glasses-free 3D display with continuous motion parallax in a wide viewing zone, which is ideally suited for 3D telehealth applications. As far as our literature review suggests, there have been no reports of 3D telemedicine systems using LF technology. In this paper, we propose a cross-platform solution for a LF-based 3D telemedicine system. Firstly, a novel system architecture based on LF technology is established, which is able to capture the LF of a patient, and provide an immersive 3D display at the doctor site. For 3D modeling, we further propose an algorithm which is able to convert the captured LF to a 3D model with a high level of detail. For the software implementation on different platforms (i.e., desktop, web-based and mobile phone platforms), a cross-platform solution is proposed. Demo applications have been developed for 2D/3D video conferencing, 3D model display and edit, blood pressure and heart rate monitoring, and patient data viewing functions. The demo software can be extended to multi-discipline telehealth applications, such as tele-dentistry, tele-wound and tele-psychiatry. The proposed 3D telemedicine solution has the potential to revolutionize next-generation telemedicine technologies by providing a high quality immersive tele-consultation experience. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. Browser App Approach: Can It Be an Answer to the Challenges in Cross-Platform App Development?

    Science.gov (United States)

    Huynh, Minh; Ghimire, Prashant

    2017-01-01

    Aim/Purpose: As smartphones proliferate, many different platforms begin to emerge. The challenge to developers as well as IS [Information Systems] educators and students is how to learn the skills to design and develop apps to run on cross-platforms. Background: For developers, the purpose of this paper is to describe an alternative to the complex…

  18. Introducing StatHand: A cross-platform mobile application to support students’ statistical decision making

    Directory of Open Access Journals (Sweden)

    Peter James Allen

    2016-02-01

    Full Text Available Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students’ statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.

  19. Introducing StatHand: A Cross-Platform Mobile Application to Support Students' Statistical Decision Making.

    Science.gov (United States)

    Allen, Peter J; Roberts, Lynne D; Baughman, Frank D; Loxton, Natalie J; Van Rooy, Dirk; Rock, Adam J; Finlay, James

    2016-01-01

    Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer) are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students' statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.
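
    A decision tree of this kind reduces to a sequence of annotated questions ending in a recommendation. The sketch below shows one simple way to represent and walk such a tree in Python; it is a toy fragment, not StatHand's actual decision logic.

      # A tiny, simplified test-selection tree; StatHand's real tree covers far
      # more designs, assumptions and follow-up guidance.
      TREE = {
          "question": "Are you comparing group means or measuring association?",
          "answers": {
              "comparing means": {
                  "question": "How many groups?",
                  "answers": {
                      "two": "independent-samples t test (after checking assumptions)",
                      "three or more": "one-way ANOVA",
                  },
              },
              "association": "Pearson or Spearman correlation, depending on the data",
          },
      }

      def walk(node):
          """Ask each question in turn until a recommendation (a string) is reached."""
          while isinstance(node, dict):
              print(node["question"])
              for option in node["answers"]:
                  print("  -", option)
              node = node["answers"][input("> ").strip()]
          return node

      # print(walk(TREE))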

  20. ATRIUM™ 11 – Validation of performanceand value for BWR operations

    Energy Technology Data Exchange (ETDEWEB)

    Colet, S.; Garner, N.L.; Graebert, R.; Koch, R.; Mollard, P.

    2015-07-01

    AREVA's ATRIUM™ 11 advanced fuel design for Boiling Water Reactors (BWRs) is the result of a product development program designed to realize a strict set of performance and reliability objectives complying with industry market demand. The validation of ATRIUM™ 11 performance is given by the now completed out-of-pile thermal-hydraulic and mechanical tests, the results of poolside examinations from the initiated lead fuel assembly programs, and the results of fuel cycle analyses benefiting from enhanced fuel reliability and operational flexibility. The coming three years will complete the in-service qualification program, leading to the anticipated reload deliveries in Europe in 2018 and to reload readiness in the US in 2019. The ATRIUM™ 11 Lead Fuel Assembly program has been running in Europe since 2012 and in the USA since 2015, and the first irradiation experience data show the expected results in terms of mechanical and thermal-mechanical behavior as well as levels of corrosion. The large gains in fuel cycle economy from switching from 10x10 fuel to ATRIUM™ 11 fuel are illustrated specifically for a 1300 MWe US-type reactor featuring a symmetric lattice and operated on a 24-month cycle. The analytical tools necessary to support cycle design and licensing were initiated in parallel with the product development and, where required by regulatory authorities, submitted for review in time to allow for approval in parallel with the completion of the in-service qualification program. (Author)

  1. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by

  2. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by

  3. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    International Nuclear Information System (INIS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-01-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon–electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783–97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48–0.53% for the electron beam cases and 0.15–0.17% for the photon beam cases. In terms of efficiency, goMC was ∼4–16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was
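
    The agreement figure quoted in these abstracts, the average dose difference over voxels receiving more than 10% of the maximum dose, is straightforward to compute from two dose grids. A NumPy sketch of that metric follows; normalization by the maximum dose is one common convention and an assumption here, and the engine itself is not reproduced.

      import numpy as np

      def mean_relative_difference(dose_ref, dose_test, threshold=0.10):
          """Average |ref - test| over voxels with ref > threshold * max(ref),
          expressed relative to the maximum reference dose."""
          ref = np.asarray(dose_ref, dtype=float)
          test = np.asarray(dose_test, dtype=float)
          d_max = ref.max()
          mask = ref > threshold * d_max                 # high-dose region only
          return np.abs(ref[mask] - test[mask]).mean() / d_max

      # Two toy dose grids differing by ~0.5% noise
      ref = np.random.rand(40, 40, 40) * 2.0
      test = ref * (1.0 + np.random.normal(0.0, 0.005, ref.shape))
      print(f"{100 * mean_relative_difference(ref, test):.2f}%")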

  4. Geophysical validation of MIPAS-ENVISAT operational ozone data

    Directory of Open Access Journals (Sweden)

    U. Cortesi

    2007-09-01

    Full Text Available The Michelson Interferometer for Passive Atmospheric Sounding (MIPAS, on-board the European ENVIronmental SATellite (ENVISAT launched on 1 March 2002, is a middle infrared Fourier Transform spectrometer measuring the atmospheric emission spectrum in limb sounding geometry. The instrument is capable to retrieve the vertical distribution of temperature and trace gases, aiming at the study of climate and atmospheric chemistry and dynamics, and at applications to data assimilation and weather forecasting. MIPAS operated in its standard observation mode for approximately two years, from July 2002 to March 2004, with scans performed at nominal spectral resolution of 0.025 cm−1 and covering the altitude range from the mesosphere to the upper troposphere with relatively high vertical resolution (about 3 km in the stratosphere. Only reduced spectral resolution measurements have been performed subsequently. MIPAS data were re-processed by ESA using updated versions of the Instrument Processing Facility (IPF v4.61 and v4.62 and provided a complete set of level-2 operational products (geo-located vertical profiles of temperature and volume mixing ratio of H2O, O3, HNO3, CH4, N2O and NO2 with quasi continuous and global coverage in the period of MIPAS full spectral resolution mission. In this paper, we report a detailed description of the validation of MIPAS-ENVISAT operational ozone data, that was based on the comparison between MIPAS v4.61 (and, to a lesser extent, v4.62 O3 VMR profiles and a comprehensive set of correlative data, including observations from ozone sondes, ground-based lidar, FTIR and microwave radiometers, remote-sensing and in situ instruments on-board stratospheric aircraft and balloons, concurrent satellite sensors and ozone fields assimilated by the European Center for Medium-range Weather Forecasting.

    A coordinated effort was carried out

  5. Reliability and Validity of Qualitative and Operational Research Paradigm

    Directory of Open Access Journals (Sweden)

    Muhammad Bashir

    2008-01-01

    Full Text Available Both qualitative and quantitative paradigms try to find the same result: the truth. Qualitative studies are tools used in understanding and describing the world of human experience. Since we maintain our humanity throughout the research process, it is largely impossible to escape the subjective experience, even for the most experienced of researchers. Reliability and validity are issues that have been described in great detail by advocates of quantitative research. The validity and the norms of rigor that are applied to quantitative research are not entirely applicable to qualitative research. Validity in qualitative research means the extent to which the data are plausible, credible and trustworthy, and thus can be defended when challenged. Reliability and validity remain appropriate concepts for attaining rigor in qualitative research. Qualitative researchers have to salvage responsibility for reliability and validity by implementing verification strategies that are integral and self-correcting during the conduct of the inquiry itself. This ensures the attainment of rigor using strategies inherent within each qualitative design, and moves the responsibility for incorporating and maintaining reliability and validity from external reviewers' judgments to the investigators themselves. There are different opinions on validity, with some suggesting that the concept of validity is incompatible with qualitative research and should be abandoned, while others argue that efforts should be made to ensure validity so as to lend credibility to the results. This paper is an attempt to clarify the meaning and use of reliability and validity in the qualitative research paradigm.

  6. Test validation of nuclear and fossil fuel control operators

    International Nuclear Information System (INIS)

    Moffie, D.J.

    1976-01-01

    To establish job relatedness, one must go through a procedure of concurrent and predictive validation. For concurrent validity a group of employees is tested and the test scores are related to performance concurrently or during the same time period. For predictive validity, individuals are tested but the results of these tests are not used at the time of employment. The tests are sealed and scored at a later date, and then related to job performance. Job performance data include ratings by supervisors, actual job performance indices, turnover, absenteeism, progress in training, etc. The testing guidelines also stipulate that content and construct validity can be used

  7. Validation of MIPAS-ENVISAT NO2 operational data

    Directory of Open Access Journals (Sweden)

    R. Ruhnke

    2007-06-01

    Full Text Available The Michelson Interferometer for Passive Atmospheric Sounding (MIPAS instrument was launched aboard the environmental satellite ENVISAT into its sun-synchronous orbit on 1 March 2002. The short-lived species NO2 is one of the key target products of MIPAS that are operationally retrieved from limb emission spectra measured in the stratosphere and mesosphere. Within the MIPAS validation activities, a large number of independent observations from balloons, satellites and ground-based stations have been compared to European Space Agency (ESA version 4.61 operational NO2 data comprising the time period from July 2002 until March 2004 where MIPAS measured with full spectral resolution. Comparisons between MIPAS and balloon-borne observations carried out in 2002 and 2003 in the Arctic, at mid-latitudes, and in the tropics show a very good agreement below 40 km altitude with a mean deviation of roughly 3%, virtually without any significant bias. The comparison to ACE satellite observations exhibits only a small negative bias of MIPAS which appears not to be significant. The independent satellite instruments HALOE, SAGE II, and POAM III confirm in common for the spring-summer time period a negative bias of MIPAS in the Arctic and a positive bias in the Antarctic middle and upper stratosphere exceeding frequently the combined systematic error limits. In contrast to the ESA operational processor, the IMK/IAA retrieval code allows accurate inference of NO2 volume mixing ratios under consideration of all important non-LTE processes. Large differences between both retrieval results appear especially at higher altitudes, above about 50 to 55 km. These differences might be explained at least partly by non-LTE under polar winter conditions but not at mid-latitudes. Below this altitude region mean differences between both processors remain within 5% (during night and up to 10% (during day under undisturbed (September 2002 conditions and up to 40% under perturbed

  8. PySpline: A Modern, Cross-Platform Program for the Processing of Raw Averaged XAS Edge and EXAFS Data

    International Nuclear Information System (INIS)

    Tenderholt, Adam; Hedman, Britt; Hodgson, Keith O.

    2007-01-01

    PySpline is a modern computer program for processing raw averaged XAS and EXAFS data using an intuitive approach which allows the user to see the immediate effect of various processing parameters on the resulting k- and R-space data. The Python scripting language and Qt and Qwt widget libraries were chosen to meet the design requirement that it be cross-platform (i.e. versions for Windows, Mac OS X, and Linux). PySpline supports polynomial pre- and post-edge background subtraction, splining of the EXAFS region with a multi-segment polynomial spline, and Fast Fourier Transform (FFT) of the resulting k3-weighted EXAFS data
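
    The processing steps named above, polynomial pre-edge background subtraction and a Fourier transform of the k3-weighted EXAFS, can be outlined in a few lines of NumPy. This is a conceptual sketch under simplified assumptions (fixed pre-edge window, uniform k-grid), not PySpline's implementation.

      import numpy as np

      def subtract_pre_edge(energy, mu, edge, order=2, window=30.0):
          """Fit a polynomial to the pre-edge region and subtract it everywhere."""
          pre = energy < edge - window                 # pre-edge points (illustrative cut)
          coeffs = np.polyfit(energy[pre], mu[pre], order)
          return mu - np.polyval(coeffs, energy)

      def k3_weighted_fft(k, chi, n=2048):
          """Fourier transform of k^3-weighted chi(k) on a uniform k-grid."""
          spectrum = np.fft.rfft(chi * k**3, n=n)
          dk = k[1] - k[0]
          r = np.fft.rfftfreq(n, d=dk) * np.pi         # chi ~ sin(2kR), so R = pi * freq
          return r, np.abs(spectrum)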

  9. A Neural Networks Based Operation Guidance System for Procedure Presentation and Validation

    International Nuclear Information System (INIS)

    Seung, Kun Mo; Lee, Seung Jun; Seong, Poong Hyun

    2006-01-01

    In this paper, a neural-network-based operator support system is proposed to reduce operators' errors in abnormal situations in nuclear power plants (NPPs). There are many complicated situations in which regular and suitable operations should be performed by operators. In order to regulate and validate operators' operations, it is necessary to develop an operator support system which includes computer-based procedures with functions for operation validation. Many computerized procedure systems (CPSs) have been developed recently. Focusing on human-machine interface (HMI) design and the computerization of procedures, most CPSs use various methodologies to enhance the system's convenience, reliability and accessibility. Rather than only showing procedures, the proposed system integrates a simple CPS and an operation validation system (OVS) by using an artificial neural network (ANN) for operational permission and quantitative evaluation.

  10. 40 CFR 1065.514 - Cycle-validation criteria for operation over specified duty cycles.

    Science.gov (United States)

    2010-07-01

    40 CFR § 1065.514, Title 40 (Protection of Environment), 2010-07-01 edition: Cycle-validation criteria for operation over specified duty cycles, under Performing an Emission Test Over Specified Duty Cycles.

  11. Browser App Approach: Can It Be an Answer to the Challenges in Cross-Platform App Development?

    Directory of Open Access Journals (Sweden)

    Minh Q. Huynh

    2017-02-01

    Full Text Available Aim/Purpose: As smartphones proliferate, many different platforms begin to emerge. The challenge to developers as well as IS educators and students is how to learn the skills to design and develop apps that run across platforms. Background: For developers, the purpose of this paper is to describe an alternative to complex native app development. For IS educators and students, the paper provides a feasible way to learn and develop fully functional mobile apps without technical burdens. Methodology: The method used in the development of browser-based apps is prototyping. Our proposed approach is browser-based, supports cross-platforms, uses open-source standards, and takes advantage of the “write-once-and-run-anywhere” (WORA) concept. Contribution: The paper illustrates the application of the browser-based approach to create a series of browser apps without a high learning curve. Findings: The results show the potential of using the browser app approach to teach as well as to create new apps. Recommendations for Practitioners: Our proposed browser app development approach and example would be useful to mobile app developers, IS educators and non-technical students because the source code as well as the documentation in this project are available for downloading. Future Research: For further work, we discuss the use of a hybrid development framework to enhance browser apps.

  12. A Validated Task Analysis of the Single Pilot Operations Concept

    Science.gov (United States)

    Wolter, Cynthia A.; Gore, Brian F.

    2015-01-01

    The current day flight deck operational environment consists of a two-person Captain/First Officer crew. A concept of operations (ConOps) to reduce the commercial cockpit to a single pilot from the current two pilot crew is termed Single Pilot Operations (SPO). This concept has been under study by researchers in the Flight Deck Display Research Laboratory (FDDRL) at the National Aeronautics and Space Administration's (NASA) Ames (Johnson, Comerford, Lachter, Battiste, Feary, and Mogford, 2012) and researchers from Langley Research Centers (Schutte et al., 2007). Transitioning from a two pilot crew to a single pilot crew will undoubtedly require changes in operational procedures, crew coordination, use of automation, and in how the roles and responsibilities of the flight deck and ATC are conceptualized in order to maintain the high levels of safety expected of the US National Airspace System. These modifications will affect the roles and the subsequent tasks that are required of the various operators in the NextGen environment. The current report outlines the process taken to identify and document the tasks required by the crew according to a number of operational scenarios studied by the FDDRL between the years 2012-2014. A baseline task decomposition has been refined to represent the tasks consistent with a new set of entities, tasks, roles, and responsibilities being explored by the FDDRL as the move is made towards SPO. Information from Subject Matter Expert interviews, participation in FDDRL experimental design meetings, and study observation was used to populate and refine task sets that were developed as part of the SPO task analyses. The task analysis is based upon the proposed ConOps for the third FDDRL SPO study. This experiment possessed nine different entities operating in six scenarios using a variety of SPO-related automation and procedural activities required to guide safe and efficient aircraft operations. The task analysis presents the roles and

  13. Analysis of human plasma metabolites across different liquid chromatography/mass spectrometry platforms: Cross-platform transferable chemical signatures.

    Science.gov (United States)

    Telu, Kelly H; Yan, Xinjian; Wallace, William E; Stein, Stephen E; Simón-Manso, Yamil

    2016-03-15

    The metabolite profiling of a NIST plasma Standard Reference Material (SRM 1950) on different liquid chromatography/mass spectrometry (LC/MS) platforms showed significant differences. Although these findings suggest caution when interpreting metabolomics results, the degree of overlap of both profiles allowed us to use tandem mass spectral libraries of recurrent spectra to evaluate to what extent these results are transferable across platforms and to develop cross-platform chemical signatures. Non-targeted global metabolite profiles of SRM 1950 were obtained on different LC/MS platforms using reversed-phase chromatography and different chromatographic scales (conventional HPLC, UHPLC and nanoLC). The data processing and the metabolite differential analysis were carried out using publically available (XCMS), proprietary (Mass Profiler Professional) and in-house software (NIST pipeline). Repeatability and intermediate precision showed that the non-targeted SRM 1950 profiling was highly reproducible when working on the same platform (relative standard deviation (RSD) HPLC, UHPLC and nanoLC) on the same platform. A substantial degree of overlap (common molecular features) was also found. A procedure to generate consistent chemical signatures using tandem mass spectral libraries of recurrent spectra is proposed. Different platforms rendered significantly different metabolite profiles, but the results were highly reproducible when working within one platform. Tandem mass spectral libraries of recurrent spectra are proposed to evaluate the degree of transferability of chemical signatures generated on different platforms. Chemical signatures based on our procedure are most likely cross-platform transferable. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  14. Model validity and frequency band selection in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui

    2016-12-01

    Experimental modal analysis aims at identifying the modal properties (e.g., natural frequencies, damping ratios, mode shapes) of a structure using vibration measurements. Two basic questions are encountered when operating in the frequency domain: Is there a mode near a particular frequency? If so, how much spectral data near the frequency can be included for modal identification without incurring significant modeling error? For data with high signal-to-noise (s/n) ratios these questions can be addressed using empirical tools such as singular value spectrum. Otherwise they are generally open and can be challenging, e.g., for modes with low s/n ratios or close modes. In this work these questions are addressed using a Bayesian approach. The focus is on operational modal analysis, i.e., with 'output-only' ambient data, where identification uncertainty and modeling error can be significant and their control is most demanding. The approach leads to 'evidence ratios' quantifying the relative plausibility of competing sets of modeling assumptions. The latter involves modeling the 'what-if-not' situation, which is non-trivial but is resolved by systematic consideration of alternative models and using maximum entropy principle. Synthetic and field data are considered to investigate the behavior of evidence ratios and how they should be interpreted in practical applications.
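
    The evidence ratio referred to above is the standard Bayesian model-comparison quantity. In generic notation, with competing modelling assumptions M1 (e.g. a mode exists in the selected band) and M2 (the alternative), and data D, it takes the form below; the paper's particular construction of the alternative model via the maximum entropy principle is not reproduced here.

      \frac{P(M_1 \mid D)}{P(M_2 \mid D)}
        = \frac{P(D \mid M_1)}{P(D \mid M_2)} \cdot \frac{P(M_1)}{P(M_2)},
      \qquad
      P(D \mid M_i) = \int P(D \mid \theta_i, M_i)\, p(\theta_i \mid M_i)\, d\theta_i .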

  15. Use of operational data for the validation of the SOPHT thermal-hydraulic code

    Energy Technology Data Exchange (ETDEWEB)

    Ho, S F; Martin, G; Shoukas, L; Siddiqui, Z; Phillips, B [Ontario Hydro, Bowmanville, ON (Canada). Darlington Nuclear Generating Station

    1996-12-31

    The primary objective of this paper is to describe the validation process of the SOPHT and MINI-SOPHT codes with the use of reactor operational data. The secondary objective is to illustrate the effectiveness of the code as a performance monitoring tool by discussing the discoveries that were made during the validation process. (author). 2 refs.

  16. Development of Cross-Platform Software for Well Logging Data Visualization

    Science.gov (United States)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil and gas field analysis and is of great importance in the field's development and operation. It is therefore important to have software that accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, displaying well log curves, etc.), but can also be run on different operating systems and devices. The article presents a subject field analysis and task formulation, and considers the software design stage. Finally, the resulting software product's interface is described.
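
    Reading a .las file and drawing a log track is routine in Python; the sketch below assumes the third-party lasio and matplotlib packages, which are an assumption of this example rather than the tools used by the authors.

      import lasio                      # third-party LAS reader (assumed available)
      import matplotlib.pyplot as plt

      def plot_curve(path, mnemonic="GR"):
          """Read a LAS file and plot one curve against depth."""
          las = lasio.read(path)
          df = las.df()                 # DataFrame indexed by depth
          fig, ax = plt.subplots(figsize=(3, 8))
          ax.plot(df[mnemonic], df.index)
          ax.invert_yaxis()             # depth increases downwards
          ax.set_xlabel(mnemonic)
          ax.set_ylabel("Depth")
          ax.grid(True)
          plt.show()

      # plot_curve("well_01.las", "GR")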

  17. Using Kokkos for Performant Cross-Platform Acceleration of Liquid Rocket Simulations

    Science.gov (United States)

    2017-05-08

    Kokkos expresses parallel kernels as user-defined functors (like Thrust or Intel TBB) and provides backends for NVIDIA GPU, Intel Xeon, Xeon Phi, IBM Power8, and others; its “View” data structure provides optimal data layout. The framework described here is designed for minimally invasive operation alongside a large Fortran code: everything is controlled from Fortran through a thin interface that handles Kokkos initialization and finalization (initialize(), finalize(), gettvproperties()).

  18. Cross-platform comparison of nucleic acid hybridization: toward quantitative reference standards.

    Science.gov (United States)

    Halvorsen, Ken; Agris, Paul F

    2014-11-15

    Measuring interactions between biological molecules is vitally important to both basic and applied research as well as development of pharmaceuticals. Although a wide and growing range of techniques is available to measure various kinetic and thermodynamic properties of interacting biomolecules, it can be difficult to compare data across techniques of different laboratories and personnel or even across different instruments using the same technique. Here we evaluate relevant biological interactions based on complementary DNA and RNA oligonucleotides that could be used as reference standards for many experimental systems. We measured thermodynamics of duplex formation using isothermal titration calorimetry, differential scanning calorimetry, and ultraviolet-visible (UV-vis) monitored denaturation/renaturation. These standards can be used to validate results, compare data from disparate techniques, act as a teaching tool for laboratory classes, or potentially to calibrate instruments. The RNA and DNA standards have many attractive features, including low cost, high purity, easily measurable concentrations, and minimal handling concerns, making them ideal for use as a reference material. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. A searchable cross-platform gene expression database reveals connections between drug treatments and disease

    Directory of Open Access Journals (Sweden)

    Williams Gareth

    2012-01-01

    Full Text Available Abstract Background Transcriptional data covering multiple platforms and species is collected and processed into a searchable platform independent expression database (SPIED. SPIED consists of over 100,000 expression fold profiles defined independently of control/treatment assignment and mapped to non-redundant gene lists. The database is thus searchable with query profiles defined over genes alone. The motivation behind SPIED is that transcriptional profiles can be quantitatively compared and ranked and thus serve as effective surrogates for comparing the underlying biological states across multiple experiments. Results Drug perturbation, cancer and neurodegenerative disease derived transcriptional profiles are shown to be effective descriptors of the underlying biology as they return related drugs and pathologies from SPIED. In the case of Alzheimer's disease there is high transcriptional overlap with other neurodegenerative conditions and rodent models of neurodegeneration and nerve injury. Combining the query signature with correlating profiles allows for the definition of a tight neurodegeneration signature that successfully highlights many neuroprotective drugs in the Broad connectivity map. Conclusions Quantitative querying of expression data from across the totality of deposited experiments is an effective way of discovering connections between different biological systems and in particular that between drug action and biological disease state. Examples in cancer and neurodegenerative conditions validate the utility of SPIED.
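
    Querying a fold-profile database of this kind amounts to a correlation search over shared genes followed by ranking. A toy Python sketch of that comparison (not the SPIED implementation; gene names and values are made up):

      import numpy as np

      def rank_profiles(query, database, min_shared=3):
          """Rank stored fold profiles by correlation with a query signature.

          query    : dict  gene -> log fold change
          database : dict  experiment name -> dict gene -> log fold change
          """
          results = []
          for name, profile in database.items():
              shared = [g for g in query if g in profile]
              if len(shared) < min_shared:
                  continue
              q = np.array([query[g] for g in shared])
              p = np.array([profile[g] for g in shared])
              r = np.corrcoef(q, p)[0, 1]
              results.append((name, r, len(shared)))
          return sorted(results, key=lambda item: item[1], reverse=True)

      query = {"APP": 1.2, "MAPT": 0.8, "BDNF": -1.1, "GFAP": 0.9}
      db = {"expA": {"APP": 1.0, "MAPT": 0.7, "BDNF": -0.9, "GFAP": 1.1},
            "expB": {"APP": -0.8, "MAPT": -0.5, "BDNF": 1.0, "GFAP": -0.7}}
      print(rank_profiles(query, db))   # expA correlates, expB anti-correlates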

  20. Validation of AEGIS/SCOPE2 system through actual core follow calculations with irregular operational conditions

    International Nuclear Information System (INIS)

    Tabuchi, M.; Tatsumi, M.; Ohoka, Y.; Nagano, H.; Ishizaki, K.

    2017-01-01

    This paper describes overview of AEGIS/SCOPE2 system, an advanced in-core fuel management system for pressurized water reactors, and its validation results of actual core follow calculations including irregular operational conditions. AEGIS and SCOPE2 codes adopt more detailed and accurate calculation models compared to the current core design codes while computational cost is minimized with various techniques on numerical and computational algorithms. Verification and validation of AEGIS/SCOPE2 has been intensively performed to confirm validity of the system. As a part of the validation, core follow calculations have been carried out mainly for typical operational conditions. After the Fukushima Daiichi nuclear power plant accident, however, all the nuclear reactors in Japan suffered from long suspension and irregular operational conditions. In such situations, measured data in the restart and operation of the reactors should be good examinations for validation of the codes. Therefore, core follow calculations were carried out with AEGIS/SCOPE2 for various cases including zero power reactor physics tests with irregular operational conditions. Comparisons between measured data and predictions by AEGIS/SCOPE2 revealed the validity and robustness of the system. (author)

  1. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    Science.gov (United States)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and
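
    The build-hash idea, identifying a build by a digest of its complete, canonicalized specification so that identical specifications yield identical artifacts, can be illustrated in a few lines. This is a conceptual sketch only; HashDist's real specification format and hashing scheme are more involved, and the package names below are hypothetical.

      import hashlib
      import json

      def build_hash(spec):
          """Digest of a canonicalized build specification (conceptual illustration)."""
          canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
          return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:12]

      spec = {
          "name": "example-stack-component",           # hypothetical component
          "version": "1.2.0",
          "sources": ["https://example.org/component-1.2.0.tar.gz"],
          "dependencies": {"python": "2.7.8", "numpy": "1.9.1"},
          "build": ["./configure --prefix=$ARTIFACT", "make", "make install"],
      }
      print(build_hash(spec))   # identical specifications always give the same hash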

  2. ScaMo: Realisation of an OO-functional DSL for cross platform mobile applications development

    Science.gov (United States)

    Macos, Dragan; Solymosi, Andreas

    2013-10-01

    The software market is changing dynamically: the Internet is going mobile, and software applications are shifting from desktop hardware onto mobile devices. The largest markets are mobile applications for iOS, Android and Windows Phone, for which the typical programming languages are Objective-C, Java and C#. Realizing native applications requires integrating the developed software into the environments of these mobile operating systems to enable access to different hardware components of the devices: GPS module, display, GSM module, etc. This paper deals with the definition and possible implementation of an environment for automatic application generation for multiple mobile platforms. It is based on a DSL for mobile application development, which comprises the programming language Scala and a DSL defined in Scala. As part of a multi-stage cross-compiling algorithm, this language is translated into the language of the target mobile platform. The advantage of our method lies in the expressiveness of the defined language and the transparent source code translation between different languages, which brings, for example, advantages for debugging and development of the generated code.

  3. Psynteract: A flexible, cross-platform, open framework for interactive experiments.

    Science.gov (United States)

    Henninger, Felix; Kieslich, Pascal J; Hilbig, Benjamin E

    2017-10-01

    We introduce a novel platform for interactive studies, that is, any form of study in which participants' experiences depend not only on their own responses, but also on those of other participants who complete the same study in parallel, for example a prisoner's dilemma or an ultimatum game. The software thus especially serves the rapidly growing field of strategic interaction research within psychology and behavioral economics. In contrast to all available software packages, our platform does not handle stimulus display and response collection itself. Instead, we provide a mechanism to extend existing experimental software to incorporate interactive functionality. This approach allows us to draw upon the capabilities already available, such as accuracy of temporal measurement, integration with auxiliary hardware such as eye-trackers or (neuro-)physiological apparatus, and recent advances in experimental software, for example capturing response dynamics through mouse-tracking. Through integration with OpenSesame, an open-source graphical experiment builder, studies can be assembled via a drag-and-drop interface requiring little or no further programming skills. In addition, by using the same communication mechanism across software packages, we also enable interoperability between systems. Our source code, which provides support for all major operating systems and several popular experimental packages, can be freely used and distributed under an open source license. The communication protocols underlying its functionality are also well documented and easily adapted to further platforms. Code and documentation are available at https://github.com/psynteract/ .

  4. Hanford tank waste operation simulator operational waste volume projection verification and validation procedure

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

    The Hanford Tank Waste Operation Simulator is tested to determine if it can replace the FORTRAN-based Operational Waste Volume Projection computer simulation that has traditionally served to project double-shell tank utilization. Three Test Cases are used to compare the results of the two simulators; one incorporates the cleanup schedule of the Tri Party Agreement

  5. The development and validation of an ergonomics index for assessing tractor operator work place

    Directory of Open Access Journals (Sweden)

    Juan Paulo Barbieri

    2018-02-01

    Full Text Available ABSTRACT: This study aimed to develop and validate an ergonomics index for the operator workplace assessment of agricultural tractors sold in the Brazilian market. To develop the ergonomics index, the operator workplaces were assessed for compliance with current national and international safety and ergonomics standards. The following standards were analyzed to develop the ergonomics index: ISO 15077 (1996), which regulates the position of operator controls; ABNT NBR ISO 4254-1 (2015) and ABNT NBR ISO 4252 (2011), which regulate the access to operator workplaces; and NR 12 (2010), which determines the mandatory items of operator workplaces. Thirty-four operator workplaces of 152 models of new agricultural tractors sold in the Brazilian market were analyzed in this study. The ergonomics index was developed and validated using these standards, and the findings enabled the ranking of agricultural tractors. Therefore, the proposed ergonomics index proved feasible and may be applied to other agricultural machines.

  6. The development and validation of an ergonomics index for assessing tractor operator work place

    OpenAIRE

    Barbieri, Juan Paulo; Schlosser, José Fernando; Farias, Marcelo Silveira de; Negri, Giácomo Müller; Oliveira, Luis Fernando Vargas de

    2018-01-01

    ABSTRACT: This study aimed to develop and validate an ergonomics index for the operator workplace assessment of agricultural tractors sold in the Brazilian market. To develop the ergonomics index, the operator work places were assessed for compliance with current, national and international, safety and ergonomics standards. The following standards were analyzed to develop ergonomics index: ISO 15077 (1996), which regulates the position of operator controls; ABNT NBR ISO 4254-1(2015) and ABNT ...

  7. PlasmaDNA: a free, cross-platform plasmid manipulation program for molecular biology laboratories

    Directory of Open Access Journals (Sweden)

    Rainy Jeffrey

    2007-09-01

    Full Text Available Abstract Background Most molecular biology experiments, and the techniques associated with this field of study, involve a great deal of engineering in the form of molecular cloning. Like all forms of engineering, perfect information about the starting material is crucial for successful completion of design and strategies. Results We have generated a program that allows complete in silico simulation of the cloning experiment. Starting with a primary DNA sequence, PlasmaDNA looks for restriction sites, open reading frames, primer annealing sequences, and various common domains. The databases are easily expandable by the user to fit their most common cloning needs. PlasmaDNA can manage and graphically represent multiple sequences at the same time, and keeps in memory any overhangs at the ends of the sequences. This means that it is possible to virtually digest fragments, to add the digestion products to the project, and to ligate together fragments with compatible ends to generate the new sequences. Polymerase Chain Reaction (PCR) fragments can also be virtually generated using the primer database, automatically adding to the fragments any 5' extra sequences present in the primers. Conclusion PlasmaDNA is a program available on both Windows and Apple operating systems, designed to facilitate molecular cloning experiments by building a visual map of the DNA. It then allows the complete planning and simulation of the cloning experiment. It also automatically updates the new sequences generated in the process, which is an important help in practice. The capacity to maintain multiple sequences in the same file can also be used to archive the various steps and strategies involved in the cloning of each construct. The program is freely available for download without charge or restriction.
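
    A minimal sketch of the kind of operation described above - locating restriction sites in a primary sequence and virtually digesting it into fragments - follows; the small enzyme table is illustrative only and is not PlasmaDNA's database format.

        # Recognition sites (cut position within the site) for a few common enzymes;
        # a real database would also record overhangs and isoschizomers.
        ENZYMES = {
            "EcoRI": ("GAATTC", 1),    # G^AATTC
            "BamHI": ("GGATCC", 1),    # G^GATCC
            "HindIII": ("AAGCTT", 1),  # A^AGCTT
        }

        def find_sites(seq: str, site: str):
            """Return 0-based start positions of every occurrence of a recognition site."""
            positions, start = [], 0
            while (idx := seq.find(site, start)) != -1:
                positions.append(idx)
                start = idx + 1
            return positions

        def digest(seq: str, enzyme: str):
            """Virtually digest a linear sequence, returning the resulting fragments."""
            site, cut_offset = ENZYMES[enzyme]
            cuts = [pos + cut_offset for pos in find_sites(seq, site)]
            bounds = [0] + cuts + [len(seq)]
            return [seq[a:b] for a, b in zip(bounds, bounds[1:])]

        plasmid = "ATGCGAATTCTTAAGGATCCCGGAATTCAT"
        print(find_sites(plasmid, ENZYMES["EcoRI"][0]))  # [4, 22]
        print(digest(plasmid, "EcoRI"))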

  8. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility

    Directory of Open Access Journals (Sweden)

    Jaschob Daniel

    2012-07-01

    Full Text Available Abstract Background Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. Results JobCenter is a client–server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or “in the cloud”) and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. Conclusions JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.
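
    The inherent load balancing of a client-driven (pull) design can be illustrated without any networking: each worker asks for its next job only when it is idle, so faster workers naturally take a larger share of the queue. The in-process queue below stands in for JobCenter's actual Java client-server protocol.

        import queue
        import threading
        import time

        jobs = queue.Queue()
        for i in range(10):
            jobs.put(f"job-{i}")

        def worker(name: str, job_duration: float):
            # Pull-based loop: a worker requests work only when it is free, which is
            # what gives a client-driven design its inherent load balancing.
            while True:
                try:
                    job = jobs.get_nowait()
                except queue.Empty:
                    return
                time.sleep(job_duration)      # simulate doing the work
                print(f"{name} finished {job}")
                jobs.task_done()

        threads = [
            threading.Thread(target=worker, args=("fast-node", 0.01)),
            threading.Thread(target=worker, args=("slow-node", 0.05)),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()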

  9. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility.

    Science.gov (United States)

    Jaschob, Daniel; Riffle, Michael

    2012-07-30

    Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.

  10. Adding Cross-Platform Support to a High-Throughput Software Stack and Exploration of Vectorization Libraries

    CERN Document Server

    AUTHOR|(CDS)2258962

    This master's thesis was written at the LHCb experiment at CERN. It is part of the initiative for improving software in view of the upcoming upgrade in 2021, which will significantly increase the amount of acquired data. The thesis consists of two parts. The first part explores different vectorization libraries and their usefulness for the LHCb collaboration. The second part is about adding cross-platform support to the LHCb software stack. Here, the LHCb stack is successfully ported to ARM (aarch64) and its performance is analyzed; at the end of the thesis, the port to PowerPC (ppc64le) awaits performance analysis. The main goal of porting the stack is a cost-performance evaluation of the different platforms, to find the most cost-efficient hardware for the new server farm for the upgrade. For this, selected vectorization libraries are extended to support the PowerPC and ARM platforms. Although the same compiler is used, platform-specific changes to the compilation flags are required. In...

  11. WeBCMD: A cross-platform interface for the BCMD modelling framework [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Joshua Russell-Buckland

    2017-07-01

    Full Text Available Multimodal monitoring of the brain generates a great quantity of data, providing the potential for great insight into both healthy and injured cerebral dynamics. In particular, near-infrared spectroscopy can be used to measure various physiological variables of interest, such as haemoglobin oxygenation and the redox state of cytochrome-c-oxidase, alongside systemic signals, such as blood pressure. Interpreting these measurements is a complex endeavour, and much work has been done to develop mathematical models that can help to provide understanding of the underlying processes that contribute to the overall dynamics. BCMD is a software framework that was developed to run such models. However, obtaining, installing and running this software is no simple task. Here we present WeBCMD, an online environment that attempts to make the process simpler and much more accessible. By leveraging modern web technologies, an extensible and cross-platform package has been created that can also be accessed remotely from the cloud. WeBCMD is available as a Docker image and an online service.

  12. Data validation report for the 100-HR-3 Operable Unit, fifth round groundwater samples

    International Nuclear Information System (INIS)

    Vukelich, S.E.

    1994-01-01

    The data from the chemical analysis of 68 samples from the 100-HR-3 Operable Unit Third Quarter 1993 Groundwater Sampling Investigation and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at the site. Sample analyses included inorganics and general chemical parameters. Fifty-three samples were validated for radiochemical parameters

  13. Data validation report for the 100-KR-4 operable unit first quarter, 1994

    International Nuclear Information System (INIS)

    Krug, A.D.

    1994-01-01

    Samples were obtained from the 100-KR-4 Operable Unit First Quarter 1994 Groundwater Sampling event. The data from the chemical analysis of fifty-eight samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. Information from the sampling event and the data validation processes is presented in this document

  14. Data validation report for the 100-D Ponds Operable Unit: 100-D ponds sampling

    International Nuclear Information System (INIS)

    Stankovich, M.T.

    1994-01-01

    Westinghouse-Hanford has requested that 100 percent of the Sample Delivery Groups be validated for the 100-D Ponds Operable Unit Sampling Investigation. Therefore the data from the chemical analysis of all 30 samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site

  15. Data validation summary report for the 100-BC-5 Operable Unit Round 8 Groundwater Sampling

    International Nuclear Information System (INIS)

    Kearney, A.T.

    1996-03-01

    The information provided in this validation summary report includes data from the chemical analyses of samples from the 100-BC-5 Operable Unit Round 8 Groundwater Sampling Investigation. All of the data from this sampling event and their related quality assurance samples were reviewed and validated to verify that the reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. Sample analyses included metals, general chemistry and radiochemistry

  16. UAS Cross Platform JTA

    Science.gov (United States)

    2014-07-18

    Excerpted task statements from the cross-platform job task analysis include: 1.16 Verify system alignment and degradations to determine impact to mission; 1.17 Ensure clearance of line personnel, ground equipment, and other... as needed during phases of flight; 7.12 Manage data security and data links during communications; 7.13 Obtain IFR clearance over radio; ... (for example, heading or airspeed) to return aircraft to intended course; and 8.10 Perform navigation under instrument flight rules (IFR).

  17. Validating a benchmarking tool for audit of early outcomes after operations for head and neck cancer

    OpenAIRE

    Tighe, D.; Sassoon, I.; McGurk, M.

    2017-01-01

    INTRODUCTION In 2013 all UK surgical specialties, with the exception of head and neck surgery, published outcome data adjusted for case mix for indicator operations. This paper reports a pilot study to validate a previously published risk adjustment score on patients from separate UK cancer centres. METHODS A case note audit was performed of 1,075 patients undergoing 1,218 operations for head and neck squamous cell carcinoma under general anaesthesia in 4 surgical centres. A logistic regressi...

  18. SU-C-BRC-06: OpenCL-Based Cross-Platform Monte Carlo Simulation Package for Carbon Ion Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Qin, N; Tian, Z; Pompos, A; Jiang, S; Jia, X [UT Southwestern Medical Ctr, Dallas, TX (United States); Pinto, M; Dedes, G; Parodi, K [Ludwig-Maximilians-Universitaet Muenchen, Garching / Munich (Germany)

    2016-06-15

    Purpose: Monte Carlo (MC) simulation is considered to be the most accurate method for calculating absorbed dose and the fundamental physical quantities related to biological effects in carbon ion therapy. Its long computation time impedes clinical and research applications. We have developed an MC package, goCMC, on parallel processing platforms, aiming at accurate and efficient simulations for carbon therapy. Methods: goCMC was developed under the OpenCL framework. It supports transport simulation in voxelized geometry with kinetic energy up to 450 MeV/u. A Class II condensed history algorithm was employed for charged particle transport, with stopping power computed via the Bethe-Bloch equation. Secondary electrons were not transported; their energy was deposited locally. Energy straggling and multiple scattering were modeled. Production of secondary charged particles from nuclear interactions was implemented based on cross section and yield data from Geant4; they were transported via the condensed history scheme. goCMC supports scoring various quantities of interest, e.g. physical dose, particle fluence, spectrum, linear energy transfer, and positron-emitting nuclei. Results: goCMC has been benchmarked against Geant4 with different phantoms and beam energies. For 100 MeV/u, 250 MeV/u and 400 MeV/u beams impinging on a water phantom, the range difference was 0.03 mm, 0.20 mm and 0.53 mm, and the mean dose difference was 0.47%, 0.72% and 0.79%, respectively. goCMC can run on various computing devices. Depending on the beam energy and voxel size, it took 20∼100 seconds to simulate 10^7 carbon ions on an AMD Radeon GPU card. The corresponding CPU time for Geant4 with the same setup was 60∼100 hours. Conclusion: We have developed an OpenCL-based cross-platform carbon MC simulation package, goCMC. Its accuracy, efficiency and portability make goCMC attractive for research and clinical applications in carbon therapy.
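
    As an illustration of the stopping-power calculation mentioned above, the sketch below evaluates the plain Bethe formula (without the density-effect and shell corrections a production code such as goCMC would add) for a carbon ion in water, using the conventional constants. It is a teaching sketch, not goCMC's implementation.

        import math

        ME_C2 = 0.5109989      # electron rest energy [MeV]
        M_U = 931.4941         # atomic mass unit [MeV]
        K = 0.307075           # 4*pi*N_A*r_e^2*m_e*c^2 [MeV mol^-1 cm^2]

        def bethe_stopping_power(e_per_u, z=6, a_ion=12.0, z_over_a=0.5551, i_mev=75e-6):
            """Mass stopping power [MeV cm^2/g] of an ion in water, plain Bethe formula.

            e_per_u : kinetic energy per nucleon [MeV/u]
            z, a_ion: charge and mass number of the projectile (default: carbon-12)
            z_over_a: <Z/A> of the medium (water); i_mev: mean excitation energy [MeV]
            """
            m_ion = a_ion * M_U
            gamma = 1.0 + e_per_u / M_U          # same gamma per nucleon as for the ion
            beta2 = 1.0 - 1.0 / gamma**2
            ratio = ME_C2 / m_ion
            w_max = 2.0 * ME_C2 * beta2 * gamma**2 / (1.0 + 2.0 * gamma * ratio + ratio**2)
            log_term = 0.5 * math.log(2.0 * ME_C2 * beta2 * gamma**2 * w_max / i_mev**2)
            return K * z**2 * z_over_a / beta2 * (log_term - beta2)

        for e in (100.0, 250.0, 400.0):  # beam energies used in the benchmark above
            print(f"{e:5.0f} MeV/u: {bethe_stopping_power(e):7.2f} MeV cm^2/g")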

  19. Data validation report for the 100-HR-3 Operable Unit first quarter 1994 groundwater sampling data

    Energy Technology Data Exchange (ETDEWEB)

    Biggerstaff, R.L.

    1994-06-24

    Westinghouse-Hanford has requested that a minimum of 20% of the total number of Sample Delivery Groups be validated for the 100-HR-3 Operable Unit First Quarter 1994 Groundwater Sampling Investigation. Therefore, the data from the chemical analysis of twenty-four samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. The samples were analyzed by Thermo-Analytic Laboratories (TMA) and Roy F. Weston Laboratories (WESTON) using US Environmental Protection Agency (EPA) CLP protocols. Sample analyses included: inorganics; and general chemical parameters. Forty-two samples were validated for radiochemical parameters by TMA and Teledyne.

  20. Point-to-Point! Validation of the Small Aircraft Transportation System Higher Volume Operations Concept

    Science.gov (United States)

    Williams, Daniel M.

    2006-01-01

    Described is the research process that NASA researchers used to validate the Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) concept. The four phase building-block validation and verification process included multiple elements ranging from formal analysis of HVO procedures to flight test, to full-system architecture prototype that was successfully shown to the public at the June 2005 SATS Technical Demonstration in Danville, VA. Presented are significant results of each of the four research phases that extend early results presented at ICAS 2004. HVO study results have been incorporated into the development of the Next Generation Air Transportation System (NGATS) vision and offer a validated concept to provide a significant portion of the 3X capacity improvement sought after in the United States National Airspace System (NAS).

  1. Non-destructive measurements of nuclear wastes. Validation and industrial operating experience

    International Nuclear Information System (INIS)

    Saas, A.; Tchemitciieff, E.

    1993-01-01

    After a short survey of the means employed for the non-destructive measurement of specific activities (γ and X-ray) in waste packages and raw waste, the performance of the device and the ANDRA requirements are presented. The validation of the γ and X-ray measurements on packages is obtained by determining, through destructive means, the same activity on core samples. The same procedure is used for validating the homogeneity measurements on packages (either homogeneous or heterogeneous). Different operating experiences are then presented for several kinds of packages and waste. Up to now, about twenty different types of packages have been examined, and more than 200 packages have supported the calibration, validation, and control

  2. Data validation report for the 100-HR-3 Operable Unit first quarter 1994 groundwater sampling data

    International Nuclear Information System (INIS)

    Biggerstaff, R.L.

    1994-01-01

    Westinghouse-Hanford has requested that a minimum of 20% of the total number of Sample Delivery Groups be validated for the 100-HR-3 Operable Unit First Quarter 1994 Groundwater Sampling Investigation. Therefore, the data from the chemical analysis of twenty-four samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. The samples were analyzed by Thermo-Analytic Laboratories (TMA) and Roy F. Weston Laboratories (WESTON) using US Environmental Protection Agency (EPA) CLP protocols. Sample analyses included: inorganics; and general chemical parameters. Forty-two samples were validated for radiochemical parameters by TMA and Teledyne

  3. Development of an integrated signal validation system and application to operating power plants

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Holbert, K.E.; Kerlin, T.W.

    1989-01-01

    The objective of the university-industry joint research program at the University of Tennessee and Combustion Engineering, Inc. is to develop and implement a comprehensive signal validation system for current power plants and future advanced reactors. The integrated system consists of several parallel signal processing modules. The multi-modular decision information is combined to detect, isolate and characterize faulty signals. The signal validation system has been implemented in a VAX workstation and applied to operational data from a pressurized water reactor (PWR) and the Experimental Breeder Reactor-II (EBR-II). The use of the various signal validation techniques may be extended to predictive maintenance advising, instrument calibration verification, and to the development of intelligent instrumentation systems. 18 refs., 6 figs
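
    One simple building block of such systems is a consistency check among redundant channels measuring the same quantity; the median-based sketch below is a generic illustration only, not the multi-modular method developed in the program described above.

        import numpy as np

        def validate_redundant_channels(readings, tolerance):
            """Flag redundant sensor channels that deviate from the group median.

            readings : 1-D array of simultaneous measurements of the same quantity
            tolerance: maximum allowed absolute deviation from the median
            Returns a boolean array, True where the channel is judged faulty.
            """
            readings = np.asarray(readings, dtype=float)
            median = np.median(readings)
            return np.abs(readings - median) > tolerance

        # Four redundant level channels (illustrative values); the third has drifted.
        levels = [51.2, 50.8, 57.9, 51.0]
        print(validate_redundant_channels(levels, tolerance=2.0))  # [False False  True False]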

  4. Validation of Continuous CHP Operation of a Two-Stage Biomass Gasifier

    DEFF Research Database (Denmark)

    Ahrenfeldt, Jesper; Henriksen, Ulrik Birk; Jensen, Torben Kvist

    2006-01-01

    The Viking gasification plant at the Technical University of Denmark was built to demonstrate a continuous combined heat and power operation of a two-stage gasifier fueled with wood chips. The nominal input of the gasifier is 75 kW thermal. To validate the continuous operation of the plant, a 9-day measurement campaign was performed. The campaign verified a stable operation of the plant, and the energy balance resulted in an overall fuel to gas efficiency of 93% and a wood to electricity efficiency of 25%. Very low tar content in the producer gas was observed: only 0.1 mg/Nm3 naphthalene could be measured in raw gas. A stable engine operation on the producer gas was observed, and very low emissions of aldehydes, N2O, and polycyclic aromatic hydrocarbons were measured.

  5. The design and validation of advanced operator support systems for a role in plant safety

    International Nuclear Information System (INIS)

    Hughes, G.

    1989-06-01

    Advanced operator support systems have the potential of making a significant contribution to plant safety. This note reviews the different support functions required, the specification of performance criteria and possible approaches for system validation. The importance of the different functions that can be provided is related to the stage of the accident sequence. Also, because of the restricted reliability of any single system, subdivision of the systems is suggested in order to make the maximum contribution at a number of sequential stages. In this way it should be possible to make a significant claim for reduced operator error over the full accident progression, from incipient fault to disaster. The use of performance criteria currently associated with the classification of safety-grade trip systems (e.g. detection failure probability) would seem to provide a sound basis for validation. The validation of systems is seen as a significant task which will rely on the use of design and training-simulator data together with specific plant measurements. Expert systems appear to present particular problems for validation. (author)

  6. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    Science.gov (United States)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support better management of the marine environment (maritime security, environmental and resource protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system in the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger scale Mediterranean Forecasting System (MFS) with a spatial resolution of 1.5-2 km. WMOP aims to reproduce both the basin-scale ocean circulation and the mesoscale variability, which is known to play a crucial role due to its strong interaction with the large-scale circulation in this region. An operational validation system has been developed to systematically assess the model outputs at daily, monthly and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures make it possible to monitor and certify the general realism of the daily production of the ocean forecasting system before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, transports in key sections) are computed every day, both at the basin scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.
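
    Validation statistics of the kind computed daily by such a system reduce to a few lines of code; the sketch below computes bias and RMSE between a model field and a satellite observation field while ignoring missing pixels. The variable names and toy numbers are illustrative only.

        import numpy as np

        def bias_and_rmse(model, obs):
            """Bias and RMSE between co-located model and observed fields (NaNs ignored)."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            valid = ~np.isnan(model) & ~np.isnan(obs)
            diff = model[valid] - obs[valid]
            return diff.mean(), np.sqrt((diff**2).mean())

        # Toy 3x3 sea surface temperature fields [deg C]; one satellite pixel is cloudy (NaN).
        sst_model = np.array([[18.2, 18.4, 18.9], [19.1, 19.3, 19.0], [18.7, 18.8, 19.2]])
        sst_sat   = np.array([[18.0, 18.5, np.nan], [19.4, 19.1, 18.8], [18.9, 18.6, 19.3]])
        bias, rmse = bias_and_rmse(sst_model, sst_sat)
        print(f"bias = {bias:+.2f} degC, rmse = {rmse:.2f} degC")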

  7. Validating a benchmarking tool for audit of early outcomes after operations for head and neck cancer.

    Science.gov (United States)

    Tighe, D; Sassoon, I; McGurk, M

    2017-04-01

    INTRODUCTION In 2013 all UK surgical specialties, with the exception of head and neck surgery, published outcome data adjusted for case mix for indicator operations. This paper reports a pilot study to validate a previously published risk adjustment score on patients from separate UK cancer centres. METHODS A case note audit was performed of 1,075 patients undergoing 1,218 operations for head and neck squamous cell carcinoma under general anaesthesia in 4 surgical centres. A logistic regression equation predicting for all complications, previously validated internally at sites A-C, was tested on a fourth external validation sample (site D, 172 operations) using receiver operating characteristic curves, Hosmer-Lemeshow goodness of fit analysis and Brier scores. RESULTS Thirty-day complication rates varied widely (34-51%) between the centres. The predictive score allowed imperfect risk adjustment (area under the curve: 0.70), with Hosmer-Lemeshow analysis suggesting good calibration. The Brier score changed from 0.19 for sites A-C to 0.23 when site D was also included, suggesting poor accuracy overall. CONCLUSIONS Marked differences in operative risk and patient case mix captured by the risk adjustment score do not explain all the differences in observed outcomes. Further investigation with different methods is recommended to improve modelling of risk. Morbidity is common, and usually has a major impact on patient recovery, ward occupancy, hospital finances and patient perception of quality of care. We hope comparative audit will highlight good performance and challenge underperformance where it exists.
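
    The discrimination and calibration statistics mentioned above can be reproduced with standard tools; the sketch below uses made-up outcome and predicted-risk vectors, and computes only the ROC area and Brier score (the Hosmer-Lemeshow test is not part of scikit-learn and is omitted here).

        import numpy as np
        from sklearn.metrics import roc_auc_score, brier_score_loss

        # Hypothetical data: observed 30-day complication (1/0) and the risk predicted
        # by a previously fitted logistic regression model for each operation.
        observed  = np.array([0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0])
        predicted = np.array([0.10, 0.62, 0.25, 0.18, 0.55, 0.40,
                              0.22, 0.71, 0.15, 0.30, 0.48, 0.12])

        auc = roc_auc_score(observed, predicted)       # discrimination (area under the curve)
        brier = brier_score_loss(observed, predicted)  # overall accuracy of the risk estimates
        print(f"AUC = {auc:.2f}, Brier score = {brier:.3f}")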

  8. Validity and reliability of a dental operator posture assessment instrument (PAI).

    Science.gov (United States)

    Branson, Bonnie G; Williams, Karen B; Bray, Kimberly Krust; Mcllnay, Sandy L; Dickey, Diana

    2002-01-01

    Basic operating posture is considered an important occupational health issue for oral health care clinicians. It is generally agreed that the physical posture of the operator, while providing care, should be such that all muscles are in a relaxed, well-balanced, and neutral position. Postures outside of this neutral position are likely to cause musculoskeletal discomfort. To date, the range of the neutral operator position has not been well-defined; nor have any specific instruments been identified that can quantitatively or semi-quantitatively assess dental operator posture. This paper reports on the development of an instrument that can be used to semi-quantitatively evaluate postural components. During the first phase of the study, an expert panel defined the basic parameters for acceptable, compromised, and harmful operator postures and established face validity of a posture assessment instrument (PAI). During the second phase, the PAI was tested for reliability using generalizability theory. Four raters tested the instrument for reliability. Overall, total PAI scores were similar amongst three of the raters, with the fourth rater's scores being slightly greater than the other three. The main effect of the rater on individual postural components was moderate, indicating that rater variance contributed to 11.9% of total variance. The PAI measures posture as it occurs and will have numerous applications when evaluating operator performance in the dental and dental hygiene education setting. Also, the PAI will prove useful when examining the effects of operator posture and musculoskeletal disorders.

  9. Experimental methods to validate measures of emotional state and readiness for duty in critical operations

    International Nuclear Information System (INIS)

    Weston, Louise Marie

    2007-01-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and testing and statistical analysis procedures to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended

  10. Validation experiments of the chimney model for the operational simulation of hydrogen recombiners

    International Nuclear Information System (INIS)

    Simon, Berno

    2013-01-01

    The calculation program REKO-DIREKT allows the simulation of the operational behavior of a hydrogen recombiner during accidents with hydrogen release. The interest is focused on the interaction between the catalyst insert and the chimney, which significantly influences the natural ventilation and thus the throughput through the recombiner. For validation, experiments were performed with a small-scale recombiner model in the test facility REKO-4. The results show the correlation between the hydrogen concentration at the recombiner entrance, the temperature on the catalyst sheets and the entrance velocity for different chimney heights. The entrance velocity increases with the height of the installed chimney, which significantly influences the natural ventilation. The results provide a broad data base for the validation of the computer code REKO-DIREKT.

  11. Validity and design of environmental surveillance systems for operating nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Eichholz, G.G.

    1977-12-01

    The composition and procedures of environmental surveillance programs at completed and operating nuclear power plants have been examined with respect to their validity, continuing significance and cost. It was found that many programs contain components that are mainly an extension of preoperational baseline measurements and need not be continued indefinitely, and that others lack the statistical validity to make their continued application meaningful. To identify the practical limits imposed by counting statistics and realistic equipment capacity, measurements were performed on samples containing iodine-131 and cesium-137 to establish detectability limits and the proportionate costs of sample preparation and counting. It was found that, under commercial conditions, effective detectability limits and expected confidence limits were substantially higher than those mentioned in NRC Regulatory Guides. This imposes a need either to select fewer samples and count them for longer times, or to accept a lower accuracy on more samples, within the bounds of a reasonable cost per sample

  12. Validation and prediction of traditional Chinese physical operation on spinal disease using multiple deformation models.

    Science.gov (United States)

    Pan, Lei; Yang, Xubo; Gu, Lixu; Lu, Wenlong; Fang, Min

    2011-03-01

    Traditional Chinese medical massage is a physical manipulation that achieves satisfactory results on spinal diseases, according to its advocates. However, the method relies on an expert's experience. Accurate analysis and simulation of massage are essential for validation of traditional Chinese physical treatment. The objective of this study is to provide analysis and simulation that can reproducibly verify and predict treatment efficacy. An improved physical multi-deformation model for simulating the human cervical spine is proposed. First, the human spine, which includes muscle, vertebrae and intervertebral disks, is segmented and reconstructed from clinical CT and MR images. Homogeneous landmark registration is employed to align the spine models before and after the massage manipulation. A central-line mass-spring model and a contact FEM deformation model are used to individually evaluate spinal anatomy variations. The response of the human spine during the massage process is simulated based on specific clinical cases. Ten sets of patient data, including muscle-force relationships, displacement of vertebrae, and strain and stress distributions on the intervertebral disks, were collected pre-operation, post-operation and at the 3-month follow-up. The simulation results demonstrate that traditional Chinese massage could significantly affect and treat most mild spinal disease. A new method that simulates a traditional Chinese medical massage operation on the human spine may be a useful tool to scientifically validate and predict treatment efficacy.
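
    As a generic illustration of the central-line mass-spring idea (not the authors' actual cervical-spine model), the sketch below advances a one-dimensional chain of masses connected by springs with explicit Euler steps while an external load is applied to one node.

        import numpy as np

        n, k, m, c, dt = 6, 800.0, 0.5, 2.0, 1e-3   # nodes, stiffness, mass, damping, time step
        x = np.linspace(0.0, 0.25, n)               # rest positions along the central line [m]
        pos, vel = x.copy(), np.zeros(n)
        rest = np.diff(x)                           # rest lengths of the connecting springs

        for step in range(2000):
            force = np.zeros(n)
            stretch = np.diff(pos) - rest           # spring elongations
            force[:-1] += k * stretch               # each spring pulls its two end nodes
            force[1:]  -= k * stretch
            force -= c * vel                        # simple damping
            force[3] += 5.0                         # external load on one vertebra-like node
            vel[1:] += dt * force[1:] / m           # node 0 is fixed (boundary condition)
            pos[1:] += dt * vel[1:]

        print(np.round(pos - x, 4))                 # displacement of each node at the end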

  13. Preliminary Validation of the Small Aircraft Transportation System Higher Volume Operations (SATS HVO) Concept

    Science.gov (United States)

    Williams, Daniel; Consiglio, Maria; Murdoch, Jennifer; Adams, Catherine

    2004-01-01

    This document provides a preliminary validation of the Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) concept for normal conditions. Initial results reveal that the concept provides reduced air traffic delays when compared to current operations without increasing pilot workload. Characteristic to the SATS HVO concept is the establishment of a newly defined area of flight operations called a Self-Controlled Area (SCA) which would be activated by air traffic control (ATC) around designated non-towered, non-radar airports. During periods of poor visibility, SATS pilots would take responsibility for separation assurance between their aircraft and other similarly equipped aircraft in the SCA. Using onboard equipment and simple instrument flight procedures, they would then be better able to approach and land at the airport or depart from it. This concept would also require a new, ground-based automation system, typically located at the airport that would provide appropriate sequencing information to the arriving aircraft. Further validation of the SATS HVO concept is required and is the subject of ongoing research and subsequent publications.

  14. EDF EPR project: operating principles validation and human factor engineering program

    International Nuclear Information System (INIS)

    Lefebvre, B.; Berard, E.; Arpino, J.-M.

    2005-01-01

    This article describes the specificities of the operating principles chosen by EDF for the EPR project as a result of an extensive Human Factor Engineering program successfully implemented in an industrial project context. The design process and its achievements benefit from EDF's experience feedback, not only in terms of NPP operation - including the fully computerized control room of the N4 series - but also in terms of NPP design. The elements presented hereafter correspond to the basic design phase of the EPR HMI, which was completed and successfully validated by the end of 2003. The article recalls the context of the project, which basically consists in designing a modern and efficient HMI that takes the operating needs into account while relying on proven and reliable technologies. The Human Factor Engineering program implemented merges both aspects by: 1) being fully integrated within the project activities and scheduling; 2) efficiently taking into account the users' needs as well as the feasibility constraints by relying on a multidisciplinary design team including HF specialists, I and C specialists, process specialists and experienced operator representatives. The resulting design process makes wide use of experience feedback and experienced operator knowledge to complement the existing standards, providing a fully usable and successful design method in an industrial context. The article underlines the design process highlights that largely contributed to the successful implementation of a Human Factor Engineering program for EPR. (authors)

  15. Cloud detection algorithm comparison and validation for operational Landsat data products

    Science.gov (United States)

    Foga, Steven Curtis; Scaramuzza, Pat; Guo, Song; Zhu, Zhe; Dilley, Ronald; Beckmann, Tim; Schmidt, Gail L.; Dwyer, John L.; Hughes, MJ; Laue, Brady

    2017-01-01

    Clouds are a pervasive and unavoidable issue in satellite-borne optical imagery. Accurate, well-documented, and automated cloud detection algorithms are necessary to effectively leverage large collections of remotely sensed data. The Landsat project is uniquely suited for comparative validation of cloud assessment algorithms because the modular architecture of the Landsat ground system allows for quick evaluation of new code, and because Landsat has the most comprehensive manual truth masks of any current satellite data archive. Currently, the Landsat Level-1 Product Generation System (LPGS) uses separate algorithms for determining clouds, cirrus clouds, and snow and/or ice probability on a per-pixel basis. With more bands onboard the Landsat 8 Operational Land Imager (OLI)/Thermal Infrared Sensor (TIRS) satellite, and a greater number of cloud masking algorithms, the U.S. Geological Survey (USGS) is replacing the current cloud masking workflow with a more robust algorithm that is capable of working across multiple Landsat sensors with minimal modification. Because of the inherent error from stray light and intermittent data availability of TIRS, these algorithms need to operate both with and without thermal data. In this study, we created a workflow to evaluate cloud and cloud shadow masking algorithms using cloud validation masks manually derived from both Landsat 7 Enhanced Thematic Mapper Plus (ETM +) and Landsat 8 OLI/TIRS data. We created a new validation dataset consisting of 96 Landsat 8 scenes, representing different biomes and proportions of cloud cover. We evaluated algorithm performance by overall accuracy, omission error, and commission error for both cloud and cloud shadow. We found that CFMask, C code based on the Function of Mask (Fmask) algorithm, and its confidence bands have the best overall accuracy among the many algorithms tested using our validation data. The Artificial Thermal-Automated Cloud Cover Algorithm (AT-ACCA) is the most accurate
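
    The accuracy measures used in such comparisons - overall accuracy, omission error, and commission error - follow directly from a per-pixel confusion matrix; a small sketch with toy masks is shown below.

        import numpy as np

        def cloud_mask_scores(truth, predicted):
            """Overall accuracy, omission and commission error for a binary cloud mask."""
            truth, predicted = np.asarray(truth, bool), np.asarray(predicted, bool)
            tp = np.sum(truth & predicted)
            fn = np.sum(truth & ~predicted)      # cloud pixels the algorithm missed
            fp = np.sum(~truth & predicted)      # clear pixels wrongly flagged as cloud
            tn = np.sum(~truth & ~predicted)
            overall = (tp + tn) / truth.size
            omission = fn / (tp + fn)            # fraction of true cloud that was missed
            commission = fp / (tp + fp)          # fraction of predicted cloud that is false
            return overall, omission, commission

        truth     = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])
        predicted = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])
        ov, om, co = cloud_mask_scores(truth, predicted)
        print(f"overall={ov:.2f}, omission={om:.2f}, commission={co:.2f}")  # 0.80, 0.20, 0.20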

  16. Operational Street Pollution Model (OSPM) - a review of performed validation studies, and future prospects

    DEFF Research Database (Denmark)

    Kakosimos K.E., Konstantinos E.; Hertel, Ole; Ketzel, Matthias

    2010-01-01

    in this context is the fast and easy-to-apply Operational Street Pollution Model (OSPM). For almost 20 years, OSPM has been routinely used in many countries for studying traffic pollution, performing analyses of field campaign measurements, studying the efficiency of pollution abatement strategies, carrying out exposure assessments, and as a reference in comparisons to other models. OSPM is generally considered state-of-the-art in applied street pollution modelling. This paper outlines the most important findings of OSPM validation and application studies in the literature. At the end of the paper, future research needs are outlined for traffic air pollution modelling in general, but with outset in the research performed with OSPM.

  17. Verification and validation--The key to operating plant software reliability

    International Nuclear Information System (INIS)

    Daughtrey, H.T.; Daggett, P.W.; Schamp, C.A.

    1983-01-01

    This paper discusses the design and implementation of a verification and validation (V and V) plan for reviewing the microcomputer software developed for a Safety Parameter Display System (SPDS). Topics considered include a historical perspective on V and V, the function and significance of SPDS software, and testing. An SPDS provides information to nuclear power plant operators about the status of the plant under all operating conditions. It is determined that, by implementing V and V activities throughout the development cycle, problems are less expensive to locate and to fix in the early phases of software development, and that a parallel V and V activity is more cost effective than a similar effort performed only at the end of software development. It is concluded that V and V is a proven tool for improving power plant software reliability

  18. Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure Validation Simulation Study

    Science.gov (United States)

    Murdoch, Jennifer L.; Bussink, Frank J. L.; Chamberlain, James P.; Chartrand, Ryan C.; Palmer, Michael T.; Palmer, Susan O.

    2008-01-01

    The Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure (ITP) Validation Simulation Study investigated the viability of an ITP designed to enable oceanic flight level changes that would not otherwise be possible. Twelve commercial airline pilots with current oceanic experience flew a series of simulated scenarios involving either standard or ITP flight level change maneuvers and provided subjective workload ratings, assessments of ITP validity and acceptability, and objective performance measures associated with the appropriate selection, request, and execution of ITP flight level change maneuvers. In the majority of scenarios, subject pilots correctly assessed the traffic situation, selected an appropriate response (i.e., either a standard flight level change request, an ITP request, or no request), and executed their selected flight level change procedure, if any, without error. Workload ratings for ITP maneuvers were acceptable and not substantially higher than for standard flight level change maneuvers, and, for the majority of scenarios and subject pilots, subjective acceptability ratings and comments for ITP were generally high and positive. Qualitatively, the ITP was found to be valid and acceptable. However, the error rates for ITP maneuvers were higher than for standard flight level changes, and these errors may have design implications for both the ITP and the study's prototype traffic display. These errors and their implications are discussed.

  19. Hip2Norm: an object-oriented cross-platform program for 3D analysis of hip joint morphology using 2D pelvic radiographs.

    Science.gov (United States)

    Zheng, G; Tannast, M; Anderegg, C; Siebenrock, K A; Langlotz, F

    2007-07-01

    We developed an object-oriented cross-platform program to perform three-dimensional (3D) analysis of hip joint morphology using two-dimensional (2D) anteroposterior (AP) pelvic radiographs. Landmarks extracted from 2D AP pelvic radiographs and optionally an additional lateral pelvic X-ray were combined with a cone beam projection model to reconstruct 3D hip joints. Since individual pelvic orientation can vary considerably, a method for standardizing pelvic orientation was implemented to determine the absolute tilt/rotation. The evaluation of anatomically morphologic differences was achieved by reconstructing the projected acetabular rim and the measured hip parameters as if obtained in a standardized neutral orientation. The program has been successfully used to interactively objectify acetabular version in hips with femoro-acetabular impingement or developmental dysplasia. Hip2Norm is written in the object-oriented programming language C++ using the cross-platform software Qt (TrollTech, Oslo, Norway) for the graphical user interface (GUI) and is transportable to any platform.

  20. Validation of Scales from the Deployment Risk and Resilience Inventory in a Sample of Operation Iraqi Freedom Veterans

    National Research Council Canada - National Science Library

    Vogt, D. S; Proctor, S. P; King, D. W; King, L. A; Vasterling, J. J

    2008-01-01

    .... Although initial evidence for the reliability and validity of DRRI scales based on Gulf War veteran samples is encouraging, evidence with respect to a more contemporary cohort of Operation Iraqi Freedom (OIF...

  1. On the use of the observation-wise k-fold operation in PCA cross-validation

    NARCIS (Netherlands)

    Saccenti, E.; Camacho, J.

    2015-01-01

    Cross-validation (CV) is a common approach for determining the optimal number of components in a principal component analysis model. To guarantee the independence between model testing and calibration, the observationwise k-fold operation is commonly implemented in each cross-validation step. This
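
    A minimal version of the naive observation-wise (row-wise) k-fold procedure reads as follows: held-out rows are projected onto components fitted on the calibration rows, and the reconstruction error is accumulated for each candidate number of components. This sketch deliberately shows the simple scheme whose properties the paper examines, not any corrected variant.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import KFold

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 8)) @ rng.normal(size=(8, 8))   # toy correlated data

        def press_per_components(X, max_comp=6, n_splits=5):
            """Observation-wise k-fold PRESS for 1..max_comp principal components."""
            press = np.zeros(max_comp)
            for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
                for k in range(1, max_comp + 1):
                    pca = PCA(n_components=k).fit(X[train])
                    scores = pca.transform(X[test])                  # project held-out rows
                    recon = scores @ pca.components_ + pca.mean_     # reconstruct from k PCs
                    press[k - 1] += np.sum((X[test] - recon) ** 2)
            return press

        # PRESS per number of components; the cited work discusses why this naive
        # row-wise scheme needs care when used to pick the number of components.
        print(np.round(press_per_components(X), 1))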

  2. Numerical Simulation and Validation of a High Head Model Francis Turbine at Part Load Operating Condition

    Science.gov (United States)

    Goyal, Rahul; Trivedi, Chirag; Kumar Gandhi, Bhupendra; Cervantes, Michel J.

    2017-07-01

    Hydraulic turbines are operated over an extended operating range to meet real-time electricity demand. Turbines operated at part load have flow parameters that do not match the design values. This results in unstable flow conditions in the runner and draft tube, developing low-frequency, high-amplitude pressure pulsations. The unsteady pressure pulsations affect the dynamic stability of the turbine and cause additional fatigue. The work presented in this paper discusses the flow field investigation of a high-head model Francis turbine at part load: 50% of the rated load. Numerical simulation of the complete turbine has been performed. Unsteady pressure pulsations in the vaneless space, runner, and draft tube are investigated and validated against available experimental data. A detailed analysis of the rotor-stator interaction and the draft tube flow field is performed and discussed. The analysis shows the presence of a rotating vortex rope in the draft tube at a frequency of 0.3 times the runner rotational frequency. The frequency of the vortex rope precession, which causes severe fluctuations and vibrations in the draft tube, is predicted within 3.9% of the experimentally measured value. The vortex rope results in pressure pulsations propagating through the system, whose frequency is also perceived in the runner and upstream of the runner.

  3. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever-increasing computing power available at affordable prices. This paper describes this process from several angles: first, the operational requirements that are most critical from the point of view of model performance, both for normal and off-normal operating conditions; second, the core model characteristics, in the light of the solutions implemented by Thomson Training and Simulation (TT and S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants; and finally, the model validation procedures, which are of course an integral part of model development and which are becoming more and more severe as performance expectations increase. In conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in a fairly rapid convergence towards mainstream engineering-grade calculations. This is a remarkable performance in view of the stringent real-time requirements that the simulation codes must satisfy, as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  4. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    Science.gov (United States)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. Key characteristics of the system are: (1) a range of satellite data products and surface observations used to generate the land analysis products; (2) global coverage at 1/4 degree spatial resolution; and (3) model analyses generated every 3 hours. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.

  5. STORMVEX: The Storm Peak Lab Cloud Property Validation Experiment Science and Operations Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mace, J; Matrosov, S; Shupe, M; Lawson, P; Hallar, G; McCubbin, I; Marchand, R; Orr, B; Coulter, R; Sedlacek, A; Avallone, L; Long, C

    2010-09-29

    During the Storm Peak Lab Cloud Property Validation Experiment (STORMVEX), a substantial correlative data set of remote sensing observations and direct in situ measurements from fixed and airborne platforms will be created in a winter season, mountainous environment. This will be accomplished by combining mountaintop observations at Storm Peak Laboratory and the airborne National Science Foundation-supported Colorado Airborne Multi-Phase Cloud Study campaign with collocated measurements from the second ARM Mobile Facility (AMF2). We describe in this document the operational plans and motivating science for this experiment, which includes deployment of AMF2 to Steamboat Springs, Colorado. The intensive STORMVEX field phase will begin nominally on 1 November 2010 and extend to approximately early April 2011.

  6. A Study on the Construct Validity of Safety Culture Oversight Model for Nuclear Power Operating Organization

    International Nuclear Information System (INIS)

    Jung, Su Jin; Choi, Young Sung; Oh, Jang Jin

    2015-01-01

    In Korea, the safety policy statement declared by the government in 1994 stressed the importance of safety culture, and licensees were encouraged to manage and conduct their own self-assessments. A change in the regulatory position on safety culture oversight was made after the station blackout (SBO) cover-up event at Kori unit 1 and several subsequent falsification events. Since then, KINS has been developing a licensee safety culture oversight system, including a conceptual framework for oversight, prime focus areas for oversight, and specific details on regulatory expectations, all of which are based on a defence-in-depth (DiD) safety enhancement approach. The development and gathering of performance data related to the actual 'safety' of a nuclear power plant are needed to identify the relationship between safety culture and safety performance. The authors consider this study a pilot that contributes to verifying the construct validity of the model and the effectiveness of survey-based research. This is the first attempt to investigate the validity of the safety culture oversight model with empirical data obtained from a Korean nuclear power operating organization

  7. Data validation report for the 100-FR-3 Operable Unit, third round groundwater samples

    International Nuclear Information System (INIS)

    Ayres, J.M.

    1994-01-01

    Westinghouse-Hanford has requested that a minimum of 20% of the total number of Sample Delivery Groups be validated for the 100-FR-3 operable Unit Third Round Groundwater sampling investigation. Therefore, the data from the chemical analysis of 51 samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. The report is broken down into sections for each chemical analysis and radiochemical analysis type. Each section addresses the data package completeness, holding time adherence, instrument calibration and tuning acceptability, blank results, accuracy, precision, system performance, as well as the compound identification and quantitation. In addition, each section has an overall assessment and summary for the data packages reviewed for the particular chemical/radiochemical analyses. Detailed backup information is provided to the reader by SDG No. and sample number. For each data package, a matrix of chemical analyses per sample number is presented, as well as data qualification summaries

  8. A Study on the Construct Validity of Safety Culture Oversight Model for Nuclear Power Operating Organization

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Su Jin; Choi, Young Sung; Oh, Jang Jin [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    In Korea, the safety policy statement declared by the government in 1994 stressed the importance of safety culture, and licensees were encouraged to manage and conduct their own self-assessments. A change in the regulatory position on safety culture oversight was made after the station blackout (SBO) cover-up event at Kori unit 1 and several subsequent falsification events. Since then, KINS has been developing a licensee safety culture oversight system, including a conceptual framework for oversight, prime focus areas for oversight, and specific details on regulatory expectations, all of which are based on a defence-in-depth (DiD) safety enhancement approach. The development and gathering of performance data related to the actual 'safety' of a nuclear power plant are needed to identify the relationship between safety culture and safety performance. The authors consider this study a pilot that contributes to verifying the construct validity of the model and the effectiveness of survey-based research. This is the first attempt to investigate the validity of the safety culture oversight model with empirical data obtained from a Korean nuclear power operating organization.

  9. Sliding spool design for reducing the actuation forces in direct operated proportional directional valves: Experimental validation

    International Nuclear Information System (INIS)

    Amirante, Riccardo; Distaso, Elia; Tamburrano, Paolo

    2016-01-01

    Highlights: • An innovative procedure to design a commercial proportional directional valve is shown. • Experimental tests are performed to demonstrate the flow force reduction. • The design is improved by means of a previously made optimization procedure. • Great reduction in the flow forces without reducing the flow rate is demonstrated. - Abstract: This paper presents the experimental validation of a new methodology for the design of the spool surfaces of four way three position direct operated proportional directional valves. The proposed methodology is based on the re-design of both the compensation profile (the central conical surface of the spool) and the lateral surfaces of the spool, in order to reduce the flow forces acting on the spool and hence the actuation forces. The aim of this work is to extend the application range of these valves to higher values of pressure and flow rate, thus avoiding the employment of more expensive two stage configurations in the case of high-pressure conditions and/or flow rate. The paper first presents a theoretical approach and a general strategy for the sliding spool design to be applied to any four way three position direct operated proportional directional valve. Then, the proposed approach is experimentally validated on a commercially available valve using a hydraulic circuit capable of measuring the flow rate as well as the actuation force over the entire spool stroke. The experimental results, performed using both the electronic driver provided by the manufacturer and a manual actuation system, show that the novel spool surface requires remarkably lower actuation forces compared to the commercial configuration, while maintaining the same flow rate trend as a function of the spool position.

  10. Validity limits of fuel rod performance calculations from radiochemical data at operating LWRs

    International Nuclear Information System (INIS)

    Zaenker, H.; Nebel, D.

    1986-01-01

    There are various calculational models for the assessment of fuel rod performance on the basis of the activities of gaseous and volatile fission products in the reactor coolant. The most important condition for the applicability of these models is that a steady-state release of the fission products into the reactor coolant takes place. It is well known that the models are not applicable during or shortly after reactor transients. The fact that 'unsteady states' caused by the fuel defection processes themselves can also occur in rare cases during steady reactor operation has not been taken into account so far. A test of validity is suggested with the aid of which the applicability of the calculational models can be checked in any concrete case, so that misleading of the reactor operators by gross misinterpretation of the radiochemical data can be avoided. The criteria of applicability are the fission product total activity, the slope tan α in the relationship lg(R_i/B_i) ∝ lg λ_i for the gaseous and volatile fission products, and the activity of the nonvolatile isotope ²³⁹Np. (author)
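
    The slope criterion above reduces to a straight-line fit in log-log space. The sketch below, with made-up release-to-birth ratios and decay constants (hypothetical values, not data from the cited study), shows one way such a check could look; the steady-state slope expectation and the acceptance band are likewise only illustrative.

```python
import numpy as np

# Hypothetical release-to-birth ratios R_i/B_i for gaseous/volatile fission
# products measured in the coolant, and their decay constants lambda_i [1/s].
# Values are illustrative only, not taken from the cited study.
nuclides = {
    "Xe-133": (1.53e-6, 2.6e-4),   # (lambda_i, R_i/B_i)
    "Xe-135": (2.11e-5, 6.0e-5),
    "Kr-85m": (4.30e-5, 4.5e-5),
    "Kr-88":  (6.78e-5, 3.2e-5),
    "I-131":  (9.98e-7, 2.6e-4),
    "I-133":  (9.26e-6, 9.0e-5),
}

lam = np.array([v[0] for v in nuclides.values()])
rb  = np.array([v[1] for v in nuclides.values()])

# Fit lg(R_i/B_i) = a * lg(lambda_i) + c; tan(alpha) is the slope a.
slope, intercept = np.polyfit(np.log10(lam), np.log10(rb), 1)
print(f"slope tan(alpha) = {slope:.2f}")

# Diffusion-dominated steady-state release typically gives a slope near -0.5;
# a markedly different slope would flag an 'unsteady state' where the
# calculational models should not be applied (the band here is illustrative).
if not (-0.7 <= slope <= -0.3):
    print("Warning: release pattern inconsistent with steady-state assumption")
```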

  11. XBT profilers for operational purposes: application and validation in real exercises

    Directory of Open Access Journals (Sweden)

    Francisco Machín

    2008-12-01

    Full Text Available A methodology for recovering salinity from expendable bathythermograph (XBT) data is presented. The procedure exploits climatological relationships between temperature, salinity and depth to build regional characteristic curves by fitting a polynomial function that minimises both the variance of the residuals and the number of unknowns. Hence, salinity is computed and recovered as a function of temperature and depth. Empirical formulae are provided to recover the salinity field from temperature-depth measurements for the Cantabrian Sea and the Galician area. The method is validated and applied in the context of two marine rescue exercises carried out in the Bay of Biscay, close to the north coast of Spain, and in the Finisterre region, where a series of XBT and conductivity-temperature-depth (CTD) profiles were acquired during rapid surveys. The results agree reasonably well with independent data in terms of the spatial structure, with the largest errors in the upper 100 m of the ocean and at intermediate levels. The first diagnoses of the surface geostrophic velocity fields obtained through the salinity reconstruction are coherent and may help in rescue and safety operations during marine emergencies. Hence, we recommend that a technical unit consider this kind of expendable sampling strategy, with both XBT and XCTD data, during marine emergencies, since it provides useful and comprehensive information rapidly with minimal interference with formal operations on board search and rescue ships.
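
    The regional curve fitting described above amounts to a least-squares polynomial in temperature and depth. Below is a minimal sketch on synthetic data; the real characteristic curves are regional, built from historical CTD profiles, and the polynomial order is a per-region choice made by the authors, so every number here is assumed for illustration.

```python
import numpy as np

# Minimal sketch of the climatological T-S-z fit described above, with made-up
# data; the real curves are regional and built from historical CTD profiles.
rng = np.random.default_rng(0)
depth = rng.uniform(0, 800, 500)                               # m
temp = 16.0 - 0.012 * depth + rng.normal(0, 0.3, 500)          # degC, synthetic
sal = 35.6 + 0.02 * (temp - 13.0) - 1.0e-4 * depth + rng.normal(0, 0.02, 500)

# Low-order polynomial in (T, z): S ~ a0 + a1*T + a2*z + a3*T^2 + a4*T*z + a5*z^2
# (second order is just an example of the residual-variance/unknowns trade-off).
A = np.column_stack([np.ones_like(temp), temp, depth,
                     temp**2, temp * depth, depth**2])
coef, *_ = np.linalg.lstsq(A, sal, rcond=None)

def salinity_from_xbt(t, z):
    """Recover salinity from an XBT temperature-depth pair."""
    a = np.array([1.0, t, z, t * t, t * z, z * z])
    return float(a @ coef)

print(salinity_from_xbt(14.5, 150.0))
```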

  12. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  13. External gear pumps operating with non-Newtonian fluids: Modelling and experimental validation

    Science.gov (United States)

    Rituraj, Fnu; Vacca, Andrea

    2018-06-01

    External gear pumps are used in various industries to pump non-Newtonian viscoelastic fluids such as plastics, paints, inks, etc. For both design and analysis purposes, it is often of interest to understand the displacing action realized by the meshing of the gears and to describe the behavior of the leakages for this kind of pump. However, very limited work can be found in the literature about methodologies suitable for modelling such phenomena. This article describes a technique for modelling external gear pumps that operate with non-Newtonian fluids. In particular, it explains how the displacing action of the unit can be modelled using a lumped-parameter approach, which involves dividing the fluid domain into several control volumes and internal flow connections. This work is built upon the HYGESim simulation tool, conceived by the authors' research team over the last decade, which is here extended for the first time to the simulation of non-Newtonian fluids. The article also describes several comparisons between simulation results and experimental data obtained from numerous experiments performed for validation of the presented methodology. Finally, the operation of external gear pumps with fluids having different viscosity characteristics is discussed.
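
    As a rough illustration of the lumped-parameter approach mentioned above, each control volume can carry a pressure build-up equation fed by gap-type leakage connections whose effective viscosity follows a non-Newtonian (here power-law) model. All geometry, fluid constants and the integration step below are placeholders, not HYGESim values.

```python
import numpy as np

# Minimal sketch of the lumped-parameter idea: each tooth-space control volume
# gets a pressure build-up ODE, and leakage connections between volumes use an
# effective viscosity from a power-law (non-Newtonian) model.
BULK_MODULUS = 1.5e9        # Pa (assumed)
K_CONS, N_EXP = 10.0, 0.6   # power-law consistency and index (assumed)

def power_law_viscosity(shear_rate):
    """Effective viscosity mu_eff = K * gamma_dot^(n-1)."""
    return K_CONS * max(shear_rate, 1e-3) ** (N_EXP - 1.0)

def leakage_flow(dp, gap=20e-6, width=1e-2, length=5e-3, shear_rate=1e4):
    """Poiseuille-type gap flow driven by dp, using the effective viscosity."""
    mu = power_law_viscosity(shear_rate)
    return gap**3 * width * dp / (12.0 * mu * length)

def dpdt(volumes, dVdt, q_in, q_out):
    """Pressure build-up per control volume: dp/dt = B/V (Qin - Qout - dV/dt)."""
    return BULK_MODULUS / volumes * (q_in - q_out - dVdt)

# One explicit Euler step for two coupled volumes as an illustration.
p = np.array([1.0e5, 2.0e6])            # Pa
V = np.array([2.0e-6, 2.0e-6])          # m^3
dV = np.array([-1.0e-6, 1.0e-6])        # m^3/s (displacing action)
q_leak = leakage_flow(p[1] - p[0])
p += 1e-6 * dpdt(V, dV, np.array([q_leak, 0.0]), np.array([0.0, q_leak]))
print(p)
```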

  14. Intercomparison and validation of operational coastal-scale models, the experience of the project MOMAR.

    Science.gov (United States)

    Brandini, C.; Coudray, S.; Taddei, S.; Fattorini, M.; Costanza, L.; Lapucci, C.; Poulain, P.; Gerin, R.; Ortolani, A.; Gozzini, B.

    2012-04-01

    The need for regional governments to implement operational systems for the sustainable management of coastal waters, in order to meet the requirements imposed by legislation (e.g. EU directives such as the WFD, MSFD and BD, and the relevant national legislation), often leads to the implementation of coastal measurement networks and to the construction of computational models that surround and describe parts of regional seas without falling under the classic definition of regional/coastal models. Although these operational models may be structured to cover parts of different oceanographic basins, they can have considerable advantages and highlight relevant issues, such as the role of narrow channels, straits and islands in coastal circulation, both in physical and in biogeochemical processes, such as the exchange of water masses among basins. Two models of this type were built in the context of the cross-border European project MOMAR: an operational model of the Tuscan Archipelago sea and one of the Corsica coastal waters, both located between the Tyrrhenian and the Algerian-Ligurian-Provençal basins. Although these two models were based on different computer codes (MARS3D and ROMS), they have several elements in common, such as a 400 m resolution, boundary conditions from the same "father" model, and an important area of overlap, the Corsica channel, which has a key role in the exchange of water masses between the two oceanographic basins. In this work we present the results of the comparison of these two ocean forecasting systems in response to different weather and oceanographic forcing. In particular, we discuss aspects related to the validation of the two systems and a systematic comparison of the forecast/hindcast based on such hydrodynamic models, with regard both to operational models available at larger scale and to in-situ measurements made by fixed or mobile platforms. In this context we will also present the results of two oceanographic cruises in the

  15. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.
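
    One common way to carry out this kind of parameter exploration is to fit a statistical surrogate over sampled simulation runs and then locate where a predicted safety metric crosses its threshold. The sketch below uses a Gaussian-process surrogate from scikit-learn over a stand-in two-parameter "simulation"; it only illustrates the idea and is not the T-TSAFE analysis or the sequential design described in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Run the (here fake) simulation at sampled parameter points, fit a surrogate,
# and locate the region where a safety metric crosses its threshold.
rng = np.random.default_rng(1)
X = rng.uniform([0.0, 0.0], [10.0, 5.0], size=(40, 2))  # two controllable parameters

def fake_simulation(x):
    # Stand-in safety metric, e.g. minimum predicted separation in NM.
    return 6.0 - 0.4 * x[0] - 0.6 * x[1] + rng.normal(0, 0.1)

y = np.array([fake_simulation(x) for x in X])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), normalize_y=True)
gp.fit(X, y)

# Predict mean and uncertainty on a grid; the "safety boundary" is where the
# predicted metric equals the required separation (3 NM here, illustrative).
grid = np.array([[a, b] for a in np.linspace(0, 10, 50) for b in np.linspace(0, 5, 50)])
mean, std = gp.predict(grid, return_std=True)
boundary = grid[np.abs(mean - 3.0) < 0.05]
print(f"{len(boundary)} grid points near the estimated safety boundary, "
      f"mean predictive sigma {std.mean():.2f}")
```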

  16. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    Science.gov (United States)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware models are built from characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that the FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between the FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For
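
    The message-passing idea between flight software and software-based hardware models can be sketched very compactly. The toy below uses a plain in-process queue and an invented framing; NOS3 itself relies on its own middleware and on 42 for environment data, so every name and message format here is illustrative only.

```python
import struct
import queue

# Illustrative sketch: flight software sends a read request on a message bus
# and a magnetometer model answers with data derived from an environment
# simulation. The bus here is a plain in-process queue.
bus_to_model = queue.Queue()
bus_to_fsw = queue.Queue()

def environment_mag_field_nT(sim_time_s):
    """Stand-in for the environment simulation (the role 42 plays in NOS3)."""
    return (21000.0, -3500.0, 44000.0)

def magnetometer_model():
    """Consume one read request and reply with a packed measurement frame."""
    cmd = bus_to_model.get()
    if cmd["op"] == "READ":
        bx, by, bz = environment_mag_field_nT(cmd["time"])
        bus_to_fsw.put(struct.pack("<3f", bx, by, bz))

# 'Flight software' side: issues the same request it would send to hardware.
bus_to_model.put({"op": "READ", "time": 120.0})
magnetometer_model()
print(struct.unpack("<3f", bus_to_fsw.get()))
```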

  17. Data validation summary report for the 100-BC-5 Operable Unit Round 9 Groundwater Sampling. Revision 0

    International Nuclear Information System (INIS)

    Kearney, A.T.

    1996-03-01

    The information provided in this validation summary report includes chemical analyses of samples from the 100-BC-5 Operable Unit Round 9 groundwater sampling. Data from this sampling event and their related quality assurance (QA) samples were reviewed and validated in accordance with Westinghouse Hanford Company (WHC) guidelines at the requested level. Sample analyses included metals, general chemistry, and radiochemistry. Sixty metals samples were analyzed by Quanterra Environmental Services (QES) and Lockheed Analytical Services (LAS). The metals samples were validated using WHC protocols specified in Data Validation Procedures for Chemical Analyses. All qualifiers assigned to the metals data were based on this guidance. Table 1.1 lists the metals sample delivery groups (SDGs) that were validated for this sampling event

  18. Developing a model of competence in the operating theatre: psychometric validation of the perceived perioperative competence scale-revised.

    Science.gov (United States)

    Gillespie, Brigid M; Polit, Denise F; Hamlin, Lois; Chaboyer, Wendy

    2012-01-01

    This paper describes the development and validation of the Revised Perceived Perioperative Competence Scale (PPCS-R). There is a lack of psychometrically sound self-assessment tools to measure nurses' perceived competence in the operating room. Content validity was established by a panel of international experts, and the original 98-item scale was pilot tested with 345 nurses in Queensland, Australia. Following the removal of several items, a national sample that included all 3209 nurses who were members of the Australian College of Operating Room Nurses was surveyed using the 94-item version. Psychometric testing assessed content validity using exploratory factor analysis, internal consistency using Cronbach's alpha, and construct validity using the "known groups" technique. During item reduction, several preliminary factor analyses were performed on two random halves of the sample (n=550). Usable data for psychometric assessment were obtained from 1122 nurses. The original 94-item scale was reduced to 40 items. The final factor analysis using the entire sample resulted in a 40-item, six-factor solution. Cronbach's alpha for the 40-item scale was .96. Construct validation demonstrated significant differences in perceived competence scores relative to years of operating room experience and receipt of specialty education. On the basis of these results, the psychometric properties of the PPCS-R were considered encouraging. Further testing of the tool in different samples of operating room nurses is necessary to enable cross-cultural comparisons. Copyright © 2011 Elsevier Ltd. All rights reserved.
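
    Of the psychometric checks listed above, internal consistency and the "known groups" comparison are the simplest to make concrete. The sketch below computes Cronbach's alpha and a group contrast on synthetic Likert-type responses; it is not the PPCS-R data, item set or scoring.

```python
import numpy as np

# Synthetic Likert-type responses (items scored 1-5); not the PPCS-R data.
rng = np.random.default_rng(2)
n_respondents, n_items = 200, 40
latent = rng.normal(0, 1, n_respondents)
items = np.clip(np.round(3 + latent[:, None] + rng.normal(0, 0.8, (n_respondents, n_items))), 1, 5)

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# 'Known groups' check: experienced respondents should score higher on average.
experienced = rng.random(n_respondents) < 0.5
totals = items.sum(axis=1) + 5 * experienced      # synthetic group effect
print(totals[experienced].mean(), totals[~experienced].mean())
```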

  19. On the factorial and construct validity of the Intrinsic Motivation Inventory: conceptual and operational concerns.

    Science.gov (United States)

    Markland, D; Hardy, L

    1997-03-01

    The Intrinsic Motivation Inventory (IMI) has been gaining acceptance in the sport and exercise domain since the publication of research by McAuley, Duncan, and Tammen (1989) and McAuley, Wraith, and Duncan (1991), which reported confirmatory support for the factorial validity of a hierarchical model of intrinsic motivation. Authors of the present study argue that the results of these studies did not conclusively support the hierarchical model and that the model did not accurately reflect the tenets of cognitive evaluation theory (Deci & Ryan, 1985) from which the IMI is drawn. It is also argued that a measure of perceived locus of causality is required to model intrinsic motivation properly. The development of a perceived locus of causality for exercise scale is described, and alternative models, in which perceived competence and perceived locus of causality are held to have causal influences on intrinsic motivation, are compared with an oblique confirmatory factor analytic model in which the constructs are held at the same conceptual level. Structural equation modeling showed support for a causal model in which perceived locus of causality mediates the effects of perceived competence on pressure-tension, interest-enjoyment, and effort-importance. It is argued that conceptual and operational problems with the IMI, as currently used, should be addressed before it becomes established as the instrument of choice for assessing levels of intrinsic motivation.

  20. Two-phase 1D+1D model of a DMFC: development and validation on extensive operating conditions range

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R.; Parenti, D. [Dipartimento di Energetica, Politecnico di Milano (Italy)

    2008-02-15

    A two-phase 1D+1D model of a direct methanol fuel cell (DMFC) is developed, considering the overall mass balance, methanol transport in the gas phase through the anode diffusion layer, and methanol and water crossover. The model is quantitatively validated over an extensive range of operating conditions, comprising 24 polarisation curves. The model accurately reproduces DMFC performance in the validation range and, outside this range, it is able to predict values under feasible operating conditions. Finally, the estimates of the methanol crossover flux are qualitatively and quantitatively similar to experimental measurements, and the trends of the main local quantities are consistent with results obtained with more complex models. (Abstract Copyright [2008], Wiley Periodicals, Inc.)

  1. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    Science.gov (United States)

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of commercial and non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handling chromatographic data files. The approach can be extended in its functionality, with facilities to detect baselines, to detect, integrate and identify peaks, and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied to the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system because the Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions; they can be published under open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.
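
    To make the processing chain named above (noise filtering, baseline handling, peak detection and integration) concrete, here is a generic Python sketch on a synthetic total-ion chromatogram. It is not OpenChrom's Java API, only an illustration of the kind of operations such software exposes.

```python
import numpy as np
from scipy.signal import find_peaks, savgol_filter

# Synthetic total-ion chromatogram: drifting background, three peaks, noise.
rng = np.random.default_rng(3)
t = np.linspace(0, 30, 3000)                           # retention time, min
baseline = 50 + 2 * t                                  # drifting background
signal = sum(a * np.exp(-((t - c) / 0.08) ** 2)
             for a, c in [(900, 5.2), (400, 12.7), (650, 21.3)])
tic = baseline + signal + rng.normal(0, 8, t.size)

smoothed = savgol_filter(tic, window_length=31, polyorder=3)   # de-noise
corrected = smoothed - np.percentile(smoothed, 10)             # crude baseline cut
peaks, props = find_peaks(corrected, height=100, prominence=80)

for idx in peaks:
    area = np.trapz(corrected[idx - 50:idx + 50], t[idx - 50:idx + 50])
    print(f"peak at {t[idx]:.2f} min, height {corrected[idx]:.0f}, area {area:.0f}")
```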

  2. Assessment and validation of CT scanogram to compare per-operative and post-operative mechanical axis after navigated total knee replacement

    Science.gov (United States)

    Jain, Sunil

    2008-01-01

    Our objective was to assess and validate the low-dose computed tomography (CT) scanogram as a post-operative imaging modality to measure the mechanical axis after navigated total knee replacement. A prospective study was performed to compare the intra-operative and post-operative mechanical axis after navigated total knee replacements. All consecutive patients who underwent navigated total knee replacement between May and December 2006 were included. The final intra-operative axis was recorded, and post-operatively a CT scanogram of the lower limbs was performed. The mechanical axis was measured and compared against the intra-operative measurement. There were 15 patients ranging in age from 57 to 80 (average 70) years. The average final intra-operative axis was 0.56° varus (4° varus to 1.5° valgus) and the post-operative CT scanogram axis was 0.52° varus (3.1° varus to 1.8° valgus). The average deviation from the final axes to the CT scanogram axes was 0.12° valgus, with a correlation coefficient of 0.9. Our study suggests that the CT scanogram is an imaging modality with reasonable accuracy for measuring the mechanical axis despite a significantly lower radiation dose. It also confirms a high level of correlation between the intra-operative and post-operative mechanical axis after navigated total knee replacement. PMID:18696064

  3. Technical Optimization of Cross-Platform Software Development Process quality and Usability of 3rd-Party Tools

    Directory of Open Access Journals (Sweden)

    Yevgueny Kondratyev

    2016-03-01

    Full Text Available The article presents a developer's point of view on minimizing the time needed to create, upgrade and fix (post-release) applications and components targeted at multiple operating systems, while keeping end-product quality and computational performance high. The non-uniformity of analogous tools and components available on different platforms has a strong impact on developer productivity. In particular, differences in third-party component interfaces, versions and the quality of individual functions cause frequent switches of the developer's attention to issues not connected, in principle, with the target project. While the loss of development performance due to such attention switching is a rather subjective quantity, at least the physical time spent compensating for misbehaving tools and components, and on routine tool configuration, is measurable. The main thesis examined is therefore whether, and by how much, the continuity of the development process can be increased by technical improvements alone. In addition, a novel experimental tool for interactive code execution is described, which allows deep changes to be made in a working program without restarting it. The question under investigation is how to minimize the duration of the program-build-test-correct loop and of runs of small code fragments, in particular by improving the debugging workflow through the combination of an interactive editor and a debugger.

  4. Three-dimensional Cross-Platform Planning for Complex Spinal Procedures: A New Method Adaptive to Different Navigation Systems.

    Science.gov (United States)

    Kosterhon, Michael; Gutenberg, Angelika; Kantelhardt, Sven R; Conrad, Jens; Nimer Amr, Amr; Gawehn, Joachim; Giese, Alf

    2017-08-01

    A feasibility study. To develop a method based on the DICOM standard which transfers complex 3-dimensional (3D) trajectories and objects from external planning software to any navigation system for planning and intraoperative guidance of complex spinal procedures. There have been many reports about navigation systems with embedded planning solutions but only few on how to transfer planning data generated in external software. Patients' computerized tomography and/or magnetic resonance volume data sets of the affected spinal segments were imported into Amira software, reconstructed into 3D images and fused with magnetic resonance data for soft-tissue visualization, resulting in a virtual patient model. Objects needed for surgical plans or surgical procedures, such as trajectories, implants or surgical instruments, were either digitally constructed or computerized tomography scanned and virtually positioned within the 3D model as required. As the crucial step of this method, these objects were fused with the patient's original diagnostic image data, resulting in a single DICOM sequence containing all preplanned information necessary for the operation. By this step it was possible to import complex surgical plans into any navigation system. We applied this method not only to intraoperatively adjustable implants and objects under experimental settings, but also planned and successfully performed surgical procedures, such as the percutaneous lateral approach to the lumbar spine following preplanned trajectories and a thoracic tumor resection including intervertebral body replacement, using an optical navigation system. To demonstrate the versatility and compatibility of the method with an entirely different navigation system, virtually preplanned lumbar transpedicular screw placement was performed with a robotic guidance system. The presented method not only allows virtual planning of complex surgical procedures, but to export objects and surgical plans to any navigation or

  5. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    International Nuclear Information System (INIS)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X

    2015-01-01

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical applications. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia’s CUDA environment. This limited their code portability to other platforms, hindering the introduction of GPU-based MC dose engines to clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with standard analogue simulation scheme for photon transport and Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on NVidia GPU platform, for a 15MeV electron beam and a 6MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreements were observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences at regions within 10% isodose line for electron beam cases and ∼0.15%–0.17% for photon beam cases. It took oclMC 3–4 sec to perform transport simulation for electron beam on NVidia Titan GPU and 35–51 sec for photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to the differences in both physics model and development environment, which is considered not significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully runs on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross-platform

  6. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2015-06-15

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical applications. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia’s CUDA environment. This limited their code portability to other platforms, hindering the introduction of GPU-based MC dose engines to clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with standard analogue simulation scheme for photon transport and Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on NVidia GPU platform, for a 15MeV electron beam and a 6MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreements were observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences at regions within 10% isodose line for electron beam cases and ∼0.15%–0.17% for photon beam cases. It took oclMC 3–4 sec to perform transport simulation for electron beam on NVidia Titan GPU and 35–51 sec for photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to the differences in both physics model and development environment, which is considered not significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully runs on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross-platform
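
    As a toy illustration of the analogue photon transport scheme both records describe, the sketch below tracks photons through a one-dimensional homogeneous water slab, sampling free paths from a total attenuation coefficient and branching between Compton scattering and photoelectric absorption. The coefficients are rough illustrative numbers, and real engines such as gDPM or oclMC track energy, direction and secondary electrons in 3D on the GPU.

```python
import numpy as np

# Toy analogue photon transport in a 1D homogeneous water slab.
rng = np.random.default_rng(4)
MU_COMPTON, MU_PHOTO = 0.0700, 0.0005    # 1/cm, rough values near 1 MeV in water
MU_TOTAL = MU_COMPTON + MU_PHOTO
SLAB_CM = 30.0

def transport_one_photon():
    depth = 0.0
    while True:
        depth += -np.log(rng.random()) / MU_TOTAL       # sampled free path
        if depth > SLAB_CM:
            return "transmitted"
        if rng.random() < MU_PHOTO / MU_TOTAL:
            return "absorbed"
        # Compton event: a full code would sample the new energy and angle
        # (Klein-Nishina) here; this toy simply continues along the same axis.

results = [transport_one_photon() for _ in range(100000)]
print({k: results.count(k) for k in set(results)})
```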

  7. [Validation of knowledge acquired from experience: opportunity or threat for nurses working in operating theatres?].

    Science.gov (United States)

    Chauvat-Bouëdec, Cécile

    2005-06-01

    Law no. 2002-73 of 17 January 2002, the so-called law of social modernisation, reformed continuing professional training in France. It established a new system of professional certification, the validation of knowledge acquired from experience (VAE in French). Since 2003, the Health Ministry has been studying a project to set up the VAE for the health professions, among which, in particular, the profession of state registered nurse working in operating theatres (IBODE in French). A state diploma sanctions the training required to practise this profession. In the future, the VAE will open a new access route to this diploma. Does this evolution constitute a threat for the profession, and a risk or an opportunity for individuals? The aim of this thesis is to characterise the impact of the VAE on the IBODE profession and its current training system. Two sociological and educational approaches are supported by a field survey. A historical overview of the IBODE profession traces the evolution of caring practices and presents the evolution of the training systems. A sociological approach analyses the vocational focus of the IBODE in the light of functionalist theories. The study therefore suggests that the VAE will have no consequences for the vocational focus of the IBODE. The VAE is then examined from an educational perspective within the context of continuing professional training. The topics to which it could apply and the resistance it causes are studied. Some examples are taken from other Ministries. This study shows that the VAE requires an adaptation of the training centres. The VAE constitutes a genuine opportunity for the IBODE profession. However, to manage its introduction in a delicate human context, field professionals should be involved as early as possible in the reflection initiated by the Ministry.

  8. Computational modelling of an operational wind turbine and validation with LIDAR

    Science.gov (United States)

    Creech, Angus; Fruh, Wolf-Gerrit; Clive, Peter

    2010-05-01

    We present a computationally efficient method to model the interaction of wind turbines with the surrounding flow, where the interaction provides information on the power generation of the turbine and the wake generated behind it. The turbine representation is based on the principle of an actuator volume, whereby the energy extraction and the balancing forces on the fluid are formulated as body forces, which avoids the extremely high computational costs of boundary conditions and forces. Depending on the turbine information available, those forces can be derived either from published turbine performance specifications or from the rotor and blade design. This turbine representation is then coupled to a Computational Fluid Dynamics package, in this case the hr-adaptive finite-element code Fluidity from Imperial College, London. Here we present a simulation of an operational 950 kW NEG Micon NM54 wind turbine installed in the west of Scotland. The calculated wind is compared with LIDAR measurements using a Galion LIDAR from SgurrEnergy. The computational domain extends over an area of 6 km by 6 km and a height of 750 m, centred on the turbine. The lower boundary includes the orography of the terrain and surface roughness values representing the vegetation, some forested areas and some grassland. The boundary conditions on the sides are relaxed Dirichlet conditions, relaxed to an observed prevailing wind speed and direction. Within instrumental errors and model limitations, the overall flow field in general, and the wake behind the turbine in particular, show a very high degree of agreement, demonstrating the validity and value of this approach. The computational costs of this approach are such that it is possible to extend this single-turbine example to a full wind farm, as the number of required mesh nodes is given by the domain and increases only linearly with the number of turbines
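
    A rough sketch of the actuator-volume principle follows: the rotor thrust, taken from a thrust-coefficient curve, is distributed over the cells the rotor occupies as a body force instead of resolving the blades. All numbers and the C_T curve are generic placeholders, not the NM54 specification or Fluidity's implementation.

```python
import numpy as np

# Distribute rotor thrust over the actuator-volume cells as a body force.
RHO = 1.225                        # air density, kg/m^3
ROTOR_AREA = np.pi * 27.0**2       # m^2 (54 m diameter, placeholder)

def thrust_coefficient(u):
    """Placeholder C_T(U) curve; a real model would interpolate published data."""
    return np.clip(0.9 - 0.03 * (u - 8.0), 0.2, 0.9)

def cell_body_force(u_cells, cell_volumes, actuator_volume):
    """Axial force per cell: rotor thrust weighted by each cell's volume share."""
    u_ref = u_cells.mean()                                  # simple reference speed
    thrust = 0.5 * RHO * ROTOR_AREA * thrust_coefficient(u_ref) * u_ref**2
    return -thrust * cell_volumes / actuator_volume

u_cells = np.array([7.8, 8.1, 8.3, 8.0])          # wind speed in actuator cells, m/s
vols = np.array([120.0, 130.0, 125.0, 118.0])     # cell volumes, m^3
print(cell_body_force(u_cells, vols, vols.sum()))  # N per cell (negative = drag on flow)
```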

  9. SAXSEV 2.1 CROSS-PLATFORM APPLICATION FOR DATA ANALYSIS OF SMALL-ANGLE X-RAY SCATTERING FROM POLYDISPERSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. V. Kuchko

    2015-03-01

    Full Text Available The present paper discusses the development and implementation of a cross-platform application with a graphical user interface for estimating the particle volume fraction distribution function and fitting the specific surface area to this distribution. SAXSEV implements the method of statistical regularization for ill-posed mathematical problems, using the Numpy, Scipy and Matplotlib libraries. The main features of this software application are the ability to adjust the argument grid of the desired function and the ability to select the optimal value of the regularization parameter. This parameter is selected by several specific criteria and one common criterion. The software application consists of modules written in Python 3, combined through a common interface based on the Tkinter library. The current version, SAXSEV 2.1, was tested on Windows XP/Vista/7/8 and Ubuntu 14.1. SAXSEV 2.1 was used successfully in a study of the effectiveness of the statistical regularization method for analyzing dispersed systems by SAXS, in research on powders consisting of nanoparticles, and on composite materials with nanoparticle inclusions.
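
    The core inversion behind such a tool can be written compactly: the scattered intensity is a kernel-smeared image of the size distribution, I = A f, and the ill-posed inversion is stabilized by a regularization term whose weight has to be selected. The spherical form-factor kernel, smoothness operator and weight sweep below are simplified stand-ins, not SAXSEV's actual criteria.

```python
import numpy as np

# Synthetic small-angle scattering inversion with Tikhonov-style regularization.
q = np.linspace(0.05, 2.0, 120)          # scattering vector, 1/nm
r = np.linspace(1.0, 30.0, 60)           # particle radius grid, nm

def sphere_form_factor(q, r):
    x = np.outer(q, r)
    return (3 * (np.sin(x) - x * np.cos(x)) / x**3) ** 2

A = sphere_form_factor(q, r) * (r**3)    # kernel weighted by particle volume
A /= A.max()                             # normalize kernel scale for readability
f_true = np.exp(-0.5 * ((r - 10.0) / 2.0) ** 2)
I_obs = A @ f_true + np.random.default_rng(5).normal(0, 0.01 * (A @ f_true).max(), q.size)

def regularized_solution(alpha):
    # minimize ||A f - I||^2 + alpha ||L f||^2 with a second-difference smoother L
    L = np.diff(np.eye(r.size), 2, axis=0)
    lhs = A.T @ A + alpha * L.T @ L
    return np.linalg.solve(lhs, A.T @ I_obs)

for alpha in (1e-4, 1e-2, 1.0):
    f = regularized_solution(alpha)
    print(alpha, np.linalg.norm(A @ f - I_obs))
```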

  10. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out automatic navigation, guidance, flight controls, and electronic displays research. The techniques developed for time and cost reduction include automatic documentation aids, an automatic software configuration, and an all software generation and validation system.

  11. The Copernicus S5P Mission Performance Centre / Validation Data Analysis Facility for TROPOMI operational atmospheric data products

    Science.gov (United States)

    Compernolle, Steven; Lambert, Jean-Christopher; Langerock, Bavo; Granville, José; Hubert, Daan; Keppens, Arno; Rasson, Olivier; De Mazière, Martine; Fjæraa, Ann Mari; Niemeijer, Sander

    2017-04-01

    Sentinel-5 Precursor (S5P), to be launched in 2017 as the first atmospheric composition satellite of the Copernicus programme, carries as payload the TROPOspheric Monitoring Instrument (TROPOMI) developed by The Netherlands in close cooperation with ESA. Designed to measure Earth radiance and solar irradiance in the ultraviolet, visible and near infrared, TROPOMI will provide Copernicus with observational data on atmospheric composition at unprecedented geographical resolution. The S5P Mission Performance Center (MPC) provides an operational service-based solution for various QA/QC tasks, including the validation of S5P Level-2 data products and the support to algorithm evolution. Those two tasks are to be accomplished by the MPC Validation Data Analysis Facility (VDAF), one MPC component developed and operated at BIRA-IASB with support from S&T and NILU. The routine validation to be ensured by VDAF is complemented by a list of validation AO projects carried out by ESA's S5P Validation Team (S5PVT), with whom interaction is essential. Here we will introduce the general architecture of VDAF, its relation to the other MPC components, the generic and specific validation strategies applied for each of the official TROPOMI data products, and the expected output of the system. The S5P data products to be validated by VDAF are diverse: O3 (vertical profile, total column, tropospheric column), NO2 (total and tropospheric column), HCHO (tropospheric column), SO2 (column), CO (column), CH4 (column), aerosol layer height and clouds (fractional cover, cloud-top pressure and optical thickness). Starting from a generic validation protocol meeting community-agreed standards, a set of specific validation settings is associated with each data product, as well as the appropriate set of Fiducial Reference Measurements (FRM) to which it will be compared. VDAF collects FRMs from ESA's Validation Data Centre (EVDC) and from other sources (e.g., WMO's GAW, NDACC and TCCON). Data

  12. The CMEMS-Med-MFC-Biogeochemistry operational system: implementation of NRT and Multi-Year validation tools

    Science.gov (United States)

    Salon, Stefano; Cossarini, Gianpiero; Bolzon, Giorgio; Teruzzi, Anna

    2017-04-01

    The Mediterranean Monitoring and Forecasting Centre (Med-MFC) is one of the regional production centres of the EU Copernicus Marine Environment Monitoring Service (CMEMS). Med-MFC manages a suite of numerical model systems for the operational delivery of the CMEMS products, providing continuous monitoring and forecasting of the Mediterranean marine environment. The CMEMS products of fundamental biogeochemical variables (chlorophyll, nitrate, phosphate, oxygen, phytoplankton biomass, primary productivity, pH, pCO2) are organised as gridded datasets and are available at the marine.copernicus.eu web portal. Quantitative estimates of CMEMS product accuracy are prerequisites to release reliable information to intermediate users, end users and other downstream services. In particular, validation activities aim to deliver accuracy information for the model products and to serve as long-term monitoring of the performance of the modelling systems. The quality assessment of model output is implemented using a multiple-stage approach, basically inspired by the classic "GODAE 4 classes" metrics and criteria (consistency, quality, performance and benefit). Firstly, pre-operational runs qualify the operational model system against historical data, also providing a verification of the improvements of the new model system release with respect to the previous version. Then, the near-real-time (NRT) validation aims at delivering a sustained on-line skill assessment of the model analysis and forecast, relying on the relevant observations available in NRT (e.g. in situ, Bio-Argo and satellite observations). NRT validation is run on a weekly basis and the results are published on the MEDEAF web portal (www.medeaf.inogs.it). On a quarterly basis, the integration of the NRT validation activities delivers a comprehensive view of the accuracy of the model forecast through the official CMEMS validation webpage. Multi-Year production (e.g. reanalysis runs) follows a similar procedure, and the
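
    The kind of skill metric behind such a class-based check is straightforward to state: co-located model and observed values are reduced to bias, root-mean-square difference and correlation. The sketch below does this for synthetic surface chlorophyll in log space (chlorophyll being roughly log-normal); the operational system of course works on gridded model output matched to satellite, in situ and BGC-Argo observations.

```python
import numpy as np

# Synthetic co-located observations and model values of surface chlorophyll.
rng = np.random.default_rng(6)
obs = np.exp(rng.normal(np.log(0.12), 0.4, 1000))           # mg chl / m^3
model = obs * np.exp(rng.normal(0.05, 0.25, 1000))          # model with slight bias

log_o, log_m = np.log10(obs), np.log10(model)               # compare in log space
bias = np.mean(log_m - log_o)
rmsd = np.sqrt(np.mean((log_m - log_o) ** 2))
corr = np.corrcoef(log_m, log_o)[0, 1]
print(f"log10 bias {bias:+.3f}, RMSD {rmsd:.3f}, r {corr:.2f}")
```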

  13. Toward feasible, valid, and reliable video-based assessments of technical surgical skills in the operating room

    DEFF Research Database (Denmark)

    Aggarwal, R.; Grantcharov, T.; Moorthy, K.

    2008-01-01

    Objective: To determine the feasibility, validity, inter-rater, and intertest reliability of 4 previously published video-based rating scales for technical skills assessment on a benchmark laparoscopic procedure. Summary Background Data: Assessment of technical skills is crucial to the demonstration and maintenance of competent healthcare practitioners. Traditional assessment methods are prone to subjectivity through a lack of proven validity and reliability. Methods: Nineteen surgeons (6 novice and 13 experienced) performed a median of 2 laparoscopic cholecystectomies each (range 1-5) on 53 … (.72). Conclusions: Video-based technical skills evaluation in the operating room is feasible, valid and reliable. Global rating scales hold promise for summative assessment, though further work is necessary to elucidate the value of procedural rating scales. Publication date: 2008/2

  14. Validation of the IAEA-WIMSD library for the LOADF code on operation transients at the Krsko Power Plant

    International Nuclear Information System (INIS)

    Trkov, A.; Kurincic, B.

    2002-01-01

    The LOADF package for reactor core operation monitoring has been tested with the new IAEA-WIMSD-69 library. A transient involving a power reduction from full power to 80% power was analysed. Predicted critical boron concentrations and control rod positions were compared against measured values. The results confirm that transient prediction with the new library is at least as good as, or better than, with the previously validated library. (author)

  15. Analytical validation of operator actions in case of primary to secondary leakage for VVER-1000/V320

    Energy Technology Data Exchange (ETDEWEB)

    Andreeva, M., E-mail: m_andreeva@inrne.bas.bg; Groudev, P., E-mail: pavlinpg@inrne.bas.bg; Pavlova, M., E-mail: pavlova@inrne.bas.bg

    2015-12-15

    Highlights: • We validate operator actions in case of primary to secondary leakage. • We perform four scenarios related to an SGTR accident for VVER-1000/V320. • The reference power plant for the analyses is Unit 6 at Kozloduy NPP. • The RELAP5/MOD 3.2 computer code is used in performing the analyses. • The analyses confirm the effectiveness of operator actions during PRISE. - Abstract: This paper presents the results of the analytical validation of operator actions in case of a “Steam Generator Tube Rupture” (SGTR) for VVER-1000/V320 units at Kozloduy Nuclear Power Plant (KNPP), carried out during the development of Symptom Based Emergency Operating Procedures (SB EOPs) for this plant. The purpose of the analyses is to demonstrate the ability to terminate primary to secondary leakage and to indicate an effective strategy for preventing secondary leakage to the environment, and in this way to prevent radiological release to the environment. Following depressurization and cooldown of the reactor coolant system (RCS) with isolation of the affected steam generator (SG), these analyses validate options for post-SGTR cooldown by: • back-up filling of the ruptured SG; • using the letdown system in the affected SG; and • opening the Fast Acting Isolation Valve (FAIV) and using the Steam Dump Facility to the Condenser (BRU-K). The results of the thermal-hydraulic analyses have been used to assist KNPP specialists in the analytical validation of EOPs. The RELAP5/MOD3.2 computer code has been used for the analyses with a VVER-1000 Nuclear Power Plant (NPP) model. A model of VVER-1000 based on Unit 6 of Kozloduy NPP has been developed for the thermal-hydraulics code RELAP5/MOD3.2 at the Institute for Nuclear Research and Nuclear Energy – Bulgarian Academy of Sciences (INRNE-BAS). This paper was made possible through the participation of leading specialists from KNPP.

  16. Cross platform analysis of methylation, miRNA and stem cell gene expression data in germ cell tumors highlights characteristic differences by tumor histology

    International Nuclear Information System (INIS)

    Poynter, Jenny N.; Bestrashniy, Jessica R. B. M.; Silverstein, Kevin A. T.; Hooten, Anthony J.; Lees, Christopher; Ross, Julie A.; Tolar, Jakub

    2015-01-01

    Alterations in methylation patterns, miRNA expression, and stem cell protein expression occur in germ cell tumors (GCTs). Our goal is to integrate molecular data across platforms to identify molecular signatures in the three main histologic subtypes of Type I and Type II GCTs (yolk sac tumor (YST), germinoma, and teratoma). We included 39 GCTs and 7 paired adjacent tissue samples in the current analysis. Molecular data available for analysis include DNA methylation data (Illumina GoldenGate Cancer Methylation Panel I), miRNA expression (NanoString nCounter miRNA platform), and stem cell factor expression (SABiosciences Human Embryonic Stem Cell Array). We evaluated the cross-platform correlations of the data features using the Maximum Information Coefficient (MIC). In analyses of the individual datasets, differences were observed by tumor histology. Germinomas had higher expression of transcription factors maintaining stemness, while YSTs had higher expression of cytokines, endoderm and endothelial markers. We also observed differences in miRNA expression, with miR-371-5p, miR-122, miR-302a, miR-302d, and miR-373 showing elevated expression in one or more histologic subtypes. Using the MIC, we identified correlations across the data features, including six major hubs with higher expression in YST (LEFTY1, LEFTY2, miR-302b, miR-302a, miR-126, and miR-122) compared with other GCTs. While the prognosis for GCTs is overall favorable, many patients experience resistance to chemotherapy, relapse and/or long-term adverse health effects following treatment. Targeted therapies, based on integrated analyses of molecular tumor data such as that presented here, may provide a way to secure high cure rates while reducing unintended health consequences

  17. Validity of the adiabatic rotational model in the case of the Hexadecupole operator

    International Nuclear Information System (INIS)

    Sharma, S.K.; Raina, P.K.

    1989-06-01

    The validity of the rotational model expressions relating the E4 transition probabilities and the intrinsic hexadecupole moments is examined in the light of the recent data on the inelastic electron-scattering form factors for the 0+ → 4+ transitions in some medium-mass nuclei. (author). 14 refs, 2 figs, 2 tabs

  18. Towards an Operational Definition of Effective Co-Teaching: Instrument Development, Validity, and Reliability

    Science.gov (United States)

    La Monte, Michelle Evonne

    2012-01-01

    This study focused on developing a valid and reliable instrument that can identify not only successful co-teaching, but also the professional development needs of co-teachers and their administrators in public schools. Two general questions about the quality of co-teaching were addressed in this study: (a) How well did descriptors within each of…

  19. 78 FR 32255 - HHS-Operated Risk Adjustment Data Validation Stakeholder Meeting

    Science.gov (United States)

    2013-05-29

    ... States'') is assigned a host (in accordance with the Department Foreign Visitor Management Policy... general public. Visitors to the complex are required to show a valid U.S. Government issued photo... lobby, and the cafeteria. If a visitor is found outside of those areas without proper escort, they may...

  20. Modeling and validating the grabbing forces of hydraulic log grapples used in forest operations

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux; Lihai Wang

    2003-01-01

    The grabbing forces of log grapples were modeled and analyzed mathematically under operating conditions when grabbing logs from compact log piles and from bunch-like log piles. The grabbing forces are closely related to the structural parameters of the grapple, the weight of the grapple, and the weight of the log grabbed. An operational model grapple was designed and...

  1. Self-management interventions: Proposal and validation of a new operational definition

    NARCIS (Netherlands)

    Jonkman, Nini H; Schuurmans, Marieke J; Jaarsma, Tiny; Shortridge-Baggett, Lillie M; Hoes, Arno W; Trappenburg, Jaap C A

    2016-01-01

    OBJECTIVES: Systematic reviews on complex interventions like self-management interventions often do not explicitly state an operational definition of the intervention studied, which may impact the review's conclusions. This study aimed to propose an operational definition of self-management

  2. Self-management interventions: Proposal and validation of a new operational definition

    NARCIS (Netherlands)

    Jonkman, N.H.; Schuurmans, Marieke J.; Jaarsma, Tiny; Shortbridge-Baggett, Lillie M.; Hoes, Arno W.; Trappenburg, Jaap C A

    2016-01-01

    Objectives: Systematic reviews on complex interventions like self-management interventions often do not explicitly state an operational definition of the intervention studied, which may impact the review's conclusions. This study aimed to propose an operational definition of self-management

  3. External Validation of a Decision Tool To Guide Post-Operative Management of Patients with Secondary Peritonitis.

    Science.gov (United States)

    Atema, Jasper J; Ram, Kim; Schultz, Marcus J; Boermeester, Marja A

    Timely identification of patients in need of an intervention for abdominal sepsis after initial surgical management of secondary peritonitis is vital but complex. The aim of this study was to validate a decision tool for this purpose and to evaluate its potential to guide post-operative management. A prospective cohort study was conducted on consecutive adult patients undergoing surgery for secondary peritonitis in a single hospital. Assessments using the decision tool, based on one intra-operative and five post-operative variables, were performed on the second and third post-operative days and when the patients' clinical status deteriorated. Scores were compared with the clinical reference standard of persistent sepsis based on the clinical course or findings at imaging or surgery. Additionally, the potential of the decision tool to guide management in terms of diagnostic imaging in three previously defined score categories (low, intermediate, and high) was evaluated. A total of 161 assessments were performed in 69 patients. The majority of cases of secondary peritonitis (68%) were caused by perforation of the gastrointestinal tract. Post-operative persistent sepsis occurred in 28 patients. The discriminative capacity of the decision tool score was fair (area under the receiver operating characteristic curve = 0.79). The incidence of persistent sepsis differed significantly between the three score categories. In patients with secondary peritonitis, the decision tool score thus predicts with fair accuracy whether persistent sepsis is present.
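
    The two headline checks above, discrimination (ROC AUC) and incidence per score category, are easy to reproduce on synthetic data. The score range, cut-offs and outcome model in the sketch below are invented for illustration and are not those of the published decision tool.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic decision-tool scores and persistent-sepsis outcomes.
rng = np.random.default_rng(7)
n = 161
score = rng.integers(0, 9, n)                        # decision-tool score (invented range)
p_sepsis = 1 / (1 + np.exp(-(score - 4.5)))          # synthetic score-outcome relationship
sepsis = rng.random(n) < p_sepsis

print(f"AUC = {roc_auc_score(sepsis, score):.2f}")

categories = np.digitize(score, bins=[3, 6])         # low / intermediate / high (invented cut-offs)
for cat, name in enumerate(["low", "intermediate", "high"]):
    mask = categories == cat
    print(f"{name}: {sepsis[mask].mean():.0%} persistent sepsis (n={mask.sum()})")
```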

  4. Development and validation of a heuristic model for evaluation of the team performance of operators in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Byun, Seong Nam; Lee, Dhong Hoon

    2011-01-01

    Highlights: → We develop an estimation model for evaluation of the team performance in the MCR. → To build the model, we extract team performance factors by reviewing the literature and identifying behavior markers. → We validate that the model is adaptable to the advanced MCR of nuclear power plants. → As a result, we find that the model is a systematic and objective way to measure team performance. - Abstract: The global concerns about safety in the digital technology of the main control room (MCR) are growing as domestic and foreign nuclear power plants are developed with computerized control facilities and human-system interfaces. In a narrow space, the digital technology contributes to a control room environment which can facilitate the acquisition of all the information needed for operation. Thus, although the individual performance in the advanced MCR can be further improved, there is a limit to expecting an improvement in team performance. Team performance depends on organic coherence as a whole team rather than on the knowledge and skill of an individual operator. Moreover, good team performance improves communication between and within teams in an efficient manner, and can thus be conducive to addressing unsafe conditions. In this respect, it is important and necessary to develop methodological technology for the evaluation of operators' teamwork or collaboration, thus enhancing operational performance in the MCR of nuclear power plants. The objectives of this research are twofold: to develop a systematic methodology for evaluation of the team performance of MCR operators in consideration of advanced MCR characteristics, and to validate that the methodology is adaptable to the advanced MCR of nuclear power plants. In order to achieve these two objectives, first, team performance factors were extracted through literature reviews and a methodological study concerning team performance theories. Second, the team performance factors were identified and

  5. Validation of post-operative atrial fibrillation in the Western Denmark Heart Registry

    DEFF Research Database (Denmark)

    Munkholm, Sarah Bach; Jakobsen, Carl-Johan; Mortensen, Poul Erik

    2015-01-01

    INTRODUCTION: Post-operative new-onset atrial fibrillation and flutter (POAF) is associated with increased morbidity and mortality following cardiac surgery. Registers and databases are important data sources for observational studies in this research area; hence, the aim was to assess the data...... of the registry. FUNDING: none. TRIAL REGISTRATION: not relevant....

  6. Self-management interventions: Proposal and validation of a new operational definition.

    Science.gov (United States)

    Jonkman, Nini H; Schuurmans, Marieke J; Jaarsma, Tiny; Shortridge-Baggett, Lillie M; Hoes, Arno W; Trappenburg, Jaap C A

    2016-12-01

    Systematic reviews on complex interventions like self-management interventions often do not explicitly state an operational definition of the intervention studied, which may impact the review's conclusions. This study aimed to propose an operational definition of self-management interventions and determine its discriminative performance compared with other operational definitions. Systematic review of definitions of self-management interventions and consensus meetings with self-management research experts and practitioners. Self-management interventions were defined as interventions that aim to equip patients with skills to actively participate and take responsibility in the management of their chronic condition in order to function optimally through at least knowledge acquisition and a combination of at least two of the following: stimulation of independent sign/symptom monitoring, medication management, enhancing problem-solving and decision-making skills for medical treatment management, and changing their physical activity, dietary, and/or smoking behavior. This definition substantially reduced the number of selected studies (255 of 750). In two preliminary expert meetings (n = 6), the proposed definition was identifiable for self-management research experts and practitioners (80% and 60% agreement, respectively). Future systematic reviews must carefully consider the operational definition of the intervention studied because the definition influences the selection of studies on which conclusions and recommendations for clinical practice are based. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Validation of standard operating procedures in a multicenter retrospective study to identify -omics biomarkers for chronic low back pain.

    Directory of Open Access Journals (Sweden)

    Concetta Dagostino

    Full Text Available Chronic low back pain (CLBP is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for huge societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine. Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for recruitment of large numbers of participants among different centres (clinical and laboratories to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1 blood collection, (2 sample processing and storage, (3 shipping details and (4 cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicenter study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large-omics-based multicenter studies.

  8. Fires and Smoke Observed from the Earth Observing System MODIS Instrument: Products, Validation, and Operational Use

    Science.gov (United States)

    Kaufman, Y. J.; Ichoku, C.; Giglio, L.; Korontzi, S.; Chu, D. A.; Hao, W. M.; Justice, C. O.; Lau, William K. M. (Technical Monitor)

    2001-01-01

    The MODIS sensor, launched on NASA's Terra satellite at the end of 1999, was designed with 36 spectral channels for a wide array of land, ocean, and atmospheric investigations. MODIS has a unique ability to observe fires, smoke, and burn scars globally. Its main fire detection channels saturate at high brightness temperatures: 500 K at 4 microns and 400 K at 11 microns, which can only be attained in rare circumstances at the 1 km fire detection spatial resolution. Thus, unlike other polar orbiting satellite sensors with similar thermal and spatial resolutions, but much lower saturation temperatures (e.g. AVHRR and ATSR), MODIS can distinguish between low intensity ground surface fires and high intensity crown forest fires. Smoke column concentration over land is for the first time being derived from the MODIS solar channels, extending from 0.41 microns to 2.1 microns. The smoke product has been provisionally validated both globally and regionally over southern Africa and Central and South America. Burn scars are observed from MODIS even in the presence of smoke, using the 1.2 to 2.1 micron channels. MODIS burned area information is used to estimate pyrogenic emissions. A wide range of these fire and related products and validation are demonstrated for the wildfires that occurred in the northwestern United States in the summer of 2000. The MODIS rapid response system and direct broadcast capability is being developed to enable users to obtain and generate data in near real time. It is expected that health and land management organizations will use these systems for monitoring the occurrence of fires and the dispersion of smoke within two to six hours after data acquisition.

  9. SMOS sea ice product: Operational application and validation in the Barents Sea marginal ice zone

    DEFF Research Database (Denmark)

    Kaleschke, Lars; Tian-Kunze, Xiangshan; Maaß, Nina

    2016-01-01

    system for ship route optimisation has been developed and was tested during this field campaign with the ice-strengthened research vessel RV Lance. The ship cruise was complemented with coordinated measurements from a helicopter and the research aircraft Polar 5. Sea ice thickness was measured using...... an electromagnetic induction (EM) system from the bow of RV Lance and another EM-system towed below the helicopter. Polar 5 was equipped among others with the L-band radiometer EMIRAD-2. The experiment yielded a comprehensive data set allowing the evaluation of the operational forecast and route optimisation system...

  10. Validation of Generic Models for Variable Speed Operation Wind Turbines Following the Recent Guidelines Issued by IEC 61400-27

    Directory of Open Access Journals (Sweden)

    Andrés Honrubia-Escribano

    2016-12-01

    Full Text Available Considerable efforts are currently being made by several international working groups focused on the development of generic, also known as simplified or standard, wind turbine models for power system stability studies. In this sense, the first edition of International Electrotechnical Commission (IEC) 61400-27-1, which defines generic dynamic simulation models for wind turbines, was published in February 2015. Nevertheless, the correlations of the IEC generic models with respect to specific wind turbine manufacturer models are required by the wind power industry to validate the accuracy and corresponding usability of these standard models. The present work conducts the validation of the two topologies of variable speed wind turbines that present not only the largest market share, but also the most technological advances. Specifically, the doubly-fed induction machine and the full-scale converter (FSC) topology are modeled based on the IEC 61400-27-1 guidelines. The models are simulated for a wide range of voltage dips with different characteristics and wind turbine operating conditions. The simulated response of the IEC generic model is compared to the corresponding simplified model of a wind turbine manufacturer, showing a good correlation in most cases. Validation error sources are analyzed in detail, as well. In addition, this paper reviews in detail the previous work done in this field. Results suggest that wind turbine manufacturers are able to adjust the IEC generic models to represent the behavior of their specific wind turbines for power system stability analysis.

  11. ARM Radiosondes for National Polar-Orbiting Operational Environmental Satellite System Preparatory Project Validation Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Borg, Lori [Univ. of Wisconsin, Madison, WI (United States); Tobin, David [Univ. of Wisconsin, Madison, WI (United States); Reale, Anthony [National Oceanic and Atmospheric Administration (NOAA), Washington, DC (United States); Knuteson, Robert [Univ. of Wisconsin, Madison, WI (United States); Feltz, Michelle [Univ. of Wisconsin, Madison, WI (United States); Liu, Mark [National Oceanic and Atmospheric Administration (NOAA), Washington, DC (United States); Holdridge, Donna J [Argonne National Lab. (ANL), Argonne, IL (United States); Mather, James [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-06-01

    This IOP has been a coordinated effort involving the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility, the University of Wisconsin (UW)-Madison, and the JPSS project to validate SNPP NOAA Unique Combined Atmospheric Processing System (NUCAPS) temperature and moisture sounding products from the Cross-track Infrared Sounder (CrIS) and the Advanced Technology Microwave Sounder (ATMS). In this arrangement, funding for radiosondes was provided by the JPSS project to ARM. These radiosondes were launched coincident with the SNPP satellite overpasses (OP) at four of the ARM field sites beginning in July 2012 and running through September 2017. Combined with other ARM data, an assessment of the radiosonde data quality was performed and post-processing corrections applied, producing an ARM site Best Estimate (BE) product. The SNPP targeted radiosondes were integrated into the NOAA Products Validation System (NPROVS+), which collocated the radiosondes with satellite products (NOAA, National Aeronautics and Space Administration [NASA], European Organisation for the Exploitation of Meteorological Satellites [EUMETSAT], Geostationary Operational Environmental Satellite [GOES], Constellation Observing System for Meteorology, Ionosphere, and Climate [COSMIC]) and Numerical Weather Prediction (NWP) forecasts for use in product assessment and algorithm development. This work was a fundamental, integral, and cost-effective part of the SNPP validation effort and provided critical accuracy assessments of the SNPP temperature and water vapor soundings.

  12. The ARPAL operational high resolution Poor Man's Ensemble, description and validation

    Science.gov (United States)

    Corazza, Matteo; Sacchetti, Davide; Antonelli, Marta; Drofa, Oxana

    2018-05-01

    The Meteo Hydrological Functional Center for Civil Protection of the Environmental Protection Agency of the Liguria Region is responsible for issuing forecasts primarily aimed at Civil Protection needs. Several deterministic high resolution models, run every 6 or 12 h, are regularly used in the Center to prepare weather forecasts at short to medium range. The Region is frequently affected by severe flash floods over its very small basins, characterized by a steep orography close to the sea. These conditions led the Center in the past years to pay particular attention to the use and development of high resolution model chains for explicit simulation of convective phenomena. For years, the availability of several models has been used by the forecasters for subjective analyses of the potential evolution of the atmosphere and of its uncertainty. More recently, an Interactive Poor Man's Ensemble has been developed, aimed at providing statistical ensemble variables to support the forecasters' evaluations. In this paper the structure of this system is described and results are validated using the dense regional ground observational network.
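
    As an illustration of the ensemble statistics such a system might expose to forecasters, the following sketch computes mean, spread and an exceedance probability from a handful of deterministic forecasts on a common grid; the numbers and the 20 mm threshold are invented, not taken from the ARPAL system:

```python
# Hedged sketch of Poor Man's Ensemble statistics from several deterministic
# precipitation forecasts. All values are illustrative.
import numpy as np

# Hypothetical 6-h precipitation forecasts [mm] from 4 models on a 3x3 grid
forecasts = np.array([
    [[2., 5., 12.], [0., 8., 20.], [1., 3., 15.]],
    [[1., 7., 18.], [0., 10., 35.], [2., 4., 22.]],
    [[3., 4., 10.], [1., 6., 25.], [0., 5., 18.]],
    [[2., 6., 14.], [0., 9., 30.], [1., 4., 20.]],
])

ens_mean = forecasts.mean(axis=0)               # ensemble mean field
ens_spread = forecasts.std(axis=0)              # ensemble spread (model disagreement)
prob_exceed_20mm = (forecasts >= 20.0).mean(axis=0)  # fraction of models above threshold

print("ensemble mean:\n", ens_mean)
print("ensemble spread:\n", ens_spread)
print("probability of >= 20 mm / 6 h:\n", prob_exceed_20mm)
```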

  13. Calculation and validation of heat transfer coefficient for warm forming operations

    Science.gov (United States)

    Omer, Kaab; Butcher, Clifford; Worswick, Michael

    2017-10-01

    In an effort to reduce the weight of their products, the automotive industry is exploring various hot forming and warm forming technologies. One critical aspect of these technologies is understanding and quantifying the heat transfer between the blank and the tooling. The purpose of the current study is twofold. First, an experimental procedure to obtain the heat transfer coefficient (HTC) as a function of pressure for the purposes of a metal forming simulation is devised. The experimental approach was then used in conjunction with finite element models to obtain HTC values as a function of die pressure. The materials that were characterized were AA5182-O and AA7075-T6. Both the heating operation and the warm forming deep draw were modelled using the LS-DYNA commercial finite element code. Temperature-time measurements were obtained from both applications. The results of the finite element model showed that the experimentally derived HTC values were able to predict the temperature-time history to within 2% of the measured response. It is intended that the HTC values presented herein can be used in warm forming models in order to accurately capture the heat transfer characteristics of the operation.
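
    For orientation only, the sketch below estimates an HTC from a blank cooling curve under a lumped-capacitance assumption; this is a deliberately simplified stand-in for the inverse finite element procedure described above, and all material values and temperatures are assumed:

```python
# Hedged sketch: HTC from a cooling curve of a thin aluminium blank in contact with
# cooler tooling, assuming lumped capacitance. Not the study's FE-based procedure.
import numpy as np

rho, cp, thickness = 2700.0, 900.0, 0.002     # assumed sheet properties: kg/m3, J/(kg K), m
T_die = 25.0                                   # assumed tooling temperature [C]

# Hypothetical measured blank temperatures [C] at 0.1 s intervals
t = np.arange(0, 2.0, 0.1)
T_blank = T_die + (250.0 - T_die) * np.exp(-1.5 * t)

# For one-sided contact, lumped capacitance gives T - T_die ~ exp(-h t / (rho cp L)),
# so h follows from the slope of log(T - T_die) against time.
slope, _ = np.polyfit(t, np.log(T_blank - T_die), 1)
h = -slope * rho * cp * thickness
print(f"estimated HTC: {h:.0f} W/(m^2 K)")
```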

  14. Validation of mixing heights derived from the operational NWP models at the German weather service

    Energy Technology Data Exchange (ETDEWEB)

    Fay, B.; Schrodin, R.; Jacobsen, I. [Deutscher Wetterdienst, Offenbach (Germany); Engelbart, D. [Deutscher Wetterdienst, Meteorol. Observ. Lindenberg (Germany)

    1997-10-01

    NWP models incorporate an ever-increasing number of observations via four-dimensional data assimilation and are capable of providing comprehensive information about the atmosphere both in space and time. They describe not only near-surface parameters but also the vertical structure of the atmosphere. They operate daily, are well verified and successfully used as meteorological pre-processors in large-scale dispersion modelling. Applications like ozone forecasts, emission or power plant control calculations require highly resolved, reliable, and routine values of the temporal evolution of the mixing height (MH), which is a critical parameter in determining the mixing and transformation of substances and the resulting pollution levels near the ground. The purpose of development at the German Weather Service is a straightforward mixing height scheme that uses only parameters derived from NWP model variables and thus automatically provides spatial and temporal fields of mixing heights on an operational basis. A universal parameter to describe stability is the Richardson number Ri. Compared to the usual diagnostic or rate equations, the Ri number concept of determining mixing heights has the advantage of using not only surface layer parameters but also taking into account the vertical structure of the boundary layer resolved in the NWP models. (au)
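
    A minimal sketch of the bulk Richardson number idea: scan upward through an NWP-like profile and report the lowest level where the bulk Ri exceeds a critical value. The critical value of 0.25 and the profile are assumptions for illustration, not the operational DWD scheme:

```python
# Hedged sketch of a bulk-Richardson-number mixing height estimate.
import numpy as np

g = 9.81  # m s^-2

def bulk_richardson_mixing_height(z, theta_v, u, v, ri_crit=0.25):
    """Return the lowest height where the bulk Richardson number exceeds ri_crit."""
    for k in range(1, len(z)):
        shear2 = (u[k] - u[0])**2 + (v[k] - v[0])**2
        shear2 = max(shear2, 1e-6)  # avoid division by zero in calm conditions
        ri = g * (theta_v[k] - theta_v[0]) * (z[k] - z[0]) / (theta_v[0] * shear2)
        if ri > ri_crit:
            return z[k]
    return z[-1]  # boundary layer deeper than the available profile

# Hypothetical model-level profile (heights, virtual potential temperature, wind)
z = np.array([10., 100., 300., 600., 1000., 1500.])
theta_v = np.array([293.0, 293.1, 293.2, 293.4, 295.5, 298.0])
u = np.array([2., 4., 6., 7., 8., 9.])
v = np.zeros_like(u)
print("Mixing height ~", bulk_richardson_mixing_height(z, theta_v, u, v), "m")
```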

  15. Development, initial reliability and validity testing of an observational tool for assessing technical skills of operating room nurses.

    Science.gov (United States)

    Sevdalis, Nick; Undre, Shabnam; Henry, Janet; Sydney, Elaine; Koutantji, Mary; Darzi, Ara; Vincent, Charles A

    2009-09-01

    The recent emergence of the Systems Approach to the safety and quality of surgical care has triggered individual and team skills training modules for surgeons and anaesthetists and relevant observational assessment tools have been developed. To develop an observational tool that captures operating room (OR) nurses' technical skill and can be used for assessment and training. The Imperial College Assessment of Technical Skills for Nurses (ICATS-N) assesses (i) gowning and gloving, (ii) setting up instrumentation, (iii) draping, and (iv) maintaining sterility. Three to five observable behaviours have been identified for each skill and are rated on 1-6 scales. Feasibility and aspects of reliability and validity were assessed in 20 simulation-based crisis management training modules for trainee nurses and doctors, carried out in a Simulated Operating Room. The tool was feasible to use in the context of simulation-based training. Satisfactory reliability (Cronbach alpha) was obtained across trainers' and trainees' scores (analysed jointly and separately). Moreover, trainer nurse's ratings of the four skills correlated positively, thus indicating adequate content validity. Trainer's and trainees' ratings did not correlate. Assessment of OR nurses' technical skill is becoming a training priority. The present evidence suggests that the ICATS-N could be considered for use as an assessment/training tool for junior OR nurses.

  16. Validity and reliability of global operative assessment of laparoscopic skills (GOALS) in novice trainees performing a laparoscopic cholecystectomy.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Hoff, Christiaan; Lamme, Bas; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2015-01-01

    Global Operative Assessment of Laparoscopic Skills (GOALS) assessment has been designed to evaluate skills in laparoscopic surgery. A longitudinal blinded study of randomized video fragments was conducted to estimate the validity and reliability of GOALS in novice trainees. In total, 10 trainees each performed 6 consecutive laparoscopic cholecystectomies. Sixty procedures were recorded on video. Video fragments of (1) opening of the peritoneum; (2) dissection of Calot's triangle and achievement of critical view of safety; and (3) dissection of the gallbladder from the liver bed were blinded, randomized, and rated by 2 consultant surgeons using GOALS. Also, a grade was given for overall competence. The correlation of GOALS with live observation Objective Structured Assessment of Technical Skills (OSATS) scores was calculated. Construct validity was estimated using the Friedman 2-way analysis of variance by ranks and the Wilcoxon signed-rank test. The interrater reliability was calculated using the absolute and consistency agreement 2-way random-effects model intraclass correlation coefficient. A high correlation was found between mean GOALS score (r = 0.879, p = 0.021) and mean OSATS score. The GOALS score increased significantly across the 6 procedures (p = 0.002). The trainees performed significantly better on their sixth when compared with their first cholecystectomy (p = 0.004). The consistency agreement interrater reliability was 0.37 for the mean GOALS score (p = 0.002) and 0.55 for overall competence (p < 0.001) of the 3 video fragments. The validity observed in this randomized blinded longitudinal study supports the existing evidence that GOALS is a valid tool for assessment of novice trainees. A relatively low reliability was found in this study. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
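
    Two of the statistics mentioned above (correlation of mean GOALS with mean OSATS scores, and a comparison of first versus sixth procedure with a Wilcoxon signed-rank test) could be computed as in the sketch below; all ratings are invented for illustration:

```python
# Hedged sketch on invented ratings; not the study's data.
import numpy as np
from scipy.stats import pearsonr, wilcoxon

# Hypothetical mean scores per trainee
goals_mean = np.array([12.0, 14.5, 13.0, 16.5, 18.0, 15.5, 17.0, 19.5, 20.0, 18.5])
osats_mean = np.array([15.0, 17.0, 16.0, 19.0, 21.0, 18.0, 20.0, 22.5, 23.0, 21.0])
r, p = pearsonr(goals_mean, osats_mean)
print(f"GOALS vs OSATS: r = {r:.2f}, p = {p:.3f}")

# Hypothetical GOALS scores on the first and sixth cholecystectomy per trainee
goals_first = np.array([10.2, 11.5, 9.1, 12.3, 10.8, 11.0, 9.7, 12.4, 10.1, 11.3])
goals_sixth = np.array([15.1, 16.1, 14.2, 17.5, 15.2, 16.8, 13.9, 18.3, 15.6, 16.0])
stat, p = wilcoxon(goals_first, goals_sixth)
print(f"first vs sixth procedure: W = {stat:.1f}, p = {p:.4f}")
```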

  17. Validating cognitive support for operators of complex human-machine systems

    International Nuclear Information System (INIS)

    O'Hara, J.; Wachtel, J.

    1995-01-01

    Modern nuclear power plants (NPPs) are complex systems whose performance is the result of an intricate interaction of human and system control. A complex system may be defined as one which supports a dynamic process involving a large number of elements that interact in many different ways. Safety is addressed through defense-in-depth design and preplanning; i.e., designers consider the types of failures that are most likely to occur and those of high consequence, and design their solutions in advance. However, complex interactions and their failure modes cannot always be anticipated by the designer and may be unfamiliar to plant personnel. These situations may pose cognitive demands on plant personnel, both individually and as a crew. Other factors may contribute to the cognitive challenges of NPP operation as well, including hierarchical processes, dynamic pace, system redundancy and reliability, and conflicting objectives. These factors are discussed in this paper.

  18. Operation and validation of the decision aid system 'CAIRE' in complex terrain

    International Nuclear Information System (INIS)

    De Witt, H.; Nuesser, A.; Brenk, H.D.

    1998-01-01

    In cases of nuclear emergencies it is the primary task of emergency response forces and decision-making authorities to act properly. Based on telemetric surveillance networks, an advanced automatic on-line decision support system named CAIRE (Computer Aided Response to Emergencies) has been developed and is now in operation at 4 sites as a real-time emergency response tool. This tool is designed to provide decision makers with precise radiation exposure data for the population at risk. Depending on the individual layout of the connected telemetric networks, CAIRE is able to satisfy the following main requirements: automatic identification of the source location and of the source term, automatic diagnosis of the actual radiological situation and identification of the endangered area, projection of the radiological situation, and delivery of all this information in the form of computer graphics. (R.P.)

  19. Non invasive transcostal focusing based on the decomposition of the time reversal operator: in vitro validation

    Science.gov (United States)

    Cochard, Étienne; Prada, Claire; Aubry, Jean-François; Fink, Mathias

    2010-03-01

    Thermal ablation induced by high intensity focused ultrasound has produced promising clinical results to treat hepatocarcinoma and other liver tumors. However skin burns have been reported due to the high absorption of ultrasonic energy by the ribs. This study proposes a method to produce an acoustic field focusing on a chosen target while sparing the ribs, using the decomposition of the time-reversal operator (DORT method). The idea is to apply an excitation weight vector to the transducers array which is orthogonal to the subspace of emissions focusing on the ribs. The ratio of the energies absorbed at the focal point and on the ribs has been enhanced up to 100-fold as demonstrated by the measured specific absorption rates.
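
    The projection idea behind this rib-sparing approach can be sketched as follows: take the singular value decomposition of the array response matrix and remove from the nominal focusing law its component in the subspace spanned by the dominant singular vectors (assumed here to correspond to the ribs). The matrix below is random placeholder data, not a measured time-reversal operator:

```python
# Hedged numerical sketch of subspace projection for rib sparing.
import numpy as np

rng = np.random.default_rng(0)
n_elements = 64
# Placeholder array response matrix (stands in for the measured time-reversal operator)
K = rng.normal(size=(n_elements, n_elements)) + 1j * rng.normal(size=(n_elements, n_elements))

U, s, Vh = np.linalg.svd(K)
n_rib_modes = 4                                 # assumed number of strong rib-related modes
rib_subspace = Vh[:n_rib_modes].conj().T        # orthonormal columns spanning the rib subspace

e_focus = np.exp(1j * rng.uniform(0, 2 * np.pi, n_elements))   # nominal focusing law
projection = rib_subspace @ (rib_subspace.conj().T @ e_focus)
e_sparing = e_focus - projection                # excitation orthogonal to the rib subspace

print("residual energy on rib modes:",
      np.linalg.norm(rib_subspace.conj().T @ e_sparing))
```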

  20. Validating and extending the three process model of alertness in airline operations.

    Directory of Open Access Journals (Sweden)

    Michael Ingre

    Full Text Available Sleepiness and fatigue are important risk factors in the transport sector and bio-mathematical sleepiness, sleep and fatigue modeling is increasingly becoming a valuable tool for assessing safety of work schedules and rosters in Fatigue Risk Management Systems (FRMS). The present study sought to validate the inner workings of one such model, Three Process Model (TPM), on aircrews and extend the model with functions to model jetlag and to directly assess the risk of any sleepiness level in any shift schedule or roster with and without knowledge of sleep timings. We collected sleep and sleepiness data from 136 aircrews in a real life situation by means of an application running on a handheld touch screen computer device (iPhone, iPod or iPad) and used the TPM to predict sleepiness with varying level of complexity of model equations and data. The results based on multilevel linear and non-linear mixed effects models showed that the TPM predictions correlated with observed ratings of sleepiness, but explorative analyses suggest that the default model can be improved and reduced to include only two-processes (S+C), with adjusted phases of the circadian process based on a single question of circadian type. We also extended the model with a function to model jetlag acclimatization and with estimates of individual differences including reference limits accounting for 50%, 75% and 90% of the population as well as functions for predicting the probability of any level of sleepiness for ecological assessment of absolute and relative risk of sleepiness in shift systems for safety applications.

  1. Validating and extending the three process model of alertness in airline operations.

    Science.gov (United States)

    Ingre, Michael; Van Leeuwen, Wessel; Klemets, Tomas; Ullvetter, Christer; Hough, Stephen; Kecklund, Göran; Karlsson, David; Åkerstedt, Torbjörn

    2014-01-01

    Sleepiness and fatigue are important risk factors in the transport sector and bio-mathematical sleepiness, sleep and fatigue modeling is increasingly becoming a valuable tool for assessing safety of work schedules and rosters in Fatigue Risk Management Systems (FRMS). The present study sought to validate the inner workings of one such model, Three Process Model (TPM), on aircrews and extend the model with functions to model jetlag and to directly assess the risk of any sleepiness level in any shift schedule or roster with and without knowledge of sleep timings. We collected sleep and sleepiness data from 136 aircrews in a real life situation by means of an application running on a handheld touch screen computer device (iPhone, iPod or iPad) and used the TPM to predict sleepiness with varying level of complexity of model equations and data. The results based on multilevel linear and non-linear mixed effects models showed that the TPM predictions correlated with observed ratings of sleepiness, but explorative analyses suggest that the default model can be improved and reduced to include only two-processes (S+C), with adjusted phases of the circadian process based on a single question of circadian type. We also extended the model with a function to model jetlag acclimatization and with estimates of individual differences including reference limits accounting for 50%, 75% and 90% of the population as well as functions for predicting the probability of any level of sleepiness for ecological assessment of absolute and relative risk of sleepiness in shift systems for safety applications.
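
    A generic two-process (S + C) alertness prediction of the kind discussed above can be sketched as below; the parameter values are placeholders chosen for illustration and are not the published Three Process Model coefficients:

```python
# Hedged two-process (S + C) alertness sketch with placeholder parameters.
import numpy as np

def homeostatic_S(hours_awake, upper=14.0, lower=2.4, decay=0.085):
    """Exponential decline of alertness with time awake (process S while awake)."""
    return lower + (upper - lower) * np.exp(-decay * hours_awake)

def circadian_C(clock_hour, amplitude=2.5, acrophase=16.8):
    """Sinusoidal circadian component peaking in the late afternoon."""
    return amplitude * np.cos(2 * np.pi * (clock_hour - acrophase) / 24.0)

def predicted_alertness(hours_awake, clock_hour):
    return homeostatic_S(hours_awake) + circadian_C(clock_hour)

# Example: a crew member awake since 06:00, evaluated at 04:00 the following night
print(predicted_alertness(hours_awake=22, clock_hour=4))
```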

  2. Validation Tests of Fiber Optic Strain-Based Operational Shape and Load Measurements

    Science.gov (United States)

    Bakalyar, John A.; Jutte, Christine

    2012-01-01

    Aircraft design has been progressing toward reduced structural weight to improve fuel efficiency, increase performance, and reduce cost. Lightweight aircraft structures are more flexible than conventional designs and require new design considerations. Intelligent sensing allows for enhanced control and monitoring of aircraft, which enables increased structurally efficiency. The NASA Dryden Flight Research Center (DFRC) has developed an instrumentation system and analysis techniques that combine to make distributed structural measurements practical for lightweight vehicles. Dryden's Fiber Optic Strain Sensing (FOSS) technology enables a multitude of lightweight, distributed surface strain measurements. The analysis techniques, referred to as the Displacement Transfer Functions (DTF) and Load Transfer Functions (LTF), use surface strain values to calculate structural deflections and operational loads. The combined system is useful for real-time monitoring of aeroelastic structures, along with many other applications. This paper describes how the capabilities of the measurement system were demonstrated using subscale test articles that represent simple aircraft structures. Empirical FOSS strain data were used within the DTF to calculate the displacement of the article and within the LTF to calculate bending moments due to loads acting on the article. The results of the tests, accuracy of the measurements, and a sensitivity analysis are presented.

  3. Validation of activity determination codes and nuclide vectors by using results from processing of retired components and operational waste

    International Nuclear Information System (INIS)

    Lundgren, Klas; Larsson, Arne

    2012-01-01

    Decommissioning studies for nuclear power reactors are performed in order to assess the decommissioning costs and the waste volumes as well as to provide data for the licensing and construction of the LILW repositories. An important part of this work is to estimate the amount of radioactivity in the different types of decommissioning waste. Studsvik ALARA Engineering has performed such assessments for LWRs and other nuclear facilities in Sweden. These assessments depend to a large extent on calculations, senior experience and sampling at the facilities. The precision of the calculations has been found to be relatively high close to the reactor core. For natural reasons, the precision declines with distance from the core. Even where activity values are lower, the content of hard-to-measure nuclides can cause problems in the long-term safety demonstration of LLW repositories. At the same time Studsvik is processing significant volumes of metallic and combustible waste from power stations in operation and in the decommissioning phase, as well as from other nuclear facilities such as research and waste treatment facilities. By combining the unique knowledge in assessment of radioactivity inventory with the large data bank that the waste processing represents, the activity determination codes can be validated and the waste processing analysis supported with additional data. The intention of this presentation is to highlight how the European nuclear industry could jointly use the waste processing data for validation of activity determination codes. (authors)

  4. Development of an expert system for success path generation and operator's action guides in NPP: Verification and validation of COSMOS

    International Nuclear Information System (INIS)

    Yang, Jun Un; Jung, Kwang Sup; Park, Chang Gyu

    1992-08-01

    For the support of emergency operation, an expert system named COSMOS (COmputerized Success-path MOnitoring System) is being developed at the Korea Atomic Energy Research Institute (KAERI). COSMOS identifies the critical safety function (CSF) status and suggests the overall response strategy with a set of success paths which restore the challenged CSFs. The status of a CSF is identified by rule-based reasoning. The overall response strategy is inferred according to the identified CSF status. The success paths are generated from the given structure descriptions of systems and the general generation algorithm. For an efficient man-machine interface, a color graphic display is utilized. COSMOS is being built on a workstation. The major tasks in building an expert system such as COSMOS are the construction of the knowledge base and the inference engine. In COSMOS, the knowledge is derived from the Emergency Operating Procedures (EOPs), and forward chaining is adopted as the inference strategy. While the knowledge base and inference engine are the most common and essential elements of an expert system, they are not the only ones. The evaluation of expert systems can not only lessen the risk of using faulty software, but also enhance the acceptability of the expert systems by both users and regulators. The evaluation of expert systems consists of system verification, validation and user acceptance testing. Among them, in this report, we have focused our attention on verification and validation (V&V) of expert systems. We have assessed the general V&V procedures and tried to develop a specific V&V procedure for COSMOS. (Author)

  5. Self-validated calculation of characteristics of a Francis turbine and the mechanism of the S-shape operational instability

    International Nuclear Information System (INIS)

    Zhang, Z; Titzschkau, M

    2012-01-01

    A calculation method has been presented to accurately estimate the characteristics of a Francis turbine. Both the shock loss at the impeller inlet and the swirling flow loss at the impeller exit have been confirmed to dominantly influence the turbine characteristics and particularly the hydraulic efficiency. Together they govern the through-flow of water through the impeller at rest. Calculations have been performed for the flow rate, the shaft torque and the hydraulic efficiency and compared with the available measurements on a model turbine. Excellent agreement has been achieved. Some other interesting properties of the turbine characteristics could also be derived from the calculations and verified by experiments. For this reason, and because no unreliable assumptions are used, the calculation method has been confirmed to be self-validated. The operational instability in the upper range of the rotational speed, known as the S-shape instability, is ascribed to the total flow separation and stagnation at the impeller inlet. In that range of the rotational speed, the operation of the Francis turbine oscillates between pump and turbine mode.

  6. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, The National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', industry-defined as the valve opening at greater than or equal to 1.5 times the cold set pressure. Actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging sufficiently well to estimate risk when basing proof test intervals on proof test results?
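
    The Monte Carlo idea can be illustrated with the sketch below, which estimates the probability that lift pressure has drifted to at least 1.5 times the set pressure by the end of a proof-test interval; the drift rate and scatter are invented assumptions, not Savannah River Site data:

```python
# Hedged Monte Carlo sketch of a 'stuck shut' probability over an inspection interval.
import numpy as np

rng = np.random.default_rng(0)

def prob_stuck_shut(interval_years, n=100_000, drift_per_year=0.02, sigma=0.08):
    """Fraction of simulated valves whose lift/set pressure ratio reaches 1.5."""
    drift = rng.normal(drift_per_year * interval_years, sigma, size=n)
    lift_ratio = 1.0 + drift          # lift pressure / cold set pressure at test time
    return np.mean(lift_ratio >= 1.5)

for years in (2, 5, 10):
    print(years, "yr interval ->", prob_stuck_shut(years))
```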

  7. Validation of MIPAS-ENVISAT H2O operational data collected between July 2002 and March 2004

    Directory of Open Access Journals (Sweden)

    G. Wetzel

    2013-06-01

    Full Text Available Water vapour (H2O) is one of the operationally retrieved key species of the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) instrument aboard the Environmental Satellite (ENVISAT), which was launched into its sun-synchronous orbit on 1 March 2002 and operated until April 2012. Within the MIPAS validation activities, independent observations from balloons, aircraft, satellites, and ground-based stations have been compared to European Space Agency (ESA) version 4.61 operational H2O data comprising the time period from July 2002 until March 2004 where MIPAS measured with full spectral resolution. No significant bias in the MIPAS H2O data is seen in the lower stratosphere (above the hygropause) between about 15 and 30 km. Differences of H2O quantities observed by MIPAS and the validation instruments are mostly well within the combined total errors in this altitude region. In the upper stratosphere (above about 30 km), a tendency towards a small positive bias (up to about 10%) is present in the MIPAS data when compared to its balloon-borne counterpart MIPAS-B, to the satellite instruments HALOE (Halogen Occultation Experiment) and ACE-FTS (Atmospheric Chemistry Experiment, Fourier Transform Spectrometer), and to the millimeter-wave airborne sensor AMSOS (Airborne Microwave Stratospheric Observing System). In the mesosphere the situation is unclear due to the occurrence of different biases when comparing HALOE and ACE-FTS data. Pronounced deviations between MIPAS and the correlative instruments occur in the lowermost stratosphere and upper troposphere, a region where retrievals of H2O are most challenging. Altogether it can be concluded that MIPAS H2O profiles yield valuable information on the vertical distribution of H2O in the stratosphere with an overall accuracy of about 10 to 30% and a precision of typically 5 to 15% – well within the predicted error budget, showing that these global and continuous data are very valuable for scientific

  8. Validation of MCNP and ORIGEN-S 3-D computational model for reactivity predictions during BR2 operation

    International Nuclear Information System (INIS)

    Kalcheva, S.; Koonen, E.; Ponsard, B.

    2005-01-01

    The Belgian Material Test Reactor (MTR) BR2 is a strongly heterogeneous high-flux engineering test reactor at SCK-CEN (Centre d'Etude de l'energie Nucleaire) in Mol, operated at a thermal power of 60 to 100 MW. It deploys highly enriched uranium, water-cooled concentric-plate fuel elements, positioned inside a beryllium reflector with a complex hyperboloid arrangement of test holes. The objective of this paper is the validation of a MCNP and ORIGEN-S 3D model for reactivity predictions of the entire BR2 core during reactor operation. We employ the Monte Carlo code MCNP-4C for evaluating the effective multiplication factor k_eff and the 3D space-dependent specific power distribution. The 1D code ORIGEN-S is used for calculation of isotopic fuel depletion versus burnup and preparation of a database (DB) with depleted fuel compositions. The approach taken is to evaluate the 3D power distribution at each time step and, along with the DB, to evaluate the 3D isotopic fuel depletion at the next step and to deduce the corresponding shim rod positions during reactor operation. The capabilities of both codes are fully exploited without constraints on the number of involved isotope depletion chains or increase of the computational time. The reactor has a complex operation, with significant shutdowns between cycles, and its reactivity is strongly influenced by poisons, mainly 3He and 6Li from the beryllium reflector, and by the burnable absorbers 149Sm and 10B in the fresh UAlx fuel. Our computational predictions for the shim rod positions at various restarts are within 0.5$ (β_eff = 0.0072). (author)

  9. NASA's Rodent Research Project: Validation of Flight Hardware, Operations and Science Capabilities for Conducting Long Duration Experiments in Space

    Science.gov (United States)

    Choi, S. Y.; Beegle, J. E.; Wigley, C. L.; Pletcher, D.; Globus, R. K.

    2015-01-01

    Research using rodents is an essential tool for advancing biomedical research on Earth and in space. Rodent Research (RR)-1 was conducted to validate flight hardware, operations, and science capabilities that were developed at the NASA Ames Research Center. Twenty C57BL/6J adult female mice were launched on Sept 21, 2014 in a Dragon Capsule (SpaceX-4), then transferred to the ISS for a total time of 21-22 days (10 commercial mice) or 37 days (10 validation mice). Tissues collected on-orbit were either rapidly frozen or preserved in RNAlater at less than or equal to -80 C (n=2/group) until their return to Earth. Remaining carcasses were rapidly frozen for dissection post-flight. The three control groups at Kennedy Space Center consisted of: Basal mice, euthanized at the time of launch; Vivarium controls, housed in standard cages; and Ground Controls (GC), housed in flight hardware within an environmental chamber. FLT mice appeared more physically active on-orbit than GC, and behavior analyses are in progress. Upon return to Earth, there were no differences in body weights between FLT and GC at the end of the 37 days in space. RNA was of high quality (RIN greater than 8.5). Liver enzyme activity levels of FLT mice and all control mice were similar in magnitude to those of the samples that were optimally processed in the laboratory. Liver samples collected from the intact frozen FLT carcasses had RNA RIN of 7.27 +/- 0.52, which was lower than that of the samples processed on-orbit, but similar to those obtained from the control group intact carcasses. Nonetheless, the RNA samples from the intact carcasses were acceptable for the most demanding transcriptomic analyses. Adrenal glands, thymus and spleen (organs associated with stress response) showed no significant difference in weights between FLT and GC. Enzymatic activity was also not significantly different. Over 3,000 tissues collected from the four groups of mice have become available for the Biospecimen Sharing

  10. Optimization and validation of accelerated golden-angle radial sparse MRI reconstruction with self-calibrating GRAPPA operator gridding.

    Science.gov (United States)

    Benkert, Thomas; Tian, Ye; Huang, Chenchan; DiBella, Edward V R; Chandarana, Hersh; Feng, Li

    2018-07-01

    Golden-angle radial sparse parallel (GRASP) MRI reconstruction requires gridding and regridding to transform data between radial and Cartesian k-space. These operations are repeatedly performed in each iteration, which makes the reconstruction computationally demanding. This work aimed to accelerate GRASP reconstruction using self-calibrating GRAPPA operator gridding (GROG) and to validate its performance in clinical imaging. GROG is an alternative gridding approach based on parallel imaging, in which k-space data acquired on a non-Cartesian grid are shifted onto a Cartesian k-space grid using information from multicoil arrays. For iterative non-Cartesian image reconstruction, GROG is performed only once as a preprocessing step. Therefore, the subsequent iterative reconstruction can be performed directly in Cartesian space, which significantly reduces computational burden. Here, a framework combining GROG with GRASP (GROG-GRASP) is first optimized and then compared with standard GRASP reconstruction in 22 prostate patients. GROG-GRASP achieved approximately 4.2-fold reduction in reconstruction time compared with GRASP (∼333 min versus ∼78 min) while maintaining image quality (structural similarity index ≈ 0.97 and root mean square error ≈ 0.007). Visual image quality assessment by two experienced radiologists did not show significant differences between the two reconstruction schemes. With a graphics processing unit implementation, image reconstruction time can be further reduced to approximately 14 min. The GRASP reconstruction can be substantially accelerated using GROG. This framework is promising toward broader clinical application of GRASP and other iterative non-Cartesian reconstruction methods. Magn Reson Med 80:286-293, 2018. © 2017 International Society for Magnetic Resonance in Medicine. © 2017 International Society for Magnetic Resonance in Medicine.

  11. Measurement error correction in the least absolute shrinkage and selection operator model when validation data are available.

    Science.gov (United States)

    Vasquez, Monica M; Hu, Chengcheng; Roe, Denise J; Halonen, Marilyn; Guerra, Stefano

    2017-01-01

    Measurement of serum biomarkers by multiplex assays may be more variable as compared to single biomarker assays. Measurement error in these data may bias parameter estimates in regression analysis, which could mask true associations of serum biomarkers with an outcome. The Least Absolute Shrinkage and Selection Operator (LASSO) can be used for variable selection in these high-dimensional data. Furthermore, when the distribution of measurement error is assumed to be known or estimated with replication data, a simple measurement error correction method can be applied to the LASSO method. However, in practice the distribution of the measurement error is unknown and is expensive to estimate through replication both in monetary cost and need for greater amount of sample which is often limited in quantity. We adapt an existing bias correction approach by estimating the measurement error using validation data in which a subset of serum biomarkers are re-measured on a random subset of the study sample. We evaluate this method using simulated data and data from the Tucson Epidemiological Study of Airway Obstructive Disease (TESAOD). We show that the bias in parameter estimation is reduced and variable selection is improved.
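
    A rough sketch of the general idea (error-prone predictors, a validation subset with re-measured values, and a LASSO fit) is given below using simple regression calibration; this illustrates the concept only and is not the bias-correction estimator adapted in the study:

```python
# Hedged sketch: naive LASSO on noisy biomarkers versus LASSO after regression
# calibration using a validation subset. Data are fully synthetic.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
n, p = 500, 20
X_true = rng.normal(size=(n, p))                       # true biomarker levels
y = X_true[:, 0] * 2.0 + rng.normal(size=n)            # outcome depends on biomarker 0
W = X_true + rng.normal(scale=0.5, size=(n, p))        # error-prone multiplex measurements

# Validation subset: biomarkers re-measured with a low-error reference assay
val = rng.choice(n, size=100, replace=False)
X_val_ref = X_true[val] + rng.normal(scale=0.1, size=(100, p))

# Regression calibration: predict the reference measurement from the noisy one,
# then replace W by its calibrated expectation before fitting the LASSO.
calib = LinearRegression().fit(W[val], X_val_ref)
X_calibrated = calib.predict(W)

naive = Lasso(alpha=0.05).fit(W, y)
corrected = Lasso(alpha=0.05).fit(X_calibrated, y)
print("naive coef[0]:", naive.coef_[0], "calibrated coef[0]:", corrected.coef_[0])
```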

  12. Operational calibration and validation of landsat data continuity mission (LDCM) sensors using the image assessment system (IAS)

    Science.gov (United States)

    Micijevic, Esad; Morfitt, Ron

    2010-01-01

    Systematic characterization and calibration of the Landsat sensors and the assessment of image data quality are performed using the Image Assessment System (IAS). The IAS was first introduced as an element of the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) ground segment and recently extended to Landsat 4 (L4) and 5 (L5) Thematic Mappers (TM) and Multispectral Sensors (MSS) on-board the Landsat 1-5 satellites. In preparation for the Landsat Data Continuity Mission (LDCM), the IAS was developed for the Earth Observer 1 (EO-1) Advanced Land Imager (ALI) with a capability to assess pushbroom sensors. This paper describes the LDCM version of the IAS and how it relates to unique calibration and validation attributes of its on-board imaging sensors. The LDCM IAS system will have to handle a significantly larger number of detectors and the associated database than the previous IAS versions. An additional challenge is that the LDCM IAS must handle data from two sensors, as the LDCM products will combine the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) spectral bands.

  13. An integrated approach to develop, validate and operate thermo-physiological human simulator for the development of protective clothing.

    Science.gov (United States)

    Psikuta, Agnes; Koelblen, Barbara; Mert, Emel; Fontana, Piero; Annaheim, Simon

    2017-12-07

    Following the growing interest in the further development of manikins to simulate human thermal behaviour more adequately, thermo-physiological human simulators have been developed by coupling a thermal sweating manikin with a thermo-physiology model. Despite their availability and obvious advantages, the number of studies involving these devices is only marginal, which plausibly results from the high complexity of the development and evaluation process and need of multi-disciplinary expertise. The aim of this paper is to present an integrated approach to develop, validate and operate such devices including technical challenges and limitations of thermo-physiological human simulators, their application and measurement protocol, strategy for setting test scenarios, and the comparison to standard methods and human studies including details which have not been published so far. A physical manikin controlled by a human thermoregulation model overcame the limitations of mathematical clothing models and provided a complementary method to investigate thermal interactions between the human body, protective clothing, and its environment. The opportunities of these devices include not only realistic assessment of protective clothing assemblies and equipment but also potential application in many research fields ranging from biometeorology, automotive industry, environmental engineering, and urban climate to clinical and safety applications.

  14. Validation and Verification (V&V) of Safety-Critical Systems Operating Under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2012-01-01

    Loss of control (LOC) remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft LOC accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. Research is underway at the National Aeronautics and Space Administration (NASA) in the development of advanced onboard system technologies for preventing or recovering from loss of vehicle control and for assuring safe operation under off-nominal conditions associated with aircraft LOC accidents. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses highly significant technical challenges and is the subject of a parallel research effort at NASA. This chapter summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft LOC accidents. A summary of recent research accomplishments in this effort is referenced.

  15. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.; hide

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The current NAS relies on pilot's vigilance and judgement to remain Well Clear (CFR 14 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot

  16. VALIDATION OF MITRAL VALVE ANNULUS DIMENSIONS MEASURED BY 2D TRANS-THORACIC ECHOCARDIOGRAPHY WITH GOLD STANDARD DIRECT INTRA-OPERATIVE MEASUREMENT

    OpenAIRE

    Praveen; Yadav; Ankur; Saket; Kaushal

    2014-01-01

    CONTEXT: Precise estimation of Mitral valve annulus dimensions preoperatively through Echocardiography is of paramount importance in replacement/repair surgeries. However a frequent disagreement was experienced between anticipated size of prosthetic valve based on echocardiography and actual valve size. This fact encouraged the authors to validate the measurements through echocardiography with gold-standard direct intra operative measurement. AIM: To compare the mitral val...

  17. Face validity of a pre-clinical model of operant binge drinking: just a question of speed.

    Science.gov (United States)

    Jeanblanc, Jérôme; Sauton, Pierre; Jeanblanc, Virginie; Legastelois, Rémi; Echeverry-Alzate, Victor; Lebourgeois, Sophie; Gonzalez-Marin, Maria Del Carmen; Naassila, Mickaël

    2018-06-04

    Binge drinking (BD) is often defined as a large amount of alcohol consumed in a 'short' period of time or 'per occasion'. In clinical research, few researchers have included the notion of 'speed of drinking' in the definition of BD. Here, we aimed to describe a novel pre-clinical model based on voluntary operant BD, which included both the quantity of alcohol and the rapidity of consumption. In adult Long-Evans male rats, we induced BD by regularly decreasing the duration of ethanol self-administration from 1-hour to 15-minute sessions. We compared the behavioral consequences of BD with the behaviors of rats subjected to moderate drinking or heavy drinking (HD). We found that, despite high ethanol consumption levels (1.2 g/kg/15 minutes), the total amounts consumed were insufficient to differentiate HD from BD. However, consumption speed could distinguish between these groups. The motivation to consume was higher in BD than in HD rats. After BD, we observed alterations in locomotor coordination in rats that consumed more than 0.8 g/kg, which were rarely observed in HD rats. Finally, chronic BD led to worse performance in a decision-making task, and as expected, we observed a lower stimulated dopaminergic release within nucleus accumbens slices in poor decision makers. Our BD model exhibited good face validity and now provides animals that voluntarily consume alcohol rapidly enough to reach intoxication levels, thus allowing the study of the complex interaction between individual and environmental factors underlying BD behavior. © 2018 Society for the Study of Addiction.

  18. Attempted development and cross-validation of predictive models of individual-level and organizational-level turnover of nuclear power operators

    International Nuclear Information System (INIS)

    Vasa-Sideris, S.J.

    1989-01-01

    Nuclear power accounts for 20% of the electric power generated in the U.S. by 107 nuclear plants which employ over 8,700 operators. Operator turnover is significant to utilities from the economic point of view since it costs almost three hundred thousand dollars to train and qualify one operator, and because turnover affects plant operability and therefore plant safety. The study purpose was to develop and cross-validate individual-level and organizational-level models of turnover of nuclear power plant operators. Data were obtained by questionnaires and from published data for 1983 and 1984 on a number of individual, organizational, and environmental predictors. Plants had been in operation for two or more years. Questionnaires were returned by 29 out of 50 plants on over 1600 operators. The objectives were to examine the reliability of the turnover criterion, to determine the classification accuracy of the multivariate predictive models and of categories of predictors (individual, organizational, and environmental) and to determine if a homology existed between the individual-level and organizational-level models. The method was to examine the shrinkage that occurred between foldback design (in which the predictive models were reapplied to the data used to develop them) and cross-validation. Results did not support the hypothesis objectives. Turnover data were accurate but not stable between the two years. No significant differences were detected between the low and high turnover groups at the organization or individual level in cross-validation. Lack of stability in the criterion, restriction of range, and small sample size at the organizational level were serious limitations of this study. The results did support the methods. Considerable shrinkage occurred between foldback and cross-validation of the models

  19. Construct validity in Operations Management by using Rasch Measurement Theory. The case of the construct “motivation to implement continuous improvement"

    Directory of Open Access Journals (Sweden)

    Lidia Sanchez-Ruiz

    2016-12-01

    Full Text Available Construct design and validation is a common practice in the Operations Management field. In this sense, the aim of this study is to present Rasch Measurement Theory (RMT) as a rich and useful methodology for validating constructs. In order to do so, the measurement controversy in the social sciences is presented; then, RMT is explained as a solution for this measurement issue; after that, the different applications of RMT are described and, finally, the different stages of the validation process are presented. Thus, this work aims to serve as a guide for those researchers interested in the methodology. Therefore, a specific case is included: the validation of the construct “motivation to implement continuous improvement”.
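
    For reference, RMT-based validation of a dichotomously scored construct typically rests on the standard dichotomous Rasch model, in which the probability that person n endorses item i depends only on the person parameter θ_n and the item difficulty b_i:

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}
```

    Fit to this model (for example item and person fit statistics and reliability indices) is what supports claims of construct validity in the Rasch framework; polytomous extensions follow the same logic.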

  20. Serum prolactin revisited: parametric reference intervals and cross platform evaluation of polyethylene glycol precipitation-based methods for discrimination between hyperprolactinemia and macroprolactinemia.

    Science.gov (United States)

    Overgaard, Martin; Pedersen, Susanne Møller

    2017-10-26

    Hyperprolactinemia diagnosis and treatment is often compromised by the presence of biologically inactive and clinically irrelevant higher-molecular-weight complexes of prolactin, macroprolactin. The objective of this study was to evaluate the performance of two macroprolactin screening regimes across commonly used automated immunoassay platforms. Parametric total and monomeric gender-specific reference intervals were determined for six immunoassay methods using female (n=96) and male sera (n=127) from healthy donors. The reference intervals were validated using 27 hyperprolactinemic and macroprolactinemic sera, whose presence of monomeric and macroforms of prolactin were determined using gel filtration chromatography (GFC). Normative data for six prolactin assays included the range of values (2.5th-97.5th percentiles). Validation sera (hyperprolactinemic and macroprolactinemic; n=27) showed higher discordant classification [mean=2.8; 95% confidence interval (CI) 1.2-4.4] for the monomer reference interval method compared to the post-polyethylene glycol (PEG) recovery cutoff method (mean=1.8; 95% CI 0.8-2.8). The two monomer/macroprolactin discrimination methods did not differ significantly (p=0.089). Among macroprolactinemic sera evaluated by both discrimination methods, the Cobas and Architect/Kryptor prolactin assays showed the lowest and the highest number of misclassifications, respectively. Current automated immunoassays for prolactin testing require macroprolactin screening methods based on PEG precipitation in order to discriminate truly from falsely elevated serum prolactin. While the recovery cutoff and monomeric reference interval macroprolactin screening methods demonstrate similar discriminative ability, the latter method also provides the clinician with an easy interpretable monomeric prolactin concentration along with a monomeric reference interval.
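
    The two screening strategies compared above can be summarized in a few lines of arithmetic: one classifies by the percentage of prolactin recovered after PEG precipitation, the other compares the post-PEG (monomeric) result against a monomeric reference interval. In the sketch below, the 40% recovery cutoff, the reference limit, and the serum values are illustrative assumptions, not the study's figures.

```python
# Illustrative sketch of the two macroprolactin screening approaches.
# The 40% recovery cutoff and the reference limit are common conventions chosen
# for illustration only; they are not values taken from the study.
def classify_by_recovery(total_prl: float, post_peg_prl: float, cutoff_pct: float = 40.0) -> str:
    recovery = 100.0 * post_peg_prl / total_prl
    return "macroprolactinemia suspected" if recovery < cutoff_pct else "true hyperprolactinemia"

def classify_by_monomeric_interval(post_peg_prl: float, ref_upper: float) -> str:
    # Post-PEG (monomeric) prolactin compared against a monomeric reference interval.
    return "monomeric prolactin elevated" if post_peg_prl > ref_upper else "monomeric prolactin within reference"

total, post_peg = 1200.0, 350.0   # mIU/L, hypothetical serum
print(classify_by_recovery(total, post_peg))                        # ~29% recovery
print(classify_by_monomeric_interval(post_peg, ref_upper=450.0))
```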

  1. Soil moisture mapping using Sentinel 1 images: the proposed approach and its preliminary validation carried out in view of an operational product

    Science.gov (United States)

    Paloscia, S.; Pettinato, S.; Santi, E.; Pierdicca, N.; Pulvirenti, L.; Notarnicola, C.; Pace, G.; Reppucci, A.

    2011-11-01

    The main objective of this research is to develop, test and validate a soil moisture (SMC) algorithm for the GMES Sentinel-1 characteristics, within the framework of an ESA project. The SMC product, to be generated from Sentinel-1 data, requires an algorithm able to process operationally in near-real-time and deliver the product to the GMES services within 3 hours of observation. Two complementary approaches have been proposed: an Artificial Neural Network (ANN), which represents the best compromise between retrieval accuracy and processing time, thus allowing compliance with the timeliness requirements; and a Bayesian multi-temporal approach, which increases the retrieval accuracy, especially in cases where little ancillary data are available, at the cost of computational efficiency, by taking advantage of the frequent revisit time achieved by Sentinel-1. The algorithm was validated in several test areas in Italy, the US and Australia, and finally in Spain with a 'blind' validation. The multi-temporal Bayesian algorithm was validated in Central Italy. The validation results are in all cases very much in line with the requirements. However, the blind validation results were penalized by the availability of only VV-polarization SAR images and low-resolution MODIS NDVI, although the RMS error is only slightly > 4%.
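
    The ANN approach amounts to a regression from radar backscatter (plus ancillary data such as a vegetation index) to soil moisture. The sketch below illustrates the idea with a small feed-forward network on synthetic data; the inputs, network size, and target relationship are illustrative and do not reflect the operational Sentinel-1 configuration.

```python
# Sketch of the ANN idea only: regress soil moisture on SAR backscatter plus an
# ancillary vegetation index with a small feed-forward network. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 500
sigma0_vv = rng.uniform(-20, -5, n)          # VV backscatter [dB]
ndvi = rng.uniform(0.1, 0.8, n)              # vegetation index
smc = 0.02 * (sigma0_vv + 20) + 0.1 * (1 - ndvi) + rng.normal(0, 0.02, n)  # toy target [m3/m3]

X = np.column_stack([sigma0_vv, ndvi])
ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0).fit(X[:400], smc[:400])
rmse = np.sqrt(np.mean((ann.predict(X[400:]) - smc[400:]) ** 2))
print(f"test RMSE: {rmse:.3f} m3/m3")
```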

  2. Validity, Reliability, and Performance Determinants of a New Job-Specific Anaerobic Work Capacity Test for the Norwegian Navy Special Operations Command.

    Science.gov (United States)

    Angeltveit, Andreas; Paulsen, Gøran; Solberg, Paul A; Raastad, Truls

    2016-02-01

    Operators in Special Operations Forces (SOF) have a particularly demanding profession where physical and psychological capacities can be challenged to the extremes. The diversity of physical capacities needed depends on the mission. Consequently, tests used to monitor SOF operators' physical fitness should cover a broad range of physical capacities. Whereas tests for strength and aerobic endurance are established, there is no test for specific anaerobic work capacity described in the literature. The purpose of this study was therefore to evaluate the reliability and validity, and to identify performance determinants, of a new test developed for testing specific anaerobic work capacity in SOF operators. Nineteen active young students were included in the concurrent validity part of the study. The students performed the evacuation (EVAC) test 3 times and the results were compared for reliability and with performance in the Wingate cycle test, 300-m sprint, and a maximal accumulated oxygen deficit (MAOD) test. In part II of the study, 21 Norwegian Navy Special Operations Command operators conducted the EVAC test, anthropometric measurements, a dual x-ray absorptiometry scan, leg press, isokinetic knee extensions, maximal oxygen uptake test, and countermovement jump (CMJ) test. The EVAC test showed good reliability after 1 familiarization trial (intraclass correlation = 0.89; coefficient of variation = 3.7%). The EVAC test correlated well with the Wingate test (r = -0.68), 300-m sprint time (r = 0.51), and 300-m mean power (W) (r = -0.67). No significant correlation was found with the MAOD test. In part II of the study, height, body mass, lean body mass, isokinetic knee extension torque, maximal oxygen uptake, and maximal power in a CMJ were significantly correlated with performance in the EVAC test. The EVAC test is a reliable and valid test for anaerobic work capacity for SOF operators, and muscle mass, leg strength, and leg power seem to be the most important determinants.
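
    The reliability figures quoted above (intraclass correlation and coefficient of variation across repeated trials) can be computed directly from a subjects-by-trials score matrix. The sketch below uses a one-way ICC formula on synthetic scores; the numbers of subjects and trials mirror part I of the study, but the data and the specific ICC form are illustrative.

```python
# Sketch of the reliability statistics quoted in the abstract. Scores are synthetic
# and a one-way ICC(1,1) formula is used for simplicity.
import numpy as np

def icc_oneway(scores: np.ndarray) -> float:
    """ICC(1,1) from a (subjects x trials) score matrix."""
    n, k = scores.shape
    subject_means = scores.mean(axis=1)
    grand_mean = scores.mean()
    ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(2)
true_ability = rng.normal(60, 8, size=19)                            # 19 subjects, 3 trials each
trials = true_ability[:, None] + rng.normal(0, 2.5, size=(19, 3))

cv_within = np.mean(trials.std(axis=1, ddof=1) / trials.mean(axis=1)) * 100
print(f"ICC: {icc_oneway(trials):.2f}, within-subject CV: {cv_within:.1f}%")
```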

  3. EOS Terra Validation Program

    Science.gov (United States)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS) and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, although low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra

  4. Design of Cycling Race Information Release App Based on Cross-Platform Technology

    Institute of Scientific and Technical Information of China (English)

    徐萌萌; 王萍; 温号; 缪刚

    2015-01-01

    Aiming at the mobile requirements and the currently single mode of information release in cycling races, a design for a cycling race information release App based on cross-platform technology is presented, which extends competition information release from the existing computer browser platform to mobile intelligent terminals, pushing it to a much broader release platform and providing a new design idea for competition information release. The design adopts a three-tier architecture to achieve high cohesion and low coupling, is based on the jQuery Mobile framework, is written in JavaScript and HTML, uses XML files to store and transfer data, and relies on PhoneGap to achieve the cross-platform characteristic of writing once and running everywhere. The design has been successfully used in the 2014 Tour de Qinghai Lake International Cycling Race; it runs stably and reliably, releases information promptly and accurately, and meets the requirements of cycling race information release.

  5. Validation of cell voltage and water content in a PEM (polymer electrolyte membrane) fuel cell model using neutron imaging for different operating conditions

    International Nuclear Information System (INIS)

    Salva, J. Antonio; Iranzo, Alfredo; Rosa, Felipe; Tapia, Elvira

    2016-01-01

    This work presents a one-dimensional analytical model developed for a 50 cm² PEM (polymer electrolyte membrane) fuel cell with a five-channel serpentine flow field. The different coupled physical phenomena such as electrochemistry, mass transfer of hydrogen, oxygen and water (two phases) together with heat transfer have been solved simultaneously. The innovation of this work is that the model has been validated with two different variables simultaneously and quantitatively in order to ensure the accuracy of the results. The selected variables are the cell voltage and the water content within the membrane MEA (Membrane Electrode Assembly) and GDL (gas diffusion layers) experimentally measured by means of neutron radiography. The results show a good agreement for a comprehensive set of different operating conditions of cell temperature, pressure, reactants relative humidity and cathode stoichiometry. The analytical model has a relative error of less than 3.5% for the value of the cell voltage and the water content within the GDL + MEA for all experiments performed. This result presents a new standard of validation in the state of the art of PEM fuel cell modeling, where two variables are simultaneously and quantitatively validated with experimental results. The developed analytical model has been used in order to analyze the behavior of the PEM fuel cell under different values of relative humidity. - Highlights: • One dimensional analytical model has been developed for a PEM fuel cell. • The model is validated with two different variables simultaneously. • New standard of validation is proposed.

  6. Operator competence in fetoscopic laser surgery for twin-twin transfusion syndrome: validation of a procedure-specific evaluation tool.

    Science.gov (United States)

    Peeters, S H P; Akkermans, J; Bustraan, J; Middeldorp, J M; Lopriore, E; Devlieger, R; Lewi, L; Deprest, J; Oepkes, D

    2016-03-01

    Fetoscopic laser surgery for twin-twin transfusion syndrome is a procedure for which no objective tools exist to assess technical skills. To ensure that future fetal surgeons reach competence prior to performing the procedure unsupervised, we developed a performance assessment tool. The aim of this study was to validate this assessment tool for reliability and construct validity. We made use of a procedure-specific evaluation instrument containing all essential steps of the fetoscopic laser procedure, which was previously created using Delphi methodology. Eleven experts and 13 novices from three fetal medicine centers performed the procedure on the same simulator. Two independent observers assessed each surgery using the instrument (maximum score: 52). Interobserver reliability was assessed using Spearman correlation. We compared the performance of novices and experts to assess construct validity. The interobserver reliability was high (Rs = 0.974, P performed by experts and in 9/13 (69%) procedures performed by novices (P = 0.005). Multivariable analysis showed that the checklist score, independent of age and gender, predicted competence. The procedure-specific assessment tool for fetoscopic laser surgery shows good interobserver reliability and discriminates experts from novices. This instrument may therefore be a useful tool in the training curriculum for fetal surgeons. Further intervention studies with reassessment before and after training may increase the construct validity of the tool. Copyright © 2015 ISUOG. Published by John Wiley & Sons Ltd.

  7. Does expert opinion match the operational definition of the Lupus Low Disease Activity State (LLDAS)? A case-based construct validity study.

    Science.gov (United States)

    Golder, Vera; Huq, Molla; Franklyn, Kate; Calderone, Alicia; Lateef, Aisha; Lau, Chak Sing; Lee, Alfred Lok Hang; Navarra, Sandra Teresa V; Godfrey, Timothy; Oon, Shereen; Hoi, Alberta Yik Bun; Morand, Eric Francis; Nikpour, Mandana

    2017-06-01

    To evaluate the construct validity of the Lupus Low Disease Activity State (LLDAS), a treatment target in systemic lupus erythematosus (SLE). Fifty SLE case summaries based on real patients were prepared and assessed independently for meeting the operational definition of LLDAS. Fifty international rheumatologists with expertise in SLE, but with no prior involvement in the LLDAS project, responded to a survey in which they were asked to categorize the disease activity state of each case as remission, low, moderate, or high. Agreement between expert opinion and LLDAS was assessed using Cohen's kappa. Overall agreement between expert opinion and the operational definition of LLDAS was 77.96% (95% CI: 76.34-79.58%), with a Cohen's kappa of 0.57 (95% CI: 0.55-0.61). Of the cases (22 of 50) that fulfilled the operational definition of LLDAS, only 5.34% (59 of 22 × 50) of responses classified the cases as moderate/high activity. Of the cases that did not fulfill the operational definition of LLDAS (28 of 50), 35.14% (492 of 28 × 50) of responses classified the cases as remission/low activity. Common reasons for discordance were assignment to remission/low activity of cases with higher corticosteroid doses than defined in LLDAS (prednisolone ≤ 7.5mg) or with SLEDAI-2K >4 due to serological activity (high anti-dsDNA antibody and/or low complement). LLDAS has good construct validity with high overall agreement between the operational definition of LLDAS and expert opinion. Discordance of results suggests that the operational definition of LLDAS is more stringent than expert opinion at defining a low disease activity state. Copyright © 2017 Elsevier Inc. All rights reserved.
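
    The agreement analysis described above reduces to comparing each survey response against the binary LLDAS operational definition and summarizing with raw agreement and Cohen's kappa. The sketch below reproduces that computation on synthetic labels; the discordance rate chosen is arbitrary.

```python
# Sketch of the agreement analysis: compare expert classifications against the
# binary LLDAS operational definition. Labels below are synthetic.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(3)
lldas_definition = rng.integers(0, 2, size=200)        # 1 = case meets the operational LLDAS definition
expert_opinion = lldas_definition.copy()
flip = rng.random(200) < 0.20                           # ~20% of responses made discordant
expert_opinion[flip] = 1 - expert_opinion[flip]

agreement = (lldas_definition == expert_opinion).mean() * 100
kappa = cohen_kappa_score(lldas_definition, expert_opinion)
print(f"raw agreement: {agreement:.1f}%, Cohen's kappa: {kappa:.2f}")
```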

  8. A Validated Set of MIDAS V5 Task Network Model Scenarios to Evaluate Nextgen Closely Spaced Parallel Operations Concepts

    Science.gov (United States)

    Gore, Brian Francis; Hooey, Becky Lee; Haan, Nancy; Socash, Connie; Mahlstedt, Eric; Foyle, David C.

    2013-01-01

    The Closely Spaced Parallel Operations (CSPO) scenario is a complex, human performance model scenario that tested alternate operator roles and responsibilities in response to a series of off-nominal operations on approach and landing (see Gore, Hooey, Mahlstedt, Foyle, 2013). The model links together the procedures, equipment, crewstation, and external environment to produce predictions of operator performance in response to Next Generation system designs, like those expected in the National Airspace System's NextGen concepts. The task analysis that is contained in the present report comes from the task analysis window in the MIDAS software. These tasks link definitions and states for equipment components and environmental features, as well as operational contexts. The current task analysis culminated in 3300 tasks that included over 1000 Subject Matter Expert (SME)-vetted, re-usable procedural sets for three critical phases of flight: the Descent, Approach, and Land procedural sets (see Gore et al., 2011 for a description of the development of the tasks included in the model; Gore, Hooey, Mahlstedt, Foyle, 2013 for a description of the model and its results; Hooey, Gore, Mahlstedt, Foyle, 2013 for a description of the guidelines that were generated from the model's results; Gore, Hooey, Foyle, 2012 for a description of the model's implementation and its settings). The rollout, after-landing checks, taxi-to-gate, and arrive-at-gate procedures illustrated in Figure 1 were not used in the approach and divert scenarios exercised. The other networks in Figure 1 set up appropriate context settings for the flight deck. The current report presents the model's task decomposition from the top (highest) level and decomposes it to finer-grained levels. The first task that is completed by the model is to set all of the initial settings for the scenario runs included in the model (network 75 in Figure 1). This initialization process also resets the CAD graphic files contained within MIDAS, as well as the embedded

  9. Cross-platform wireless sensor network development

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg; Kusy, Branislav

    Design and development of wireless sensor network applications adds an additional layer of complexity to traditional computer systems. The developer needs to be an expert in resource-constrained embedded devices as well as traditional desktop computers. We propose TinyInventor, an open...

  10. A Cross-Platform Smartphone Brain Scanner

    DEFF Research Database (Denmark)

    Larsen, Jakob Eg; Stopczynski, Arkadiusz; Stahlhut, Carsten

    We describe a smartphone brain scanner with a low-cost wireless 14-channel Emotiv EEG neuroheadset interfacing with multiple mobile devices. This personal informatics system enables minimally invasive and continuous capturing of brain imaging data in natural settings. The system applies an inverse...

  11. Cross-platform software development with Qt

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Overview of the Qt framework, new features of Qt 5.10 and preview of the Qt roadmap. Introduction of the Qt framework as well as the new features and improvements of Qt 5.10. Showcase of the different Qt UI technologies such as Widgets and QML, overview of the automation suite and preview of the new features to come in Qt Creator. Elaboration on the Qt 3D offering and preview of the Qt roadmap. Presentation of use cases of Qt integration in the medical and automation sectors. About the speaker: Ionut is a serial entrepreneur, now working as Qt Adviser for The Qt Company. He studied biomedical engineering in Montreal, Canada and worked for 4 years as a Senior Software Developer at a software company in the life science and medical imaging area. He founded two interactive media companies, one in Montreal, Canada and another one in Berlin, Germany. Ionut also has more than 10 years of experience in the digital signage industry. He has been working with the Qt framework since 2006 and is a huge fan of it.

  12. Cross-platform development with React Native

    OpenAIRE

    Beshir, Aymen

    2016-01-01

    In this project a mobile application for dog owners is built, which allows dog owners to create their own profile. The customer is a dog whisperer with the aspiration to create a platform for dog owners where they can share and access articles and experiences and structure their dog's life. This mobile application is built for both Android and iOS. Building native mobile applications has never been easier given the many resources and frameworks available for developers. But since the frameworks are o...

  13. Solid Waste Operations Complex W-113, Detail Design Report (Title II). Volume 5: Design validation assessments and lists

    International Nuclear Information System (INIS)

    1995-09-01

    The Solid Waste Retrieval Facility--Phase 1 (Project W113) will provide the infrastructure and the facility required to retrieve from Trench 04, Burial ground 4C, contact handled (CH) drums and boxes at a rate that supports all retrieved TRU waste batching, treatment, storage, and disposal plans. This includes (1) operations related equipment and facilities, viz., a weather enclosure for the trench, retrieval equipment, weighing, venting, obtaining gas samples, overpacking, NDE, NDA, shipment of waste and (2) operations support related facilities, viz., a general office building, a retrieval staff change facility, and infrastructure upgrades such as supply and routing of water, sewer, electrical power, fire protection, roads, and telecommunication. Title I design for the operations related equipment and facilities was performed by Raytheon/BNFL, and that for the operations support related facilities including infrastructure upgrade was performed by KEH. These two scopes were combined into an integrated W113 Title II scope that was performed by Raytheon/BNFL. The following Code Evaluation analyzes the applicable sections of the National Fire Protection Association (NFPA) 101, Life Safety Code, 1994 Edition and the 1994 Edition of the Uniform Building Code (UBC) to the W113 Trench Enclosure. A Building Code Analysis generally establishes four primary design criteria: occupancy classification; separation requirements; egress requirements; and construction type. The UBC establishes requirements for all criteria. This analysis is limited to the Trench Enclosure Building. The General Office Building and the Retrieval Staff Change Building is not within the scope of this analysis

  14. Development and Pre-Operational Validation of NEMO-Based Eddy-Resolving Regional Configuration for Gulf of Finland

    Science.gov (United States)

    Sofina, Ekaterina; Vankevich, Roman; Tatiana, Eremina

    2014-05-01

    At present, the Operational Oceanographic System for the Gulf of Finland (GULFOOS) is in trial operation at RSHU. The quality of the operational system strongly depends on the spatial resolution of the hydro-thermodynamic model, so for its future development a new model configuration has been implemented, based on the international project NEMO (Nucleus for European Modelling of the Ocean). Using the NEMO toolbox, a new eddy-permitting z-coordinate configuration was realized with a horizontal resolution of 30x15'' (~500 m) and a 1 m vertical step. The chosen horizontal resolution is sufficient to resolve typical submesoscale eddies in this basin, where the internal Rossby radius is usually 2-4 km [1]. Verification was performed using all available measurements, including vessel data, ferry boxes, autonomous profilers, and satellite SST. It was shown that submesoscale eddies and filaments generated by baroclinic instability of fronts in the upper layers of the Gulf can change vertical stratification and deepen the mixed layer. The increase in model resolution leads to a clear improvement in the representation of the key hydro-physical fields: filament propagation and local eddies. The obtained results confirm that the model adequately reproduces the general circulation and the seasonal evolution of the vertical water structure. It is shown that the NEMO model, initially designed for the global ocean, can be used in regional operational applications for a highly stratified shallow basin with complex bathymetry. The computational efficiency of the system, including 3DVar assimilation, was sufficient for a 24x7 operational task on 12 nodes of an Intel-based cluster. The proposed regional modeling system has the potential to give information on non-observed physical quantities and to provide links between observations by identifying small-scale patterns and processes. References 1. Alenius P., Nekrasov A., Myrberg, K. The baroclinic Rossby-radius in the Gulf of Finland. Continental Shelf Research, 2003, 23, 563-573.

  15. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  16. Validity of the CR-POSSUM model in surgery for colorectal cancer in Spain (CCR-CARESS study) and comparison with other models to predict operative mortality.

    Science.gov (United States)

    Baré, Marisa; Alcantara, Manuel Jesús; Gil, Maria José; Collera, Pablo; Pont, Marina; Escobar, Antonio; Sarasqueta, Cristina; Redondo, Maximino; Briones, Eduardo; Dujovne, Paula; Quintana, Jose Maria

    2018-01-29

    To validate and recalibrate the CR-POSSUM model and compare its discriminatory capacity with that of other European models, such as POSSUM, P-POSSUM, AFC or IRCS, for predicting operative mortality in surgery for colorectal cancer. Prospective multicenter cohort study from 22 hospitals in Spain. We included patients undergoing planned or urgent surgery for primary invasive colorectal cancers between June 2010 and December 2012 (N = 2749). Clinical data were gathered through medical chart review. We validated and recalibrated the predictive models using logistic regression techniques. To calculate the discriminatory power of each model, we estimated the areas under the curve (AUC, 95% CI). We also assessed the calibration of the models by applying the Hosmer-Lemeshow test. In-hospital mortality was 1.5% and 30-day mortality 1.7%. In the validation process, the discriminatory power of the CR-POSSUM for predicting in-hospital mortality was 73.6%. However, in the recalibration process, the AUCs improved slightly: the CR-POSSUM reached 75.5% (95% CI: 67.3-83.7). The discriminatory power of the CR-POSSUM for predicting 30-day mortality was 74.2% (95% CI: 67.1-81.2) after recalibration; among the other models the POSSUM had the greatest discriminatory power, with an AUC of 77.0% (95% CI: 68.9-85.2). The Hosmer-Lemeshow test showed good fit for all the recalibrated models. The CR-POSSUM and the other models showed moderate capacity to discriminate the risk of operative mortality in our context, where the actual operative mortality is low. Nevertheless, the IRCS might better predict in-hospital mortality, with fewer variables, while the CR-POSSUM could be slightly better for predicting 30-day mortality. Registered at: ClinicalTrials.gov Identifier: NCT02488161.
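
    External validation of a risk model of this kind boils down to two checks: discrimination (AUC) and calibration (observed versus predicted mortality across risk strata, in the spirit of the Hosmer-Lemeshow test). The sketch below runs both checks on simulated predictions and outcomes; the risk distribution and cohort are invented and not the CCR-CARESS data.

```python
# Sketch of an external-validation check: AUC for discrimination plus a
# Hosmer-Lemeshow-style decile table for calibration. Data are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 2749
predicted_risk = np.clip(rng.beta(1.2, 60, n), 1e-4, 0.5)   # toy model predictions
died = (rng.random(n) < predicted_risk).astype(int)         # simulated operative mortality

print(f"observed mortality: {died.mean():.3%}")
print(f"AUC (discrimination): {roc_auc_score(died, predicted_risk):.3f}")

# Observed vs. expected mortality per risk decile
cuts = np.quantile(predicted_risk, np.linspace(0.1, 0.9, 9))
bins = np.digitize(predicted_risk, cuts)
for b in range(10):
    sel = bins == b
    print(f"decile {b + 1:2d}: expected {predicted_risk[sel].mean():.3f}, "
          f"observed {died[sel].mean():.3f}, n={sel.sum()}")
```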

  17. Initial Validation of Robotic Operations for In-Space Assembly of a Large Solar Electric Propulsion Transport Vehicle

    Science.gov (United States)

    Komendera, Erik E.; Dorsey, John T.

    2017-01-01

    Developing a capability for the assembly of large space structures has the potential to increase the capabilities and performance of future space missions and spacecraft while reducing their cost. One such application is a megawatt-class solar electric propulsion (SEP) tug, representing a critical transportation ability for the NASA lunar, Mars, and solar system exploration missions. A series of robotic assembly experiments were recently completed at Langley Research Center (LaRC) that demonstrate most of the assembly steps for the SEP tug concept. The assembly experiments used a core set of robotic capabilities: long-reach manipulation and dexterous manipulation. This paper describes cross-cutting capabilities and technologies for in-space assembly (ISA), applies the ISA approach to a SEP tug, describes the design and development of two assembly demonstration concepts, and summarizes results of two sets of assembly experiments that validate the SEP tug assembly steps.

  18. Technical realization of a sensorized neonatal intubation skill trainer for operators' retraining and a pilot study for its validation.

    Science.gov (United States)

    Panizza, Davide; Scaramuzzo, Rosa T; Moscuzza, Francesca; Vannozzi, Ilaria; Ciantelli, Massimiliano; Gentile, Marzia; Baldoli, Ilaria; Tognarelli, Selene; Boldrini, Antonio; Cuttano, Armando

    2018-01-04

    In neonatal endotracheal intubation, excessive pressure on soft tissues during laryngoscopy can cause permanent injury. Low-fidelity skill trainers do not give valid feedback about this issue. This study describes the technical realization and validation of an active neonatal intubation skill trainer providing objective feedback. We studied expert health professionals' performances in neonatal intubation, highlighting the opportunity for procedural retraining. We identified the most critical points on the epiglottis and dental arches and fixed commercial force sensors on chosen points on a ©Laerdal Neonatal Intubation Trainer. Our skill trainer was set up as a grade 3 on Cormack and Lehane's scale, i.e. a model of difficult intubation. Associated software provided real-time sound feedback if pressure during laryngoscopy exceeded an established threshold. Pressure data were recorded in a database for subsequent analysis with non-parametric statistical tests. We organized our study in two intubation sessions (5 attempts each) for every participant, held 24 h apart. Between the two sessions, a debriefing phase took place. In addition, we gave our participants two interviews, one at the beginning and one at the end of the study, to get information about our subjects and to have feedback about our design. We obtained statistically significant differences between consecutive attempts, with evidence of learning trends. Pressure on critical points was significantly lower during the second session (p < 0.0001). The epiglottis sensor was the most stressed (p < 0.000001). We found a significant correlation between time spent for each attempt and pressures applied to the airways in the two sessions, more significant in the second one (shorter attempts with less pressure, rs = 0.603). Our skill trainer represents a reliable model of difficult intubation. Our results show its potential to optimize procedures related to the control of trauma risk and to improve

  19. Track 5: safety in engineering, construction, operations, and maintenance. Reactor physics design, validation, and operating experience. 5. A Negative Reactivity Feedback Device for Actinide Burner Cores

    International Nuclear Information System (INIS)

    Driscoll, M.J.; Hejzlar, P.

    2001-01-01

    Lead-bismuth eutectic (LBE) cooled reactors are of considerable interest because they may be useful for destruction of actinides in a cost-effective manner, particularly cores fueled predominantly with minor actinides, which gain reactivity with burnup. However, they also pose several design challenges: 1. a small (and perhaps even slightly positive) Doppler feedback; 2. small effective delayed neutron yield; 3. a small negative feedback from axial fuel expansion; 4. positive coolant void and temperature coefficients for conventional designs. This has motivated a search for palliative measures, leading to conceptualization of the reactivity feedback device (RFD). The RFD consists of an in-core flask containing helium gas, tungsten wool, and a small reservoir of LBE that communicates with vertical tubes housing neutron absorber floats. The upper part of these guide tubes contains helium gas that is vented into a separate, cooler ex-core helium gas plenum. The principle of operation is as follows: 1. The tungsten wool, hence the helium gas in the in-core plenum, is heated by gammas and loses heat to the walls by convection and conduction (radiation is feeble for monatomic gases and, in any event, intercepted by the tungsten wool). An energy balance determines the gas temperature, hence, pressure, which is 10 atm here. The energy loss rate can be adjusted by using xenon or a gas mixture in place of helium. The tungsten wool mass, which is 1 vol% wool here, can also be increased to increase gamma heating and further retard convection; alternatively, a Dewar flask could be used in place of the additional wool. 2. An increase in core power causes a virtually instantaneous increase in gamma flux, hence, gas heatup: The thermal time constant of the tungsten filaments and their surrounding gas film is ∼40 μs. 3. The increased gas temperature is associated with an increased gas pressure, which forces more liquid metal into the float guide tubes: LBE will rise ∼100 cm

  20. Measurements of Humidity in the Atmosphere and Validation Experiments (MOHAVE-2009): overview of campaign operations and results

    Directory of Open Access Journals (Sweden)

    T. Leblanc

    2011-12-01

    Full Text Available The Measurements of Humidity in the Atmosphere and Validation Experiment (MOHAVE-2009) campaign took place on 11–27 October 2009 at the JPL Table Mountain Facility in California (TMF). The main objectives of the campaign were to (1) validate the water vapor measurements of several instruments, including three Raman lidars, two microwave radiometers, two Fourier-Transform spectrometers, and two GPS receivers (column water), (2) cover water vapor measurements from the ground to the mesopause without gaps, and (3) study upper tropospheric humidity variability at timescales varying from a few minutes to several days.

    A total of 58 radiosondes and 20 Frost-Point hygrometer sondes were launched. Two types of radiosondes were used during the campaign. Non-negligible differences in the readings between the two radiosonde types used (Vaisala RS92 and InterMet iMet-1) made a small but measurable impact on the derivation of water vapor mixing ratio by the Frost-Point hygrometers. As observed in previous campaigns, the RS92 humidity measurements remained within 5% of the Frost-Point in the lower and mid-troposphere, but were too dry in the upper troposphere.

    Over 270 h of water vapor measurements from three Raman lidars (JPL and GSFC) were compared to RS92, CFH, and NOAA-FPH. The JPL lidar profiles reached 20 km when integrated all night, and 15 km when integrated for 1 h. Excellent agreement between this lidar and the frost-point hygrometers was found throughout the measurement range, with only a 3% (0.3 ppmv) mean wet bias for the lidar in the upper troposphere and lower stratosphere (UTLS). The other two lidars provided satisfactory results in the lower and mid-troposphere (2–5% wet bias over the range 3–10 km), but suffered from contamination by fluorescence (wet bias ranging from 5 to 50% between 10 km and 15 km), preventing their use as an independent measurement in the UTLS.

    The comparison between all available stratospheric

  1. Dielectric barrier discharge-based plasma actuator operation in artificial atmospheres for validation of modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Mangina, R. S.; Enloe, C. L.; Font, G. I. [Department of Physics, United States Air Force Academy, Colorado 80840 (United States)

    2015-11-15

    We present an experimental case study of time-resolved force production by an aerodynamic plasma actuator immersed in various mixtures of electropositive (N2) and electronegative gases (O2 and SF6) at atmospheric pressure using a fixed AC high-voltage input of 16 kV peak amplitude at 200 Hz frequency. We have observed distinct changes in the discharge structures during both negative- and positive-going voltage half-cycles, with corresponding variations in the actuator's force production: a ratio of 4:1 in the impulse produced by the negative-going half-cycle of the discharge among the various gas mixtures we explored, 2:1 in the impulse produced by the positive-going half-cycle, and cases in which the negative-going half-cycle dominates force production (by a ratio of 1.5:1), where the half-cycles produce identical force levels, and where the positive-going half-cycle dominates (by a ratio of 1:5). We also present time-resolved experimental evidence for the first time that shows electrons do play a significant role in the momentum coupling to surrounding neutrals during the negative-going voltage half-cycle of the N2 discharge. We show that there is sufficient macroscopic variation in the plasma that the predictions of numerical models at the microscopic level can be validated even though the plasma itself cannot be measured directly on those spatial and temporal scales.

  2. Validation of a Numerical Model for the Prediction of the Annoyance Condition at the Operator Station of Construction Machines

    Directory of Open Access Journals (Sweden)

    Eleonora Carletti

    2016-11-01

    Full Text Available It is well-known that the reduction of noise levels is not strictly linked to the reduction of noise annoyance. Even earthmoving machine manufacturers are facing the problem of customer complaints concerning the noise quality of their machines with increasing frequency. Unfortunately, all the studies geared to the understanding of the relationship between multidimensional characteristics of noise signals and the auditory perception of annoyance require repeated sessions of jury listening tests, which are time-consuming. In this respect, an annoyance prediction model was developed for compact loaders to assess the annoyance sensation perceived by operators at their workplaces without repeating the full sound quality assessment but using objective parameters only. This paper aims at verifying the feasibility of the developed annoyance prediction model when applied to other kinds of earthmoving machines. For this purpose, an experimental investigation was performed on five earthmoving machines, different in type, dimension, and engine mechanical power, and the annoyance predicted by the numerical model was compared to the annoyance given by subjective listening tests. The results were evaluated by means of the squared value of the correlation coefficient, R2, and they confirm the possible applicability of the model to other kinds of machines.
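
    The evaluation metric quoted above, the squared correlation R2 between the annoyance predicted from objective parameters and the annoyance rated by the listening jury, is straightforward to compute once both series are available. The sketch below shows the calculation on invented ratings for five machines; the values carry no relation to the study's results.

```python
# Sketch of the model-vs-jury comparison metric: squared correlation (R^2)
# between predicted and jury-rated annoyance. Ratings below are hypothetical.
import numpy as np

jury_annoyance = np.array([6.1, 5.4, 7.2, 4.8, 6.6])    # one jury rating per machine
model_annoyance = np.array([5.9, 5.7, 7.0, 5.1, 6.2])   # model predictions from objective parameters

r = np.corrcoef(jury_annoyance, model_annoyance)[0, 1]
print(f"R^2 = {r ** 2:.2f}")
```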

  3. Optimization and Model Validation of Operation Control Strategies for a Novel Dual-Motor Coupling-Propulsion Pure Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2018-03-01

    Full Text Available The strict operational conditions of driving motors for vehicles propel the development of more complicated configurations in pure electric vehicles (PEVs). Multi-power-source powertrain configurations are one of the efficient technologies to reduce the manufacturing difficulty of driving motors. However, most of the existing studies are predominantly focused on optimal designs of powertrains and power distribution between the engine and motor of hybrid electric vehicles, which are not appropriate for PEVs. This paper proposes a novel dual-motor coupling-propulsion powertrain system that improves the dynamic and economic performance of the powertrain system in PEVs. The proposed powertrain system can realize both the single-motor driving mode and the dual-motor coupling driving mode. The driving modes are divided and a power distribution strategy for the different driving modes based on an optimal system efficiency rule is employed, which enhances the performance of the proposed system. Further, a mode-switching strategy that ensures driving comfort by preventing jerk during mode switching is incorporated into the system. The results of comparative evaluations conducted using a dual-motor electric vehicle model implemented in MATLAB/Simulink indicate that the mileage and dynamic performance of the proposed powertrain system are significantly better than those of the traditional single-motor powertrain system.

  4. Comparison and Validation of Operational Cost in Smart Houses with the Introduction of a Heat Pump or a Gas Engine

    Science.gov (United States)

    Shimoji, Tsubasa; Tahara, Hayato; Matayoshi, Hidehito; Yona, Atsushi; Senjyu, Tomonobu

    2015-02-01

    Due to the concerns of global warming and the depletion of energy resources, renewable energies such as wind generation (WG) and photovoltaic generation (PV) are gaining attention in distribution systems. Efficient electric equipment such as heat pumps (HP) not only contribute low levels of carbon to society, but are also beneficial for consumers. In addition, gas instruments such as the gas engine (GE) and fuel cells (FC) are expected to reduce electricity cost by exhaust heat. Thus, it is important to clarify which systems (HP or GE) are more beneficial for consumers throughout the year. This paper compares the operational cost for the smart house between using the HP and the GE. Current electricity and gas prices are used to calculate the cost of the smart house. The system considered in this research comprises a PV, battery, solar collector (SC), uncontrolled load and either an HP or a GE. In order to verify the effectiveness of the proposed system, MATLAB is used for simulations.
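
    The comparison the paper carries out is, at its core, a cost calculation: meeting the same heat demand either with a heat pump running on electricity or with a gas engine whose exhaust heat is recovered and whose generated electricity is credited. The back-of-envelope sketch below illustrates that calculation; the tariffs, COP, and efficiencies are illustrative assumptions, not the values used in the paper.

```python
# Back-of-envelope sketch comparing daily operational cost of a heat pump (HP)
# and a gas engine (GE) with heat recovery. All numbers are assumed for illustration.
heat_demand_kwh = 30.0      # daily thermal demand of the house
elec_price = 0.25           # $/kWh electricity (assumed)
gas_price = 0.10            # $/kWh of fuel (assumed)

# Heat pump: electricity in = heat out / COP
cop = 3.5
hp_cost = heat_demand_kwh / cop * elec_price

# Gas engine: fuel covers the heat via exhaust recovery; generated electricity is credited
heat_recovery_eff, elec_eff = 0.50, 0.30
fuel_in = heat_demand_kwh / heat_recovery_eff
ge_cost = fuel_in * gas_price - fuel_in * elec_eff * elec_price

print(f"heat pump:  {hp_cost:.2f} $/day")
print(f"gas engine: {ge_cost:.2f} $/day")
```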

  5. Shield evaluation and validation for design and operation of facility for treatment of legacy Intermediate Level Radioactive Liquid Waste (ILW)

    International Nuclear Information System (INIS)

    Deepa, A.; Jakhete, A.P.; Rathish, K.R.; Saroj, S.K.; Patel, H.S.; Gopalakrishnan, R.K.; Gangadharan, Anand; Singh, Neelima

    2014-01-01

    An ion exchange treatment facility has been commissioned at the PRIX facility for the treatment of legacy ILW generated at the reprocessing plant, Trombay. The treatment system is based on the deployment of selective sorbents for removal of cesium and strontium from ILW. The activity concentration due to beta emitters likely to be processed is of the order of 111-1850 MBq/l. Dose rates in different areas of the facility were evaluated using a shielding code and design input. The present work gives details of the comparison of dose rates estimated and dose rates measured at various stages of the processing of ILW. At PRIX, the ILW treatment system comprises shielded IX columns (two cesium and one strontium) housed in an MS cubicle, the process lines at the inlet and outlet of the IX treatment system, and effluent storage tanks. The MS cubicle, prefilter and piping are housed in a process cell with 500 mm concrete shielding. Effluent storage tanks are outside the processing building. Theoretical assessments of expected dose rates were carried out prior to installation of the various systems in different areas of PRIX. The dose rate on the IX column and MS cubicle for a maximum inventory of 3.7x10^7 MBq of 137Cs, and its contribution in the operating gallery, was estimated.

  6. Design and Validation of a Control Algorithm for a SAE J2954-Compliant Wireless Charger to Guarantee the Operational Electrical Constraints

    Directory of Open Access Journals (Sweden)

    José Manuel González-González

    2018-03-01

    Full Text Available Wireless power transfer is foreseen as a suitable technology to provide charge without cables to electric vehicles. This technology is mainly supported by two coupled coils, whose mutual inductance is sensitive to their relative positions. Variations in this coefficient greatly impact the electrical magnitudes of the wireless charger. The aim of this paper is the design and validation of a control algorithm for a Society of Automotive Engineers (SAE) J2954-compliant wireless charger to guarantee certain operational and electrical constraints. These constraints are designed to prevent some components from being damaged by excessive voltage or current. This paper also presents the details for the design and implementation of the bidirectional charger topology in which the proposed controller is incorporated. The controller is installed on both the primary and the secondary side, given that wireless communication is necessary with the other side. The input data of the controller help it decide the phase shift to apply in the DC/AC converter. The experimental results demonstrate how the system regulates the output voltage of the DC/AC converter so that some electrical magnitudes do not exceed predefined thresholds. The regulation, which has been tested when coil misalignments occur, is proven to be effective.

  7. Developing and establishing the validity and reliability of the perceptions toward Aviation Safety Action Program (ASAP) and Line Operations Safety Audit (LOSA) questionnaires

    Science.gov (United States)

    Steckel, Richard J.

    Aviation Safety Action Program (ASAP) and Line Operations Safety Audits (LOSA) are voluntary safety reporting programs developed by the Federal Aviation Administration (FAA) to assist air carriers in discovering and fixing threats, errors and undesired aircraft states during normal flights that could result in a serious or fatal accident. These programs depend on voluntary participation of and reporting by air carrier pilots to be successful. The purpose of the study was to develop and validate a measurement scale to measure U.S. air carrier pilots' perceived benefits and/or barriers to participating in ASAP and LOSA programs. Data from these surveys could be used to make changes to or correct pilot misperceptions of these programs to improve participation and the flow of data. ASAP and LOSA a priori models were developed based on previous research in aviation and healthcare. Sixty thousand ASAP and LOSA paper surveys were sent to 60,000 current U.S. air carrier pilots selected at random from an FAA database of pilot certificates. Two thousand usable ASAP and 1,970 usable LOSA surveys were returned and analyzed using Confirmatory Factor Analysis. Analysis of the data using confirmatory factor analysis and model generation resulted in a five-factor ASAP model (Ease of use, Value, Improve, Trust and Risk) and a five-factor LOSA model (Value, Improve, Program Trust, Risk and Management Trust). ASAP and LOSA data were not normally distributed, so bootstrapping was used. While both final models exhibited acceptable fit with approximate fit indices, the exact fit hypothesis and the Bollen-Stine p value indicated possible model mis-specification for both ASAP and LOSA models.
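
    Because the survey data were not normally distributed, bootstrapping was used alongside the confirmatory factor analysis. The sketch below shows the nonparametric bootstrap idea on its own, computing a percentile confidence interval for a correlation between two hypothetical survey scales; the scale names and data are invented.

```python
# Minimal sketch of a nonparametric bootstrap percentile confidence interval,
# here for the correlation between two hypothetical survey scales. Data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
value = rng.normal(4.0, 0.8, size=300)                   # hypothetical "Value" scale scores
trust = 0.5 * value + rng.normal(2.0, 0.7, size=300)     # hypothetical "Trust" scale scores

boot = []
for _ in range(2000):
    idx = rng.integers(0, len(value), len(value))        # resample respondents with replacement
    boot.append(np.corrcoef(value[idx], trust[idx])[0, 1])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for the correlation: [{lo:.2f}, {hi:.2f}]")
```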

  8. Validation of the IASI operational CH4 and N2O products using ground-based Fourier Transform Spectrometer: preliminary results at the Izaña Observatory (28ºN, 17ºW)

    Directory of Open Access Journals (Sweden)

    Omaira García

    2014-01-01

    Full Text Available Within the project VALIASI (VALidation of IASI level 2 products), the validation of the IASI operational atmospheric trace gas products (total column amounts of H2O, O3, CH4, N2O, CO2 and CO, as well as H2O and O3 profiles) will be carried out. Ground-based FTS (Fourier Transform Spectrometer) trace gas measurements made in the framework of NDACC (Network for the Detection of Atmospheric Composition Change) serve as the validation reference. In this work, we will present the validation methodology developed for this project and show the first intercomparison results obtained for the Izaña Atmospheric Observatory between 2008 and 2012. As an example, we will focus on two of the most important greenhouse gases, CH4 and N2O.

  9. Air Traffic Management Technology Demonstration Phase 1 (ATD) Interval Management for Near-Term Operations Validation of Acceptability (IM-NOVA) Experiment

    Science.gov (United States)

    Kibler, Jennifer L.; Wilson, Sara R.; Hubbs, Clay E.; Smail, James W.

    2015-01-01

    The Interval Management for Near-term Operations Validation of Acceptability (IM-NOVA) experiment was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) in support of the NASA Airspace Systems Program's Air Traffic Management Technology Demonstration-1 (ATD-1). ATD-1 is intended to showcase an integrated set of technologies that provide an efficient arrival solution for managing aircraft using Next Generation Air Transportation System (NextGen) surveillance, navigation, procedures, and automation for both airborne and ground-based systems. The goal of the IM-NOVA experiment was to assess if procedures outlined by the ATD-1 Concept of Operations were acceptable to and feasible for use by flight crews in a voice communications environment when used with a minimum set of Flight Deck-based Interval Management (FIM) equipment and a prototype crew interface. To investigate an integrated arrival solution using ground-based air traffic control tools and aircraft Automatic Dependent Surveillance-Broadcast (ADS-B) tools, the LaRC FIM system and the Traffic Management Advisor with Terminal Metering and Controller Managed Spacing tools developed at the NASA Ames Research Center (ARC) were integrated into LaRC's Air Traffic Operations Laboratory (ATOL). Data were collected from 10 crews of current 757/767 pilots asked to fly a high-fidelity, fixed-base simulator during scenarios conducted within an airspace environment modeled on the Dallas-Fort Worth (DFW) Terminal Radar Approach Control area. The aircraft simulator was equipped with the Airborne Spacing for Terminal Area Routes (ASTAR) algorithm and a FIM crew interface consisting of electronic flight bags and ADS-B guidance displays. Researchers used "pseudo-pilot" stations to control 24 simulated aircraft that provided multiple air traffic flows into the DFW International Airport, and recently retired DFW air traffic controllers served as confederate Center, Feeder, Final

  10. A comparative study to validate the use of ultrasonography and computed tomography in patients with post-operative intra-abdominal sepsis

    International Nuclear Information System (INIS)

    Go, H.L.S.; Baarslag, H.J.; Vermeulen, H.; Lameris, J.S.; Legemate, D.A.

    2005-01-01

    Purpose: To validate abdominal ultrasonography and helical computed tomography in detecting causes for sepsis in patients after abdominal surgery and to determine improved criteria for their use. Materials and methods: Eighty-five consecutive surgical patients primarily operated for non-infectious disease were included in this prospective study. Forty-one patients were admitted to the intensive care unit. All patients were suspected of an intra-abdominal sepsis after abdominal surgery. Both ultrasonography (US) and helical abdominal computed tomography (CT) were performed to investigate the origin of an intra-abdominal sepsis. The images of both US and CT were interpreted on a four-point scale by different radiologists or residents in radiology; the investigators were blinded to each other's test. Interpretations of US and CT were compared with a reference standard, which was defined by the result of diagnostic aspiration of suspected fluid collections, (re)laparotomy, clinical course or the opinion of an independent panel. Likelihood ratios and post-test probabilities were calculated and interobserver agreement was determined using κ statistics. Results: The overall prevalence of an abdominal infection was 0.49. The likelihood ratio (LR) of a positive test result for US was 1.33 (95% CI: 0.8-2.5) and for CT scan 2.53 (95% CI: 1.4-5.0); corresponding post-test probabilities were 0.57 (95% CI: 0.42-0.70) for US and 0.71 (95% CI: 0.57-0.83) for CT. The LR of a negative test result was, respectively, 0.60 (95% CI: 0.3-1.3) and 0.18 (95% CI: 0.06-0.5); corresponding post-test probabilities of 0.37 (95% CI: 0.20-0.57) for US and 0.15 (95% CI: 0.06-0.32) for CT were calculated. Conclusion: Computed tomography can be used as the imaging modality of choice in patients suspected of intra-abdominal sepsis after abdominal surgery. Because of its low discriminatory power, ultrasonography should not be performed as the initial diagnostic test.
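
    The post-test probabilities reported above follow from Bayes' rule in odds form: post-test odds = pre-test odds x likelihood ratio. The sketch below applies that formula to the stated prevalence (0.49) and likelihood ratios and approximately reproduces the published values (small differences stem from rounding of the reported inputs).

```python
# Post-test probability from prevalence and likelihood ratio (Bayes' rule in odds form).
def post_test_probability(prevalence: float, likelihood_ratio: float) -> float:
    pre_odds = prevalence / (1 - prevalence)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

prevalence = 0.49  # overall prevalence of abdominal infection reported in the study
for label, lr in [("US positive", 1.33), ("CT positive", 2.53),
                  ("US negative", 0.60), ("CT negative", 0.18)]:
    print(f"{label}: post-test probability = {post_test_probability(prevalence, lr):.2f}")
```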

  11. Construct and criterion validity testing of the Non-Technical Skills for Surgeons (NOTSS) behaviour assessment tool using videos of simulated operations.

    Science.gov (United States)

    Yule, S; Gupta, A; Gazarian, D; Geraghty, A; Smink, D S; Beard, J; Sundt, T; Youngson, G; McIlhenny, C; Paterson-Brown, S

    2018-05-01

    Surgeons' non-technical skills are an important part of surgical performance and surgical education. The most widely adopted assessment tool is the Non-Technical Skills for Surgeons (NOTSS) behaviour rating system. Psychometric analysis of this tool to date has focused on inter-rater reliability and feasibility rather than validation. NOTSS assessments were collected from two groups of consultant/attending surgeons in the UK and USA, who rated behaviours of the lead surgeon during a video-based simulated crisis scenario after either online or classroom instruction. The process of validation consisted of assessing construct validity, scale reliability and concurrent criterion validity, and undertaking a sensitivity analysis. Central to this was confirmatory factor analysis to evaluate the structure of the NOTSS taxonomy. Some 255 consultant surgeons participated in the study. The four-category NOTSS model was found to have robust construct validity evidence, and a superior fit compared with alternative models. Logistic regression and sensitivity analysis revealed that, after adjusting for technical skills, for every 1-point increase in NOTSS score of the lead surgeon, the odds of having a higher versus lower patient safety score were 2.29 times greater. The same pattern of results was obtained for a broad mix of surgical specialties (UK) as well as a single discipline (cardiothoracic, USA). The NOTSS tool can be applied in research and education settings to measure non-technical skills in a valid and efficient manner. © 2018 BJS Society Ltd Published by John Wiley & Sons Ltd.

  12. Operational validation of a multi-period and multi-criteria model conditioning approach for the prediction of rainfall-runoff processes in small forest catchments

    Science.gov (United States)

    Choi, H.; Kim, S.

    2012-12-01

    Most hydrologic models have been used to describe and represent the spatio-temporal variability of hydrological processes at the watershed scale. Although hydrological responses are clearly time-varying in nature, optimal values of model parameters are normally treated as time-invariant constants. The recent paper of Choi and Beven (2007) presents a multi-period and multi-criteria model conditioning approach. The approach is based on the equifinality thesis within the Generalised Likelihood Uncertainty Estimation (GLUE) framework. In their application, the behavioural TOPMODEL parameter sets are determined by several performance measures for global (annual) and short (30-day) periods and clustered, using a Fuzzy C-means algorithm, into 15 types representing different hydrological conditions. Their study shows good performance in the calibration of a rainfall-runoff model in a forest catchment, and also gives strong indications that it is uncommon to find model realizations that are behavioural over all periods and all performance measures, and that the multi-period model conditioning approach may become a new effective tool for predictions of hydrological processes in ungauged catchments. This study is a follow-up on Choi and Beven's (2007) model conditioning approach to test how effective the approach is for the prediction of rainfall-runoff responses in ungauged catchments. For this purpose, 6 small forest catchments are selected among the several hydrological experimental catchments operated by the Korea Forest Research Institute. In each catchment, long-term hydrological time series data spanning from 10 to 30 years were available. The areas of the selected catchments range from 13.6 to 37.8 ha, and all areas are covered by coniferous or broad-leaved forests. The selected catchments are located from the southern coastal area to the northern part of South Korea. The bed rocks are Granite gneiss, Granite or
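
    The GLUE step underlying the approach can be summarized compactly: sample many parameter sets from prior ranges, score each simulation against observations with a likelihood measure, and retain only the "behavioural" sets above a threshold. The sketch below illustrates this with a toy recession model and a Nash-Sutcliffe threshold of 0.6; it is not TOPMODEL and does not use the study's catchment data.

```python
# Sketch of the GLUE behavioural-set selection idea with a toy recession model.
# The model, the prior ranges and the 0.6 threshold are illustrative only.
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(100)
observed = 5 + 3 * np.exp(-t / 20.0) + rng.normal(0, 0.1, t.size)   # synthetic "observed flow"

def toy_model(baseflow, store, recession_k):
    return baseflow + store * np.exp(-t / recession_k)

def nash_sutcliffe(sim, obs):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

samples = rng.uniform([3, 1, 5], [7, 5, 40], size=(5000, 3))          # uniform prior ranges
scores = np.array([nash_sutcliffe(toy_model(*p), observed) for p in samples])
behavioural = samples[scores > 0.6]                                   # keep behavioural sets only
print(f"{len(behavioural)} behavioural parameter sets out of {len(samples)}")
```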

  13. Integrating the Analysis of Mental Operations into Multilevel Models to Validate an Assessment of Higher Education Students' Competency in Business and Economics

    Science.gov (United States)

    Brückner, Sebastian; Pellegrino, James W.

    2016-01-01

    The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…

  14. Validation and operational measurements with SUSIE – A sar ice motion processing chain developed within promice (Programme for monitoring of Greenland ice-sheet)

    DEFF Research Database (Denmark)

    Merryman Boncori, John Peter; Dall, Jørgen; Ahlstrøm, A. P.

    2010-01-01

    This paper describes the validation of an ice-motion processing chain developed for the PROMICE project – a long-term program funded by the Danish ministry of Climate and Energy to monitor the mass budget of the Greenland ice-sheet. The processor, named SUSIE, (Scripts and Utilities for SAR Ice...

  15. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  16. Explicating Validity

    Science.gov (United States)

    Kane, Michael T.

    2016-01-01

    How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…

  17. Validation of mathematical models for predicting the swirling flow and the vortex rope in a Francis turbine operated at partial discharge

    DEFF Research Database (Denmark)

    Kuibin, P.A.; Okulov, Valery; Susan-Resiga, R.F.

    2010-01-01

    The vortex rope in a hydro turbine draft tube is one of the main and strongest sources of pulsations in non-optimal modes of hydro turbine operation. We examine the case of a Francis turbine model operated at partial discharge, where a strong precessing vortex rope is developed in the discharge cone... recover all this information without actually computing the full three-dimensional unsteady flow in the hydraulic turbine. As a result, we provide valuable mathematical tools for assessing the turbine behaviour at off-design operating regimes in the early stages of runner design, with computational effort several orders of magnitude less than the current approaches of simulating the complex turbine flow.

  18. Cross-platform comparison of SYBR® Green real-time PCR with TaqMan PCR, microarrays and other gene expression measurement technologies evaluated in the MicroArray Quality Control (MAQC) study

    Directory of Open Access Journals (Sweden)

    Dial Stacey L

    2008-07-01

    Full Text Available Abstract Background The MicroArray Quality Control (MAQC) project evaluated the inter- and intra-platform reproducibility of seven microarray platforms and three quantitative gene expression assays in profiling the expression of two commercially available Reference RNA samples (Nat Biotechnol 24:1115-22, 2006). The tested microarrays were the platforms from Affymetrix, Agilent Technologies, Applied Biosystems, GE Healthcare, Illumina, Eppendorf and the National Cancer Institute, and quantitative gene expression assays included TaqMan® Gene Expression PCR Assay, Standardized (Sta)RT-PCR™ and QuantiGene®. The data showed great consistency in gene expression measurements across different microarray platforms, different technologies and test sites. However, SYBR® Green real-time PCR, another common technique utilized by half of all real-time PCR users for gene expression measurement, was not addressed in the MAQC study. In the present study, we compared the performance of SYBR Green PCR with TaqMan PCR, microarrays and other quantitative technologies using the same two Reference RNA samples as the MAQC project. We assessed SYBR Green real-time PCR using commercially available RT2 Profiler™ PCR Arrays from SuperArray, containing primer pairs that have been experimentally validated to ensure gene-specificity and high amplification efficiency. Results The SYBR Green PCR Arrays exhibit good reproducibility among different users, PCR instruments and test sites. In addition, the SYBR Green PCR Arrays have the highest concordance with TaqMan PCR, and a high level of concordance with other quantitative methods and microarrays that were evaluated in this study in terms of fold-change correlation and overlap of lists of differentially expressed genes. Conclusion These data demonstrate that SYBR Green real-time PCR delivers highly comparable results in gene expression measurement with TaqMan PCR and other high-density microarrays.
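    The two concordance measures mentioned at the end of this abstract, fold-change correlation and overlap of differentially expressed gene (DEG) lists, can be illustrated with a short sketch. The data below are entirely hypothetical, not the MAQC reference samples, and the |log2 FC| > 1 cutoff is an assumed toy criterion:

```python
import numpy as np

rng = np.random.default_rng(7)
genes = 500
# Hypothetical log2 fold changes between two reference RNA samples,
# as measured on two platforms (e.g. SYBR Green PCR vs. TaqMan PCR)
true_fc = rng.normal(0, 1.5, genes)
fc_platform1 = true_fc + rng.normal(0, 0.3, genes)
fc_platform2 = true_fc + rng.normal(0, 0.3, genes)

# Fold-change correlation between platforms
r = np.corrcoef(fc_platform1, fc_platform2)[0, 1]

# Overlap of DEG lists (toy criterion: |log2 FC| > 1)
deg1 = set(np.flatnonzero(np.abs(fc_platform1) > 1))
deg2 = set(np.flatnonzero(np.abs(fc_platform2) > 1))
overlap = len(deg1 & deg2) / len(deg1 | deg2)
print(f"fold-change correlation r = {r:.2f}, DEG list overlap = {overlap:.2f}")
```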

  19. HFETR operation management

    International Nuclear Information System (INIS)

    Liu Rong; Yang Shuchun; Peng Jun; Zhou Shoukang

    2003-01-01

    Experiences and working methods accumulated over a long period of High Flux Engineering Test Reactor (HFETR) operation are introduced, covering reactor operation, testing, maintenance, operator training and incident management. These methods have proved valid and have ensured the safe operation of the HFETR. (authors)

  20. Validation of fuel performance codes at the NRI Rez plc for Temelin and Dukovany NPPs fuel safety evaluations and operation support

    International Nuclear Information System (INIS)

    Valach, M.; Hejna, J.; Zymak, J.

    2003-05-01

    The report summarises the first phase of the FUMEX II related work performed in the period September 2002 - May 2003. An inventory of the PIN and FRAS code families, used and developed during previous years, was made in light of their applicability (validity) to the high burn-up domain and the FUMEX II Project experimental database. The KOLA data were chosen as appropriate for the first step of fixing both codes (both originally tuned for VVER fuel). The modern requirements - adaptation of the UO2 conductivity degradation from OECD HRP, implementation of RIM and athermal FGR modelling in the PIN code, and a diffusion FGR model planned for embedding in the same code - allow us to reasonably shadow, or keep tight contact with, top-quality models such as TRANSURANUS, COPERNIC, CYRANO, FEMAXI, FRAPCON3 or ENIGMA. Testing and validation runs with the prepared KOLA input deck were made. The FUMEX II exercise proposes LOCA- and RIA-like transients, so development of a coupling of the two codes - denominated the PIN2FRAS code - was started. The principles of the interface were tested, and benchmarking on tentative RIA pulses on highly burned KOLA fuel is presented as the first achievement of this work. (author)

  1. Innovation, Product Development, and New Business Models in Networks: How to come from case studies to a valid and operational theory

    DEFF Research Database (Denmark)

    Rasmussen, Erik Stavnsager; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    We have in the research project NEWGIBM (New Global ICT based Business Models) during 2005 and 2006 closely cooperated with a group of firms. The focus in the project has been the development of new business models (and innovation) in close cooperation with multiple partners. These partners have been customers, suppliers, R&D partners, and others. The methodological problem is thus how to come from e.g. one in-depth case study to a more formalized theory or model of how firms can develop new projects and be innovative in a network. The paper is structured so that it starts with a short presentation of the two key concepts in our research setting and theoretical models: innovation and networks. It is not our intention in this paper to present a lengthy discussion of the two concepts, but a short presentation is necessary to understand the validity and interpretation discussion later in the paper. Next...

  2. Validation of operant social motivation paradigms using BTBR T+tf/J and C57BL/6J inbred mouse strains.

    Science.gov (United States)

    Martin, Loren; Sample, Hannah; Gregg, Michael; Wood, Caleb

    2014-09-01

    As purported causal factors are identified for autism spectrum disorder (ASD), new assays are needed to better phenotype animal models designed to explore these factors. With recent evidence suggesting that deficits in social motivation are at the core of ASD behavior, the development of quantitative measures of social motivation is particularly important. The goal of our study was to develop and validate novel assays to quantitatively measure social motivation in mice. In order to test the validity of our paradigms, we compared the BTBR strain, with documented social deficits, to the prosocial C57BL/6J strain. Two novel conditioning paradigms were developed that allowed the test mouse to control access to a social partner. In the social motivation task, the test mice lever pressed for a social reward. The reward contingency was set on a progressive ratio of reinforcement and the number of lever presses achieved in the final trial of a testing session (breakpoint) was used as an index of social motivation. In the valence comparison task, motivation for a food reward was compared to a social reward. We also explored activity, social affiliation, and preference for social novelty through a series of tasks using an ANY-Maze video-tracking system in an open-field arena. BTBR mice had significantly lower breakpoints in the social motivation paradigm than C57BL/6J mice. However, the valence comparison task revealed that BTBR mice also made significantly fewer lever presses for a food reward. The results of the conditioning paradigms suggest that the BTBR strain has an overall deficit in motivated behavior. Furthermore, the results of the open-field observations may suggest that social differences in the BTBR strain are anxiety induced.

  3. Validation of operant social motivation paradigms using BTBR T+tf/J and C57BL/6J inbred mouse strains

    Science.gov (United States)

    Martin, Loren; Sample, Hannah; Gregg, Michael; Wood, Caleb

    2014-01-01

    Background As purported causal factors are identified for autism spectrum disorder (ASD), new assays are needed to better phenotype animal models designed to explore these factors. With recent evidence suggesting that deficits in social motivation are at the core of ASD behavior, the development of quantitative measures of social motivation is particularly important. The goal of our study was to develop and validate novel assays to quantitatively measure social motivation in mice. Methods In order to test the validity of our paradigms, we compared the BTBR strain, with documented social deficits, to the prosocial C57BL/6J strain. Two novel conditioning paradigms were developed that allowed the test mouse to control access to a social partner. In the social motivation task, the test mice lever pressed for a social reward. The reward contingency was set on a progressive ratio of reinforcement and the number of lever presses achieved in the final trial of a testing session (breakpoint) was used as an index of social motivation. In the valence comparison task, motivation for a food reward was compared to a social reward. We also explored activity, social affiliation, and preference for social novelty through a series of tasks using an ANY-Maze video-tracking system in an open-field arena. Results BTBR mice had significantly lower breakpoints in the social motivation paradigm than C57BL/6J mice. However, the valence comparison task revealed that BTBR mice also made significantly fewer lever presses for a food reward. Conclusions The results of the conditioning paradigms suggest that the BTBR strain has an overall deficit in motivated behavior. Furthermore, the results of the open-field observations may suggest that social differences in the BTBR strain are anxiety induced. PMID:25328850

  4. Hanaro operation

    International Nuclear Information System (INIS)

    Lee, Ji Bok; Jeon, Byung Jin; Kwack, Byung Ho

    1997-01-01

    HANARO's first operating core was configured in 1995. Long-term operation testing was conducted up to the 3-1 cycle during 1996 in order to investigate the reactor characteristics associated with fuel depletion and additional fuel loading. HANARO has now accumulated 168.4 days of total operation time and 2,687.5 MWD of total thermal output. Reactor analysis, production of operation data and their validation against tests, and periodic inspection and maintenance of the facility are conducted continuously for the safe operation of HANARO. Verification tests for the installed utilization facilities were conducted, and a radiation emergency drill was performed successfully. The shutdown report for TRIGA Mark II and III was submitted to MOST, and decommissioning will start in 1997. (author). 70 tabs., 50 figs., 27 refs

  5. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  6. Study on team evaluation (4). Reliability and validity of questionnaire survey-based team work evaluation method of power plant operator team

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Hirose, Ayako; Misawa, Ryou; Yamaguchi, Hiroyuki

    2006-01-01

    This series of studies describes the necessity of evaluating team work from two aspects: operators' behaviour and operators' minds. The authors propose the Team Work Element Model, which consists of the elements necessary to build a high-performance team. This report discusses a method to evaluate team work from the second aspect, that is, competency trust, competition, for-the-team spirit, etc. The authors survey previous studies on psychological measures and organize a set of questions to evaluate 10 team work sub-elements that are parts of the Team Work Element Model. Factor analysis shows that this set of questions consists of 13 factors, such as task-oriented leadership and harmony-oriented team atmosphere. Close examination of the questions in each factor shows that 8 of the 10 team work sub-elements can be evaluated by this questionnaire. In addition, the questionnaire comprises 8 additional scales, such as job satisfaction and leadership. As a result, it is possible to evaluate team work from more comprehensive viewpoints. (author)

  7. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model

  8. The validation of the standard Chinese version of the European Organization for Research and Treatment of Cancer Quality of Life Core Questionnaire 30 (EORTC QLQ-C30 in pre-operative patients with brain tumor in China

    Directory of Open Access Journals (Sweden)

    Zhang Hong-ying

    2011-04-01

    Full Text Available Abstract Background Health-related quality of life (HRQOL) has received increasing emphasis in cancer patients. The psychometric properties of the standard Chinese version of the European Organization for Research and Treatment of Cancer Quality of Life Core Questionnaire 30 (EORTC QLQ-C30, version 3.0) had not been established in brain tumor patients, and there was no baseline HRQOL in brain tumor patients prior to surgery. Methods The EORTC QLQ-C30 (version 3.0) questionnaire was administered at three time points: T1, the first or second day after patients were hospitalized with a brain tumor suspected or diagnosed by MRI or CT; T2, 1 to 2 days after T1 (T1 and T2 were both before surgery); T3, the day before discharge. Clinical variables included disease histologic type, cognitive function, and Karnofsky Performance Status. Results Cronbach's alpha coefficients for multi-item scales were greater than .70, and multitrait scaling analysis showed that most of the item-scale correlation coefficients met the standards of convergent and discriminant validity, except for the cognitive functioning scale. All scales and items exhibited construct validity. Score changes over the peri-operative period were observed in the physical and role functioning scales. Compared with mixed cancer patients assessed after surgery but before adjuvant treatment, brain tumor patients assessed pre-surgery presented better function and fewer symptoms. Conclusions The standard Chinese version of the EORTC QLQ-C30 was overall a valid instrument to assess HRQOL in brain tumor patients in China. Baseline HRQOL in brain tumor patients pre-surgery was better than that in mixed cancer patients post-surgery. Future studies should modify the cognitive functioning scale and examine test-retest reliability and response validity.
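    For reference, the Cronbach's alpha statistic reported above for multi-item scales can be computed as follows. This is a generic sketch with simulated item responses, not the study data; only the 0.70 threshold is taken from the record:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of individual item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the scale total
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-item functioning scale scored by 100 respondents
rng = np.random.default_rng(3)
latent = rng.normal(0, 1, (100, 1))
responses = latent + rng.normal(0, 0.7, (100, 5))
print(cronbach_alpha(responses))    # should exceed 0.70 for an internally consistent scale
```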

  9. Validation of electro-thermal simulation with experimental data to prepare online operation of a molten salt target at ISOLDE for the Beta Beams

    CERN Document Server

    Cimmino, S; Marzari, S; Stora, T

    2013-01-01

    The main objective of the Beta Beams is to study the oscillation properties of pure electron neutrinos. It produces high-energy beams of pure electron neutrinos and anti-neutrinos for oscillation experiments by beta decay of He-6 and Ne-18 radioactive ion beams, stored in a decay ring at gamma = 100. Production of the He-6 beam has already been accomplished using a thick beryllium oxide target. However, production of the needed rate of Ne-18 has proven to be more challenging. In order to achieve the requested yield of Ne-18, a new high-power target design based on a circulating molten salt loop has been proposed. To verify some elements of the design, a static molten salt target prototype has been developed at ISOLDE and operated successfully. This paper describes the electro-thermal study of the molten salt target, taking into account the heat produced by the Joule effect, radiative heat exchange, active water cooling due to forced convection and passive air cooling due to natural convection. The numerical results...

  10. Cross-Platform Development Techniques for Mobile Devices

    Science.gov (United States)

    2013-09-01

    Mobile devices run on diverse platforms, each imposing differing constraints that the developer must adhere to; thus, extra time and resources... The devices used for testing are an iOS-based Apple iPhone 4 and an Android-based Samsung Galaxy S III. For user interface analysis this chapter also includes, from both...

  11. Mining the archives: a cross-platform analysis of gene ...

    Science.gov (United States)

    Formalin-fixed paraffin-embedded (FFPE) tissue samples represent a potentially invaluable resource for genomic research into the molecular basis of disease. However, use of FFPE samples in gene expression studies has been limited by technical challenges resulting from degradation of nucleic acids. Here we evaluated gene expression profiles derived from fresh-frozen (FRO) and FFPE mouse liver tissues using two DNA microarray protocols and two whole transcriptome sequencing (RNA-seq) library preparation methodologies. The ribo-depletion protocol outperformed the other three methods by having the highest correlations of differentially expressed genes (DEGs) and best overlap of pathways between FRO and FFPE groups. We next tested the effect of sample time in formalin (18 hours or 3 weeks) on gene expression profiles. Hierarchical clustering of the datasets indicated that test article treatment, and not preservation method, was the main driver of gene expression profiles. Meta- and pathway analyses indicated that biological responses were generally consistent for 18-hour and 3-week FFPE samples compared to FRO samples. However, clear erosion of signal intensity with time in formalin was evident, and DEG numbers differed by platform and preservation method. Lastly, we investigated the effect of age in FFPE block on genomic profiles. RNA-seq analysis of 8-, 19-, and 26-year-old control blocks using the ribo-depletion protocol resulted in comparable quality metrics, inc

  12. Free, cross-platform gRaphical software

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2006-01-01

    ...recursive graphical models, and models defined using the BUGS language. Today, there exists a wide range of packages to support the analysis of data using graphical models. Here, we focus on Open Source software, making it possible to extend the functionality by integrating these packages into more general tools. We will attempt to give an overview of the available Open Source software, with focus on the gR project. This project was launched in 2002 to make facilities in R for graphical modelling. Several R packages have been developed within the gR project, both for display and analysis of graphical models...

  13. Behind the scenes of GS: cross-platform

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    The year was 1989: the dawn of administrative computing. In a laboratory filled to the rafters with paperwork, CERN's then Director-General Carlo Rubbia saw an opportunity for a complete administrative overhaul. He established the Advanced Information Systems (AIS) project to analyse CERN's administration, which in turn suggested the Electronic Document Handling (EDH) system. By 1992, EDH was up and running - the start of a new chapter in CERN history. If you think you've never come across EDH, think again. The system is an integral part of CERN life, handling everything from the purchase of materials to leave requests. EDH sees you through your entire CERN life: from your first CERN job application to your final retirement checklist. One platform, sixty-five functions What makes EDH so special is its solitary nature: it is one platform that carries out dozens of varied functions. "Most companies organise their administration in 'vertical' ...

  14. FACTAR validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Wadsworth, S.L.; Rock, R.C.; Sills, H.E.; Langman, V.J.

    1995-01-01

    A detailed strategy to validate fuel channel thermal-mechanical behaviour codes for use in current power reactor safety analysis is presented. The strategy is derived from a validation process that has recently been adopted industry-wide. The focus of the discussion is on the validation plan for one code, FACTAR, for application in assessing fuel channel integrity safety concerns during a large-break loss-of-coolant accident (LOCA). (author)

  15. How to Conduct Multimethod Field Studies in the Operating Room: The iPad Combined With a Survey App as a Valid and Reliable Data Collection Tool.

    Science.gov (United States)

    Tscholl, David W; Weiss, Mona; Spahn, Donat R; Noethiger, Christoph B

    2016-01-05

    Tablet computers such as the Apple iPad are progressively replacing traditional paper-and-pencil-based data collection. We combined the iPad with the ready-to-use survey software, iSurvey (from Harvestyourdata), to create a straightforward tool for data collection during the Anesthesia Pre-Induction Checklist (APIC) study, a hospital-wide multimethod intervention study involving observation of team performance and team member surveys in the operating room (OR). We aimed to provide an analysis of the factors that led to the use of the iPad- and iSurvey-based tool for data collection, illustrate our experiences with the use of this data collection tool, and report the results of an expert survey about user experience with this tool. We used an iPad- and iSurvey-based tool to observe anesthesia inductions conducted by 205 teams (N=557 team members) in the OR. In Phase 1, expert raters used the iPad- and iSurvey-based tool to rate team performance during anesthesia inductions, and anesthesia team members were asked to indicate their perceptions after the inductions. In Phase 2, we surveyed the expert raters about their perceptions regarding the use of the iPad- and iSurvey-based tool to observe, rate, and survey teams in the ORs. The results of Phase 1 showed that training data collectors on the iPad- and iSurvey-based data collection tool was effortless and there were no serious problems during data collection, upload, download, and export. Interrater agreement of the combined data collection tool was found to be very high for the team observations (median Fleiss' kappa=0.88, 95% CI 0.78-1.00). The results of the follow-up expert rater survey (Phase 2) showed that the raters did not prefer a paper-and-pencil-based data collection method they had used during other earlier studies over the iPad- and iSurvey-based tool (median response 1, IQR 1-1; 1=do not agree, 2=somewhat disagree, 3=neutral, 4=somewhat agree, 5=fully agree). They found the iPad (median 5, IQR 4

  16. Applied Operations Research: Operator's Assistant

    Science.gov (United States)

    Cole, Stuart K.

    2015-01-01

    NASA operates high-value critical equipment (HVCE) that requires troubleshooting, periodic maintenance and continued monitoring by Operations staff. The complexity of HVCE, and the volume of paper documentation required to maintain and troubleshoot it to assure continued mission success, is considerable. Training on new HVCE is commensurate with the need for equipment maintenance. The LaRC Research Directorate has undertaken proactive research to support Operations staff by initiating the development and prototyping of an electronic, computer-based portable maintenance aid (Operator's Assistant). This research established a goal with multiple objectives, and a working prototype was developed. The research identified affordable solutions and constraints; demonstrated the use of commercial off-the-shelf software, the US Coast Guard maintenance solution and the NASA Procedure Representation Language; and identified computer system strategies by which these demonstrations and capabilities support the operator and maintenance. The results showed validation against measures of effectiveness, and overall the Operator's Assistant proved a substantial training and capability-sustainment tool. The research indicated that the OA could be deployed operationally at the LaRC Compressor Station with an expectation of satisfactory results, and to obtain additional lessons learned prior to deployment at other LaRC Research Directorate facilities. The research revealed projected cost and time savings.

  17. Using fractional order method to generalize strengthening generating operator buffer operator and weakening buffer operator

    OpenAIRE

    Wu, L.; Liu, S.; Yang, Yingjie

    2016-01-01

    The traditional integer-order buffer operator is extended to a fractional-order buffer operator, and the corresponding relationship between the weakening buffer operator and the strengthening buffer operator is revealed. The fractional-order buffer operator can not only generalize the weakening and strengthening buffer operators, but also allows fine adjustment of the buffer effect. The effectiveness of GM(1,1) with the fractional-order buffer operator is validated by six cases.
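    As background for this record, the classical (first-order) weakening buffer operator from grey systems theory, which the paper generalizes, can be sketched as follows; the fractional-order extension itself is not reproduced here, and the sample series is hypothetical:

```python
import numpy as np

def weakening_buffer(x):
    """Classical integer-order weakening buffer operator:
    x(k)d = (x(k) + x(k+1) + ... + x(n)) / (n - k + 1)."""
    x = np.asarray(x, dtype=float)
    return np.array([x[k:].mean() for k in range(len(x))])

raw = [120, 135, 150, 180, 210]                 # hypothetical monotone data series
print(weakening_buffer(raw))                    # first-order buffered series
print(weakening_buffer(weakening_buffer(raw)))  # applying the operator twice gives the second-order operator
```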

  18. Validation philosophy

    International Nuclear Information System (INIS)

    Vornehm, D.

    1994-01-01

    To determine when a set of calculations falls within the umbrella of existing validation documentation, it is necessary to generate a quantitative definition of the range of applicability (our definition is only qualitative) for two reasons: (1) the current trend in our regulatory environment will soon make it impossible to support the legitimacy of a validation without quantitative guidelines; and (2) in my opinion, the lack of support by DOE for further critical experiment work is directly tied to our inability to draw a quantitative "line in the sand" beyond which we will not use computer-generated values

  19. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  20. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  1. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  2. What is validation

    International Nuclear Information System (INIS)

    Clark, H.K.

    1985-01-01

    Criteria for establishing the validity of a computational method to be used in assessing nuclear criticality safety, as set forth in "American Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors," ANSI/ANS-8.1-1983, are examined and discussed. Application of the criteria is illustrated by describing the procedures followed in deriving subcritical limits that have been incorporated in the Standard

  3. Development, Validation, and Potential Enhancements to the Second-Generation Operational Aerosol Product at the National Environmental Satellite, Data, and Information Service of the National Oceanic and Atmospheric Administration

    Science.gov (United States)

    Stowe, Larry L.; Ignatov, Alexander M.; Singh, Ramdas R.

    1997-01-01

    A revised (phase 2) single-channel algorithm for retrieval of aerosol optical thickness, τ^A_SAT, over oceans from radiances in channel 1 (0.63 microns) of the Advanced Very High Resolution Radiometer (AVHRR) has been implemented at the National Oceanic and Atmospheric Administration's National Environmental Satellite Data and Information Service for the NOAA 14 satellite launched December 30, 1994. It is based on careful validation of its operational predecessor (the phase 1 algorithm), implemented for NOAA 14 in 1989. Both algorithms scale the upward satellite radiances in cloud-free conditions to aerosol optical thickness using an updated radiative transfer model of the ocean and atmosphere. Application of the phase 2 algorithm to three matchup Sun-photometer and satellite data sets, one with NOAA 9 in 1988 and two with NOAA 11 in 1989 and 1991, respectively, shows that the systematic error is less than 10%, with a random error of σ_τ ≈ 0.04. First results of τ^A_SAT retrievals from NOAA 14 using the phase 2 algorithm, and from checking its internal consistency, are presented. The potential two-channel (phase 3) algorithm for the retrieval of an aerosol size parameter, such as the Junge size distribution exponent, by adding either channel 2 (0.83 microns) from the current AVHRR instrument or a 1.6-micron channel to be available on the Tropical Rainfall Measurement Mission and the NOAA-KLM satellites by 1997, is under investigation. The possibility of using this additional information in the retrieval of a more accurate estimate of aerosol optical thickness is being explored.

  4. Terminology, Emphasis, and Utility in Validation

    Science.gov (United States)

    Kane, Michael T.

    2008-01-01

    Lissitz and Samuelsen (2007) have proposed an operational definition of "validity" that shifts many of the questions traditionally considered under validity to a separate category associated with the utility of test use. Operational definitions support inferences about how well people perform some kind of task or how they respond to some kind of…

  5. Universal Library for Building Radar Operator Interface

    Directory of Open Access Journals (Sweden)

    A. A. Karankevich

    2014-01-01

    Full Text Available The article contains the results of the development of a software library used for building software interfaces for radars being developed in the BMSTU Radioelectronic Technics Scientific and Research Institute. The library is a software application library written in C++ using the Qt and OpenGL libraries. The article describes the requirements that the library is supposed to meet, in particular cross-platform capabilities and versatility of the solution. The data types that the library uses are described. The interface elements developed are described, and some pictures of their operation are given. The article shows the main interface elements used. They are: «Matrix», which shows two-dimensional data; «Waterfall», which is used for time scanning of a specified parameter; and «Plan Position Indicator», which shows the circular scan from a surveillance radar without geometric distortions. The part «Library implementation» shows an example of a radar station interface based on this library, used in a working model of an ultrashort-pulse radar. Some results of the operation of this interface are also shown. The experiment shows the system working with two people in the field. As the people start to move, the system becomes capable of distinguishing moving targets from the stationary surface. The article shows the system operation the same way as the system operator can see it through the interface. The conclusion contains brief results of the development, the sphere of application of the software, and the prospects for further development of the library.

  6. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  7. Experimental validation of a kinetic multi-component mechanism in a wide HCCI engine operating range for mixtures of n-heptane, iso-octane and toluene: Influence of EGR parameters

    International Nuclear Information System (INIS)

    Machrafi, Hatim

    2008-01-01

    The parameters that are present in exhaust gas recirculation (EGR) are believed to provide an important contribution to control the auto-ignition process of the homogeneous charge compression ignition (HCCI) in an engine. For the investigation of the behaviour of the auto-ignition process, a kinetic multi-component mechanism has been developed in former work, containing 62 reactions and 49 species for mixtures of n-heptane, iso-octane and toluene. This paper presents an experimental validation of this mechanism, comparing the calculated pressure, heat release, ignition delays and CO2 emissions with experimental data performed on a HCCI engine. The validation is performed in a broad range of EGR parameters by varying the dilution by N2 and CO2 from 0 to 46 vol.%, changing the EGR temperature from 30 to 120 °C, altering the addition of CO and NO from 0 to 170 ppmv and varying the addition of CH2O from 0 to 1400 ppmv. These validations were performed respecting the HCCI conditions for the inlet temperature and the equivalence ratio. The results showed that the mechanism is validated experimentally in dilution ranges going up to 21-30 vol.%, depending on the species of dilution and over the whole range of the EGR temperature. The mechanism is validated over the whole range of CO and CH2O addition. As for the addition of NO, the mechanism is validated quantitatively up to 50 ppmv and qualitatively up to 170 ppmv

  8. MARS Validation Plan and Status

    International Nuclear Information System (INIS)

    Ahn, Seung-hoon; Cho, Yong-jin

    2008-01-01

    The KINS Reactor Thermal-hydraulic Analysis System (KINS-RETAS) under development is directed toward a realistic analysis approach of best-estimate (BE) codes and realistic assumptions. In this system, MARS is pivoted to provide the BE Thermal-Hydraulic (T-H) response in core and reactor coolant system to various operational transients and accidental conditions. As required for other BE codes, the qualification is essential to ensure reliable and reasonable accuracy for a targeted MARS application. Validation is a key element of the code qualification, and determines the capability of a computer code in predicting the major phenomena expected to occur. The MARS validation was made by its developer KAERI, on basic premise that its backbone code RELAP5/MOD3.2 is well qualified against analytical solutions, test or operational data. A screening was made to select the test data for MARS validation; some models transplanted from RELAP5, if already validated and found to be acceptable, were screened out from assessment. It seems to be reasonable, but does not demonstrate whether code adequacy complies with the software QA guidelines. Especially there may be much difficulty in validating the life-cycle products such as code updates or modifications. This paper presents the plan for MARS validation, and the current implementation status

  9. Experimental validation of a kinetic multi-component mechanism in a wide HCCI engine operating range for mixtures of n-heptane, iso-octane and toluene: Influence of EGR parameters

    Energy Technology Data Exchange (ETDEWEB)

    Machrafi, Hatim [LGPPTS, Ecole Nationale Superieure de Chimie de Paris/ Universite Pierre et Marie Curie (Paris 6), 11, rue de Pierre et Marie Curie, 75231 Paris Cedex 05 (France)

    2008-11-15

    The parameters that are present in exhaust gas recirculation (EGR) are believed to provide an important contribution to control the auto-ignition process of the homogeneous charge compression ignition (HCCI) in an engine. For the investigation of the behaviour of the auto-ignition process, a kinetic multi-component mechanism has been developed in former work, containing 62 reactions and 49 species for mixtures of n-heptane, iso-octane and toluene. This paper presents an experimental validation of this mechanism, comparing the calculated pressure, heat release, ignition delays and CO2 emissions with experimental data performed on a HCCI engine. The validation is performed in a broad range of EGR parameters by varying the dilution by N2 and CO2 from 0 to 46 vol.%, changing the EGR temperature from 30 to 120 °C, altering the addition of CO and NO from 0 to 170 ppmv and varying the addition of CH2O from 0 to 1400 ppmv. These validations were performed respecting the HCCI conditions for the inlet temperature and the equivalence ratio. The results showed that the mechanism is validated experimentally in dilution ranges going up to 21-30 vol.%, depending on the species of dilution and over the whole range of the EGR temperature. The mechanism is validated over the whole range of CO and CH2O addition. As for the addition of NO, the mechanism is validated quantitatively up to 50 ppmv and qualitatively up to 170 ppmv. (author)

  10. The operational criterion and its validity of remission of major depressive disorder

    Institute of Scientific and Technical Information of China (English)

    曹瑞想; 马辉; 杨华; 杨娟; 罗厚员; 曲海涛; 张宁

    2014-01-01

    Objective: To explore the optimal cutoff point on the 17-item Hamilton Depression Scale (HAMD17) defining remission of major depressive disorder (MDD), and to examine the validity of this cutoff point. Method: We interviewed 251 MDD responders, whose symptoms had markedly improved after treatment, with the HAMD17, the Global Assessment of Functioning and the Generic Quality of Life Inventory, and analysed the cutoff value using receiver operating characteristic curves. Results: After acute-phase treatment for MDD, even when the HAMD17 score was 7 or less, 92.3% (203/220) of remitters had at least one subthreshold depressive symptom and 48.6% (107/220) also experienced impaired psychosocial functioning; when psychosocial functioning is taken into account, a cutoff of ≤3 might be the operational criterion of MDD remission. Compared with MDD patients scoring 4-14 on the HAMD17, patients scoring 0-3 had significantly better psychological, social and physical functioning (60.1 ± 15.4 vs 52.0 ± 12.3, 62.0 ± 11.2 vs 54.0 ± 11.0, and 60.5 ± 14.3 vs 52.7 ± 10.2, respectively; t = 3.307, 4.119 and 3.626, all P < 0.01), and a higher proportion of them had regained normal psychosocial functioning (74.2%, 92/124 vs 12.6%, 16/127; χ² = 97.103, P < 0.01). Conclusion: After acute-phase treatment, MDD patients scoring 0-3 on the HAMD17 have better psychosocial functioning; patients scoring above 3 may still have considerable subthreshold symptoms, which deserve clinical attention.
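    A minimal sketch of the receiver-operating-characteristic cutoff analysis described in this record, using scikit-learn and entirely synthetic scores and outcomes (the real study data are not reproduced; the Youden index is one common way to pick the optimal threshold):

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Hypothetical HAMD17 scores and a binary indicator of recovered psychosocial functioning
hamd17 = rng.integers(0, 15, size=250)
recovered = (hamd17 + rng.normal(0, 3, size=250) < 4).astype(int)

# Lower HAMD17 should predict recovery, so the score is negated for roc_curve
fpr, tpr, thresholds = roc_curve(recovered, -hamd17)
youden = tpr - fpr                          # Youden's J statistic at each threshold
best = thresholds[np.argmax(youden)]
print("optimal cutoff: HAMD17 <=", -best)
```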

  11. Construct Validity and Case Validity in Assessment

    Science.gov (United States)

    Teglasi, Hedwig; Nebbergall, Allison Joan; Newman, Daniel

    2012-01-01

    Clinical assessment relies on both "construct validity", which focuses on the accuracy of conclusions about a psychological phenomenon drawn from responses to a measure, and "case validity", which focuses on the synthesis of the full range of psychological phenomena pertaining to the concern or question at hand. Whereas construct validity is…

  12. 78 FR 5866 - Pipeline Safety: Annual Reports and Validation

    Science.gov (United States)

    2013-01-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0319] Pipeline Safety: Annual Reports and Validation AGENCY: Pipeline and Hazardous Materials... 2012 gas transmission and gathering annual reports, remind pipeline owners and operators to validate...

  13. PIV Data Validation Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
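    Capability (1) above, removal of spurious vectors, is commonly implemented as a local median outlier test on each velocity component; the sketch below is a generic illustration of that idea, not the NASA package itself, and the 3x3 window and threshold are assumptions:

```python
import numpy as np

def flag_spurious_vectors(u, threshold=2.0):
    """Median-based outlier test for one velocity component on a regular PIV
    grid (interior points only). Flagged vectors would then be replaced by
    interpolation from valid neighbours."""
    flagged = np.zeros_like(u, dtype=bool)
    ny, nx = u.shape
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            neigh = np.delete(u[j-1:j+2, i-1:i+2].ravel(), 4)   # 8 neighbours, centre removed
            med = np.median(neigh)
            mad = np.median(np.abs(neigh - med)) + 1e-6          # robust scatter estimate
            flagged[j, i] = abs(u[j, i] - med) > threshold * mad
    return flagged
```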

  14. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
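    The record does not reproduce the quantitative method itself; as a placeholder illustration only, one simple figure of merit that allows competing codes to be ranked is the root-mean-square relative deviation between prediction and experiment:

```python
import numpy as np

def rms_relative_deviation(predicted, measured):
    """One possible quantitative figure of merit for code-versus-experiment
    comparison (not necessarily the method described in the record)."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return np.sqrt(np.mean(((predicted - measured) / measured) ** 2))

# Hypothetical data: rank two codes against the same measurements
measured = [1.00, 2.10, 3.05]
for name, pred in {"code A": [1.05, 2.00, 3.20], "code B": [0.90, 2.40, 2.80]}.items():
    print(name, rms_relative_deviation(pred, measured))
```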

  15. Spatial Operations

    Directory of Open Access Journals (Sweden)

    Anda VELICANU

    2010-09-01

    Full Text Available This paper contains a brief description of the most important operations that can be performed on spatial data such as spatial queries, create, update, insert, delete operations, conversions, operations on the map or analysis on grid cells. Each operation has a graphical example and some of them have code examples in Oracle and PostgreSQL.

  16. Operational amplifiers

    CERN Document Server

    Dostal, Jiri

    1993-01-01

    This book provides the reader with the practical knowledge necessary to select and use operational amplifier devices. It presents an extensive treatment of applications and a practically oriented, unified theory of operational circuits.

  17. Development and Experimental Validation of Large Eddy Simulation Techniques for the Prediction of Combustion-Dynamic Process in Syngas Combustion: Characterization of Autoignition, Flashback, and Flame-Liftoff at Gas-Turbine Relevant Operating Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Ihme, Matthias [Univ. of Michigan, Ann Arbor, MI (United States); Driscoll, James [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-08-31

    The objective of this closely coordinated experimental and computational research effort is the development of simulation techniques for the prediction of combustion processes, relevant to the oxidation of syngas and high hydrogen content (HHC) fuels at gas-turbine relevant operating conditions. Specifically, the research goals are (i) the characterization of the sensitivity of syngas ignition processes to hydrodynamic processes and perturbations in temperature and mixture composition in rapid compression machines and flow-reactors and (ii) to conduct comprehensive experimental investigations in a swirl-stabilized gas turbine (GT) combustor under realistic high-pressure operating conditions in order (iii) to obtain fundamental understanding about mechanisms controlling unstable flame regimes in HHC-combustion.

  18. Construction of vertex operators using operator formalism techniques

    International Nuclear Information System (INIS)

    Gato, B.; Massachusetts Inst. of Tech., Cambridge

    1989-01-01

    We derive vertex operators in oscillator form as an application of the conserved charges method developed by Vafa for the operator formalism in higher genus Riemann surfaces. This construction proves to be clear, direct and valid for the bosonic and fermionic strings as well as for twisted strings on orbifolds. We discuss the method and construct vertex operators for the bosonic string moving on Z_N orbifolds and for the fermionic string in the NSR formulation. (orig.)

  19. Operating systems

    CERN Document Server

    Tsichritzis, Dionysios C; Rheinboldt, Werner

    1974-01-01

    Operating Systems deals with the fundamental concepts and principles that govern the behavior of operating systems. Many issues regarding the structure of operating systems, including the problems of managing processes, processors, and memory, are examined. Various aspects of operating systems are also discussed, from input-output and files to security, protection, reliability, design methods, performance evaluation, and implementation methods.Comprised of 10 chapters, this volume begins with an overview of what constitutes an operating system, followed by a discussion on the definition and pr

  20. Operational calculus

    CERN Document Server

    Boehme, Thomas K

    1987-01-01

    Operational Calculus, Volume II is a methodical presentation of operational calculus. An outline of the general theory of linear differential equations with constant coefficients is presented. Integral operational calculus and advanced topics in operational calculus, including locally integrable functions and convergence in the space of operators, are also discussed. Formulas and tables are included.Comprised of four sections, this volume begins with a discussion on the general theory of linear differential equations with constant coefficients, focusing on such topics as homogeneous and non-ho

  1. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  2. Your Lung Operation: After Your Operation

    Medline Plus


  3. Spent Nuclear Fuel (SNF) Process Validation Technical Support Plan

    Energy Technology Data Exchange (ETDEWEB)

    SEXTON, R.A.

    2000-03-13

    The purpose of Process Validation is to confirm that nominal process operations are consistent with the expected process envelope. The Process Validation activities described in this document are not part of the safety basis, but are expected to demonstrate that the process operates well within the safety basis. Some adjustments to the process may be made as a result of information gathered in Process Validation.

  4. Spent Nuclear Fuel (SNF) Process Validation Technical Support Plan

    International Nuclear Information System (INIS)

    SEXTON, R.A.

    2000-01-01

    The purpose of Process Validation is to confirm that nominal process operations are consistent with the expected process envelope. The Process Validation activities described in this document are not part of the safety basis, but are expected to demonstrate that the process operates well within the safety basis. Some adjustments to the process may be made as a result of information gathered in Process Validation

  5. Spacecraft operations

    CERN Document Server

    Sellmaier, Florian; Schmidhuber, Michael

    2015-01-01

    The book describes the basic concepts of spaceflight operations, for both, human and unmanned missions. The basic subsystems of a space vehicle are explained in dedicated chapters, the relationship of spacecraft design and the very unique space environment are laid out. Flight dynamics are taught as well as ground segment requirements. Mission operations are divided into preparation including management aspects, execution and planning. Deep space missions and space robotic operations are included as special cases. The book is based on a course held at the German Space Operation Center (GSOC).

  6. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction-, classification-, time series forecasting-, modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allows insight......) is critically necessary for the inclusion of the sampling errors incurred in all 'future' situations in which the validated model must perform. Logically, therefore, all one data set re-sampling approaches for validation, especially cross-validation and leverage-corrected validation, should be terminated...

  7. VAlidation STandard antennas: Past, present and future

    DEFF Research Database (Denmark)

    Drioli, Luca Salghetti; Ostergaard, A; Paquay, M

    2011-01-01

    designed for validation campaigns of antenna measurement ranges. The driving requirements of VAST antennas are their mechanical stability over a given operational temperature range and with respect to any orientation of the gravity field. The mechanical design shall ensure extremely stable electrical....../V-band of telecom satellites. The paper will address requirements for future VASTs and possible architecture for multi-frequency Validation Standard antennas....

  8. Operator substitution

    NARCIS (Netherlands)

    Hautus, M.L.J.

    1994-01-01

    Substitution of an operator into an operator-valued map is defined and studied. A Bezout-type remainder theorem is used to derive a number of results. The tensor map is used to formulate solvability conditions for linear matrix equations. Some applications to system theory are given, in particular

  9. Operation amplifier

    NARCIS (Netherlands)

    Tetsuya, Saito; Nauta, Bram

    2008-01-01

    To provide an operational amplifier which improves the power supply rejection ratio while assuring phase compensation characteristics, and which can therefore be realized with a small-scale circuit and low power consumption. SOLUTION: The operational amplifier comprises: a differential amplifier circuit 1;

  10. Operation Amplifier

    NARCIS (Netherlands)

    Tetsuya, Saito; Nauta, Bram

    2011-01-01

    PROBLEM TO BE SOLVED: To provide an operational amplifier which improves the power supply rejection ratio while assuring phase compensation characteristics, and which can therefore be realized with a small-scale circuit and low power consumption. SOLUTION: The operational amplifier comprises: a

  11. Operation Amplifier

    NARCIS (Netherlands)

    Tetsuya, S.; Nauta, Bram

    2007-01-01

    PROBLEM TO BE SOLVED: To provide an operational amplifier which improves the power supply rejection ratio while assuring phase compensation characteristics, and which can therefore be realized with a small-scale circuit and low power consumption. SOLUTION: The operational amplifier comprises: a

  12. Accelerator operations

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    Operations of the SuperHILAC, the Bevatron/Bevalac, and the 184-inch Synchrocyclotron during the period from October 1977 to September 1978 are discussed. These include ion source development, accelerator facilities, the Heavy Ion Spectrometer System, and Bevalac biomedical operations

  13. SMAP RADAR Calibration and Validation

    Science.gov (United States)

    West, R. D.; Jaruwatanadilok, S.; Chaubel, M. J.; Spencer, M.; Chan, S. F.; Chen, C. W.; Fore, A.

    2015-12-01

    The Soil Moisture Active Passive (SMAP) mission launched on Jan 31, 2015. The mission employs L-band radar and radiometer measurements to estimate soil moisture with 4% volumetric accuracy at a resolution of 10 km, and freeze-thaw state at a resolution of 1-3 km. Immediately following launch, there was a three month instrument checkout period, followed by six months of level 1 (L1) calibration and validation. In this presentation, we will discuss the calibration and validation activities and results for the L1 radar data. Early SMAP radar data were used to check commanded timing parameters, and to work out issues in the low- and high-resolution radar processors. From April 3-13 the radar collected receive only mode data to conduct a survey of RFI sources. Analysis of the RFI environment led to a preferred operating frequency. The RFI survey data were also used to validate noise subtraction and scaling operations in the radar processors. Normal radar operations resumed on April 13. All radar data were examined closely for image quality and calibration issues which led to improvements in the radar data products for the beta release at the end of July. Radar data were used to determine and correct for small biases in the reported spacecraft attitude. Geo-location was validated against coastline positions and the known positions of corner reflectors. Residual errors at the time of the beta release are about 350 m. Intra-swath biases in the high-resolution backscatter images are reduced to less than 0.3 dB for all polarizations. Radiometric cross-calibration with Aquarius was performed using areas of the Amazon rain forest. Cross-calibration was also examined using ocean data from the low-resolution processor and comparing with the Aquarius wind model function. Using all a-priori calibration constants provided good results with co-polarized measurements matching to better than 1 dB, and cross-polarized measurements matching to about 1 dB in the beta release. During the

  14. Plant monitoring and signal validation at HFIR

    International Nuclear Information System (INIS)

    Mullens, J.A.

    1991-01-01

    This paper describes a monitoring system for the Oak Ridge National Laboratory's (ORNL's) High Flux Isotope Reactor (HFIR). HFIR is an 85 MW pressurized water reactor designed to produce isotopes and intense neutron beams. The monitoring system is described with respect to plant signals and computer system; monitoring overview; data acquisition, logging and network distribution; signal validation; status displays; reactor condition monitoring; reactor operator aids. Future work will include the addition of more plant signals, more signal validation and diagnostic capabilities, improved status display, integration of the system with the RELAP plant simulation and graphical interface, improved operator aids, and an alarm filtering system. 8 refs., 7 figs. (MB)

  15. Content validation applied to job simulation and written examinations

    International Nuclear Information System (INIS)

    Saari, L.M.; McCutchen, M.A.; White, A.S.; Huenefeld, J.C.

    1984-08-01

    The application of content validation strategies in work settings has become increasingly popular over the last few years, perhaps spurred by an acknowledgment in the courts of content validation as a method for validating employee selection procedures (e.g., Bridgeport Guardians v. Bridgeport Police Dept., 1977). Since criterion-related validation is often difficult to conduct, content validation methods should be investigated as an alternative for determining job-related selection procedures. However, there is not yet consensus among scientists and professionals concerning how content validation should be conducted. This may be because there is a lack of clear-cut operations for conducting content validation for different types of selection procedures. The purpose of this paper is to discuss two content validation approaches being used for the development of a licensing examination that involves a job simulation exam and a written exam. These represent variations in methods for applying content validation. 12 references

  16. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider their control, calibration, and experimentation.The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to the control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  17. Operator programs and operator processes

    NARCIS (Netherlands)

    Bergstra, J.A.; Walters, P.

    2003-01-01

    We define a notion of program which is not a computer program but an operator program: a detailed description of actions performed and decisions taken by a human operator (computer user) performing a task to achieve a goal in a simple setting consisting of that user, one or more computers and a

  18. Validity in Qualitative Evaluation

    OpenAIRE

    Vasco Lub

    2015-01-01

    This article provides a discussion on the question of validity in qualitative evaluation. Although validity in qualitative inquiry has been widely reflected upon in the methodological literature (and is still often subject of debate), the link with evaluation research is underexplored. Elaborating on epistemological and theoretical conceptualizations by Guba and Lincoln and Creswell and Miller, the article explores aspects of validity of qualitative research with the explicit objective of con...

  19. Operator theory

    CERN Document Server

    2015-01-01

    A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.

  20. Operation Starvation

    National Research Council Canada - National Science Library

    Mason, Gerald

    2002-01-01

    More than 1,250,000 tons of shipping was sunk or damaged in the last five months of World War II when Twenty-first Bomber Command executed an aerial mining campaign against Japan known as Operation STARVATION...

  1. Peace Operations

    National Research Council Canada - National Science Library

    Proks, Josef

    2000-01-01

    Peace operations are more and more important in the contemporary world. The end of the Cold War increased not only the possibilities for solving disputes by the international community but also the number and diversity of threats and issues...

  2. Operator training

    International Nuclear Information System (INIS)

    Wirstad, J.

    1983-12-01

    The traditional operator job is changing, which among other things has generated a need for better job training. Surprisingly, increased process automation has led to increased operator qualifications, i.e. basic job training but also update and rehearsal training within certain fixed intervals. There are several similar models for instructional system development available in the literature. One model which is of special interest integrates Operator Training development and Man-Machine Interfaces development. The extent to which Systematic Operator Training has been implemented varies between branches and companies. The nuclear power branch is given as an example in the report. This branch probably represents something better than the average among the process industries. (author)

  3. Operative Links

    DEFF Research Database (Denmark)

    Wistoft, Karen; Højlund, Holger

    2012-01-01

    educational goals, learning content, or value clarification. Health pedagogy is often a matter of retrospective rationalization rather than the starting point of planning. Health and risk behaviour approaches override health educational approaches. Conclusions: Operational links between health education......, health professionalism, and management strategies pose the foremost challenge. Operational links indicate cooperative levels that facilitate a creative and innovative effort across traditional professional boundaries. It is proposed that such links are supported by network structures, shared semantics...

  4. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)
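    As a rough illustration of what a polarization-curve comparison involves (not the authors' PEMFC model), the sketch below evaluates a common empirical cell-voltage expression with activation, ohmic and concentration loss terms, generates hypothetical "measured" points, and scores the agreement by RMSE; all parameter names and values are illustrative assumptions.

```python
# Minimal sketch (illustrative empirical model, NOT the authors' PEMFC model):
# cell voltage V(i) with activation, ohmic and concentration loss terms,
# compared against hypothetical "measured" points via RMSE.
import numpy as np

def polarization_curve(i, e_oc=1.0, a=0.05, i0=1e-3, r=0.15, m=2e-4, n=3.0):
    """Cell voltage [V] vs current density i [A/cm^2]; all parameters are illustrative."""
    activation = a * np.log(i / i0)        # activation (kinetic) loss
    ohmic = r * i                          # ohmic loss
    concentration = m * np.exp(n * i)      # mass-transport loss
    return e_oc - activation - ohmic - concentration

i = np.linspace(0.05, 1.2, 12)                                   # current densities
measured_v = polarization_curve(i) + np.random.default_rng(2).normal(0, 0.01, i.size)
model_v = polarization_curve(i)
power_density = model_v * i                                      # power curve [W/cm^2]
rmse = np.sqrt(np.mean((model_v - measured_v) ** 2))
print(f"polarization-curve RMSE: {rmse:.3f} V")
```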

  5. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  6. NVN 5694 intralaboratory validation. Feasibility study for interlaboratory validation

    International Nuclear Information System (INIS)

    Voors, P.I.; Baard, J.H.

    1998-11-01

    Within the project NORMSTAR 2 a number of Dutch prenormative protocols have been defined for radioactivity measurements. Some of these protocols, e.g. the Dutch prenormative protocol NVN 5694, titled Methods for radiochemical determination of polonium-210 and lead-210, have not been validated, either by intralaboratory or by interlaboratory studies. Validation studies are conducted within the framework of the programme 'Normalisatie and Validatie van Milieumethoden 1993-1997' (Standardization and Validation of test methods for environmental parameters) of the Dutch Ministry of Housing, Physical Planning and the Environment (VROM). The aims of this study were (a) a critical evaluation of the protocol, (b) an investigation of the feasibility of an interlaboratory study, and (c) the interlaboratory validation of NVN 5694. The evaluation of the protocol resulted in a list of deficiencies varying from missing references to incorrect formulae. From the survey by interview it appeared that for each type of material there are 4 to 7 laboratories willing to participate in an interlaboratory validation study. This reflects the situation in 1997. Consequently, if 4 or 6 (the minimal number) laboratories participate and each laboratory analyses 3 subsamples, the uncertainty in the repeatability standard deviation is 49 or 40%, respectively. If the ratio of the reproducibility standard deviation to the repeatability standard deviation is equal to 1 or 2, then the uncertainty in the reproducibility standard deviation increases from 42 to 67% and from 34 to 52% for 4 or 6 laboratories, respectively. The intralaboratory validation was established on four different types of materials. Three types of materials (milk powder, condensate and filter) were prepared in the laboratory using the raw material and certified Pb-210 solutions, and one (sediment) was obtained from the IAEA. The ECN-prepared reference materials were used after testing for homogeneity. The pre-normative protocol can

  7. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  8. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  9. Operating experience

    International Nuclear Information System (INIS)

    McRae, L.P.; Six, D.E.

    1991-01-01

    In 1987, Westinghouse Hanford Company began operating a first-generation integrated safeguards system in the Plutonium Finishing Plant storage vaults. This Vault Safety and Inventory System is designed to integrate data into a computer-based nuclear material inventory monitoring system. The system gathers, in real time, measured physical parameters that generate nuclear material inventory status data for thousands of stored items and sends tailored reports to the appropriate users. These data include canister temperature and bulge data reported to Plant Operations and Material Control and Accountability personnel, item presence and identification data reported to Material Control and Accountability personnel, and unauthorized item movement data reported to Security response forces and Material Control and Accountability personnel. The Westinghouse Hanford Company's experience and operational benefits in using this system to reduce radiation exposure, increase protection against insider threat, and provide real-time inventory control are discussed in this paper

  10. Operator companion

    International Nuclear Information System (INIS)

    Natalizio, A.; Anderson, J.W.D.; Sills, H.E.

    1988-01-01

    Abundant, cheap computing power has provided industry with a far greater opportunity than was available one or two decades ago to automate industrial processes and to improve the man-machine interface. Exciting innovations in knowledge representation methods arising from artificial intelligence research pave the way for advanced support systems for assisting plant operators. AECL has recognized the importance of knowledge based system technology, particularly expert systems, in the achievement of this objective and also, as a strategic technology to be fully exploited in the next generation of CANDU reactors. Operator Companion, an expert system intended to diagnose plant faults and advise the operator on appropriate restoring or corrective actions, is a major undertaking which is receiving support within the research and engineering groups of AECL

  11. Operative arthroscopy.

    Science.gov (United States)

    Guhl, J F

    1979-01-01

    In a period of 20 months, over 200 patients (age ranged from high school students to middle-aged persons) with knee injuries were treated by operative arthroscopy. The majority of the injuries were incurred while the patients had been participating in athletic events, either competitive or recreational. Operative arthroscopy offers the advantage of shortened hospital stay, rapid rehabilitation, lack of disfiguring scar, and reduced costs. Patients are followed yearly after the first postoperative year. Improved long-term results from diagnostic and operative arthroscopy, as compared to conventional surgical procedures, are expected. The proof of those expectations will be determined in the next several years as this group of patients requiring partial meniscectomies or procedures for pathologic and degenerative conditions is reevaluated.

  12. Operational Analysis on Torpedo Defence

    NARCIS (Netherlands)

    Grootendorst, H.J.; Benders, F.P.A.; Fitski, H.J.; Veldhoven, E.R. van

    2007-01-01

    Since 1998, TNO Defence, Security and Safety has performed operational analysis with the Underwater Warfare Testbed, which provides an environment for evaluation and validation of systems, concepts, and tactics. On top of this testbed the Torpedo Defence System TestBed has been built to simulate

  13. 3D GIS spatial operation based on extended Euler operators

    Science.gov (United States)

    Xu, Hongbo; Lu, Guonian; Sheng, Yehua; Zhou, Liangchen; Guo, Fei; Shang, Zuoyan; Wang, Jing

    2008-10-01

    At present, implementations of 3-dimensional spatial operations based on particular data structures lack universality and cannot treat non-manifold cases. The ISO/DIS 19107 standard only presents definitions of Boolean and set operators for topological relationship queries, and OGC GeoXACML gives formal definitions of several set functions without implementation details. To address these problems, taking cell complex theory as the mathematical foundation, supported by a non-manifold data structure and drawing on research in non-manifold geometric modeling, this paper first constructs, according to the non-manifold Euler-Poincaré formula, 6 extended Euler operators and their inverse operators for creating, updating and deleting 3D spatial elements, as well as several pairs of supplementary Euler operators that make advanced functions easier to implement. Secondly, the topological element operation sequences of the Boolean operations, set operations and set functions defined in GeoXACML are re-expressed as combinations of extended Euler operators, which decouples the upper-level functions from the lower-level data structure. Lastly, an underground 3D GIS prototype system is developed, in which the practicability and credibility of the extended Euler operators presented in this paper are validated.
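    To make the operator idea concrete, the sketch below is a minimal, assumption-level illustration (not the paper's six non-manifold operators): it tracks entity counts while applying three classic Euler operators and checks that the manifold Euler-Poincaré relation V - E + F = 2(S - G) is preserved, for a genus-0 shell without inner loops.

```python
# Minimal sketch (NOT the paper's six non-manifold operators): classic Euler
# operators expressed as counter updates, with a check that the Euler-Poincare
# relation V - E + F = 2(S - G) is preserved (genus 0, no inner loops).
from dataclasses import dataclass

@dataclass
class EulerCounts:
    V: int = 0  # vertices
    E: int = 0  # edges
    F: int = 0  # faces
    S: int = 0  # shells
    G: int = 0  # genus (through-holes)

    def euler_poincare_holds(self) -> bool:
        return self.V - self.E + self.F == 2 * (self.S - self.G)

def mvfs(c: EulerCounts) -> None:
    """Make vertex, face, shell: seed a new minimal shell."""
    c.V += 1; c.F += 1; c.S += 1

def mev(c: EulerCounts) -> None:
    """Make edge, vertex: grow the wireframe by one edge and one vertex."""
    c.V += 1; c.E += 1

def mef(c: EulerCounts) -> None:
    """Make edge, face: close a loop, splitting off a new face."""
    c.E += 1; c.F += 1

c = EulerCounts()
mvfs(c)                      # start a shell
for _ in range(3):
    mev(c)                   # add three vertices/edges
mef(c)                       # close them into a triangular face
assert c.euler_poincare_holds()
print(c)
```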

  14. Position Error Covariance Matrix Validation and Correction

    Science.gov (United States)

    Frisbee, Joe, Jr.

    2016-01-01

    In order to calculate operationally accurate collision probabilities, the position error covariance matrices predicted at times of closest approach must be sufficiently accurate representations of the position uncertainties. This presentation will discuss why the Gaussian distribution is a reasonable expectation for the position uncertainty and how this assumed distribution type is used in the validation and correction of position error covariance matrices.
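    As an illustrative aside (not an operational conjunction-assessment tool), the sketch below shows why the covariance matters: it estimates a collision probability by Monte Carlo, sampling the relative miss vector from an assumed Gaussian with a given encounter-plane covariance and counting samples inside a combined hard-body radius; the miss vector, covariance and radius values are hypothetical.

```python
# Minimal sketch (illustrative, not an operational conjunction-assessment tool):
# Monte Carlo collision probability assuming a Gaussian relative-position error
# in the encounter plane; miss vector, covariance and radius are hypothetical.
import numpy as np

def collision_probability(miss, covariance, hard_body_radius, n=200_000, seed=0):
    """Fraction of Gaussian samples of the relative position inside the combined radius."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(miss, covariance, size=n)
    return float(np.mean(np.linalg.norm(samples, axis=1) < hard_body_radius))

miss = np.array([120.0, 60.0])                    # encounter-plane miss distance [m]
cov = np.array([[400.0, 50.0],
                [50.0, 900.0]])                   # position error covariance [m^2]
print(collision_probability(miss, cov, hard_body_radius=20.0))
```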

  15. Kursk Operation Simulation and Validation Exercise - Phase II (KOSAVE II)

    National Research Council Canada - National Science Library

    Bauman, Walter

    1998-01-01

    (KOSAVE) Study (KOSAVE II) documents, in this report, a statistical record of the Kursk battle, as represented in the KDB, for use both as a standalone descriptive record for historians and as a baseline for a subsequent Phase...

  16. Validation of Virtual Environments Incorporating Virtual Operators for Procedural Learning

    Science.gov (United States)

    2012-09-01


  17. Accelerator operations

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    This section is concerned with the operation of both the tandem-linac system and the Dynamitron, two accelerators that are used for entirely different research. Developmental activities associated with the tandem and the Dynamitron are also treated here, but developmental activities associated with the superconducting linac are covered separately because this work is a program of technology development in its own right

  18. Operating Systems

    Indian Academy of Sciences (India)

    areas in which this type is useful are multimedia, virtual reality, and advanced scientific projects such as undersea exploration and planetary rovers. Because of the expanded uses for soft real-time functionality, it is finding its way into most current operating systems, including major versions of Unix and Windows NT OS.

  19. Peace Operations

    National Research Council Canada - National Science Library

    Lewis, William

    1995-01-01

    .... Indeed, despite the energetic leadership of Under Secretary-General Kofi R. Annan who directs the Department of Peacekeeping Operations, the organization has increasing difficulty in acquiring properly trained and equipped forces in time to intervene in conflict situations and humanitarian crises.

  20. Operational Circulars

    CERN Multimedia

    2003-01-01

    Operational Circular N° 4 - April 2003 Conditions for use by members of the CERN personnel of vehicles belonging to or rented by CERN - This circular has been drawn up. Operational Circular N° 5 - October 2000 Use of CERN computing facilities - Further details on the personal use of CERN computing facilities Operational Circular N° 5 and its Subsidiary Rules http://cern.ch/ComputingRules defines the rules for the use of CERN computing facilities. One of the basic principles governing such use is that it must come within the professional duties of the user concerned, as defined by the user's divisional hierarchy. However, personal use of the computing facilities is tolerated or allowed provided : a) It is in compliance with Operational Circular N° 5 and not detrimental to official duties, including those of other users; b) the frequency and duration is limited and there is a negligible use of CERN resources; c) it does not constitute a political, commercial and/or profit-making activity; d) it is not...

  1. Operation Context

    DEFF Research Database (Denmark)

    Stüben, Henning; Tietjen, Anne

    2006-01-01

    Abstract: This paper seeks to challenge the notion of context from an operational perspective. Can we grasp the forces that shape the complex conditions for an architectural or urban design within the notion of context? By shifting the gaze towards the agency of architecture, contextual analysis...

  2. Operational indicators

    International Nuclear Information System (INIS)

    2010-01-01

    The chapter presents the operational indicators related to budget, travel costs and tickets, the evolution of the annual program for regulatory inspection, the scientific production, requested patents and the numbers related to the production of the services offered by the Institution

  3. Regulatory perspectives on human factors validation

    International Nuclear Information System (INIS)

    Harrison, F.; Staples, L.

    2001-01-01

    Validation is an important avenue for controlling the genesis of human error, and thus managing loss, in a human-machine system. Since there are many ways in which error may intrude upon system operation, it is necessary to consider the performance-shaping factors that could introduce error and compromise system effectiveness. Validation works to this end by examining, through objective testing and measurement, the newly developed system, procedure or staffing level, in order to identify and eliminate those factors which may negatively influence human performance. It is essential that validation be done in a high-fidelity setting, in an objective and systematic manner, using appropriate measures, if meaningful results are to be obtained. In addition, inclusion of validation work in any design process can be seen as contributing to a good safety culture, since such activity allows licensees to eliminate elements which may negatively impact human behaviour. (author)

  4. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  5. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  6. Validity in Qualitative Evaluation

    Directory of Open Access Journals (Sweden)

    Vasco Lub

    2015-12-01

    Full Text Available This article provides a discussion on the question of validity in qualitative evaluation. Although validity in qualitative inquiry has been widely reflected upon in the methodological literature (and is still often subject of debate), the link with evaluation research is underexplored. Elaborating on epistemological and theoretical conceptualizations by Guba and Lincoln and Creswell and Miller, the article explores aspects of validity of qualitative research with the explicit objective of connecting them with aspects of evaluation in social policy. It argues that different purposes of qualitative evaluations can be linked with different scientific paradigms and perspectives, thus transcending unproductive paradigmatic divisions as well as providing a flexible yet rigorous validity framework for researchers and reviewers of qualitative evaluations.

  7. Site operations

    International Nuclear Information System (INIS)

    House, W.B.; Ebenhack, D.G.

    1989-01-01

    This chapter is a discussion of the management and operations practices used at the Barnwell Waste Management Facility in Barnwell, SC. The following topics are discussed: (1) Waste receiving and inspection, including manifest and certificates of compliance, radiological surveys, disposition of nonconforming items, and decontamination and disposition of secondary waste streams; (2) Waste disposal, including Title 10 CFR 61 requirements, disposal area evaluations, shipment offloading, container emplacement, and radiation protection; (3) Trench closure, including trench backfilling, trench capping, and permanent markers; (4) Site maintenance and stabilization, including trench maintenance, surface water management, and site closure activities; (5) Site monitoring programs, including operational monitoring, and environmental monitoring program; (6) Personnel training and qualifications, including basic training program, safety training program, special skills training, and physical qualifications; (7) Records management, including waste records, personnel training records, personnel dosimetry records, site monitoring records, trench qualification and construction records, and site drawings and stabilization records; (8) Site security; (9) Emergency response plans; and (10) Quality assurance

  8. Operators perspective

    International Nuclear Information System (INIS)

    Scragg, D.M.

    1991-01-01

    There are very few Energy from Municipal Waste processing plants in the U.K. Those which were built have usually been financed and operated by Local Authorities and are now in excess of 17 years old. The Environmental Protection Act and constraints on Public Sector spending have brought about fundamental changes in the approach taken to developing new schemes of this kind. The Public Sector and the Private Sector must work together. The investment in Mass Burning Incineration Schemes generating energy is high and the pressures to keep the waste disposal costs as low as possible mean that recovery of the investment needs to be spread over many years. For any Scheme to be successful and financially viable requires a long term commitment on the part of those involved. This paper sets out the key role which the Operating Contractor can play in this situation. (author)

  9. Operating Cigeo

    International Nuclear Information System (INIS)

    Launeau, F.

    2016-01-01

    The CIGEO facility dedicated to the geological disposal of high- and intermediate-level radioactive wastes will be composed of 2 parts: an underground facility at a depth of 500 m to dispose of the waste packages in tunnels and a surface facility to take delivery of the wastes and prepare the packages. The underground facility will be built progressively and will cover a surface of 15 km² at the end of Cigeo's operating life. A large part of the surface facility (located a few km away from the waste reception place) will be dedicated to the works carried out deep underground to build the tunnels and will receive drilling cuttings. The article also describes the ramp and carts used to move waste packages underground. Most of the operations will be automated. The definitive closure of the tunnels will be made with swelling clay and concrete plugs. (A.C.)

  10. Operation Poorman

    International Nuclear Information System (INIS)

    Pruvost, N.; Tsitouras, J.

    1981-01-01

    The objectives of Operation Poorman were to design and build a portable seismic system and to set up and use this system in a cold-weather environment. The equipment design uses current technology to achieve a low-power, lightweight system that is configured into three modules. The system was deployed in Alaska during wintertime, and the results provide a basis for specifying a mission-ready seismic verification system

  11. GNF2 Operating Experience

    International Nuclear Information System (INIS)

    Schardt, John

    2007-01-01

    GNF's latest generation fuel product, GNF2, is designed to deliver improved nuclear efficiency, higher bundle and cycle energy capability, and more operational flexibility. But along with high performance, our customers face a growing need for absolute fuel reliability. This is driven by a general sense in the industry that LWR fuel reliability has plateaued. Too many plants are operating with fuel leakers, and the impact on plant operations and operator focus is unacceptable. The industry has responded by implementing an INPO-coordinated program aimed at achieving leaker-free reliability by 2010. One focus area of the program is the relationship between fuel performance (i.e., duty) and reliability. The industry recognizes that the right balance between performance and problem-free fuel reliability is critical. In the development of GNF2, GNF understood the requirement for a balanced solution and utilized a product development and introduction strategy that specifically addressed reliability: evolutionary design features supported by an extensive experience base; thoroughly tested components; and defense-in-depth mitigation of all identified failure mechanisms. The final proof test that the balance has been achieved is the application of the design, initially through lead use assemblies (LUAs), in a variety of plants that reflect the diversity of the BWR fleet. Regular detailed surveillance of these bundles provides the verification that the proper balance between performance and reliability has been achieved. GNF currently has GNF2 lead use assemblies operating in five plants. Included are plants that have implemented extended power up-rates, plants on one- and two-year operating cycles, and plants with and without NobleChem™ and zinc injection. The leading plant has undergone three pool-side inspection outages to date. This paper reviews the actions taken to ensure GNF2's reliability, and the lead use assembly surveillance data accumulated to date to validate

  12. Cross validation in LULOO

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Nørgård, Peter Magnus; Hansen, Lars Kai

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. Linear unlearning of examples has recently been suggested as an approach to approximative cross-validation. Here we briefly review...... the linear unlearning scheme, dubbed LULOO, and we illustrate it on a system identification example. Further, we address the possibility of extracting confidence information (error bars) from the LULOO ensemble....
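    For readers unfamiliar with the cost being avoided, the sketch below contrasts brute-force leave-one-out cross-validation (one retraining per example) with the exact closed-form shortcut available for a linear smoother such as ridge regression, which captures the spirit of avoiding replicated training sessions; the ridge model and synthetic data are assumptions, not the LULOO scheme itself.

```python
# Minimal sketch (ridge regression and synthetic data are assumptions, not the
# LULOO scheme): brute-force leave-one-out CV with one retraining per example,
# versus the exact closed-form shortcut e_i / (1 - H_ii) for a linear smoother.
import numpy as np

def ridge_fit(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def loo_brute_force(X, y, lam):
    errs = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        w = ridge_fit(X[keep], y[keep], lam)     # retrain without example i
        errs.append(y[i] - X[i] @ w)
    return np.array(errs)

def loo_closed_form(X, y, lam):
    # Hat matrix H = X (X^T X + lam I)^{-1} X^T; LOO residual = (y_i - yhat_i) / (1 - H_ii)
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
    return (y - H @ y) / (1.0 - np.diag(H))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
assert np.allclose(loo_brute_force(X, y, 1.0), loo_closed_form(X, y, 1.0))
```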

  13. Transient FDTD simulation validation

    OpenAIRE

    Jauregui Tellería, Ricardo; Riu Costa, Pere Joan; Silva Martínez, Fernando

    2010-01-01

    In computational electromagnetic simulations, most validation methods developed until now are intended for use in the frequency domain. However, frequency-domain EMC analysis is often not sufficient to evaluate the immunity of current communication devices. Based on several studies, in this paper we propose an alternative method for validating transients in the time domain, allowing a rapid and objective quantification of the simulation results.

  14. Validation suite for MCNP

    International Nuclear Information System (INIS)

    Mosteller, Russell D.

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite have been constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, sets of validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments. Consequently, the results from the measurements are reliable and quantifiable, and calculated results can be compared with them in a meaningful way. Currently, validation suites exist for criticality and radiation-shielding applications.

  15. Prototyping Aplikasi E-Health sebagai Bagian Pengenalan Obat-Obatan Dengan Teknologi Cross-Platform

    Directory of Open Access Journals (Sweden)

    Ari Muzakir

    2018-01-01

    Full Text Available Nowadays, people use various ways to learn about the types and benefits of medicines that can address their symptoms. For example, they search for information on the Internet, but they do not know which drug is appropriate to take for their symptoms. This research aimed to build an encyclopedia of E-Health-based drugs. By utilizing such an encyclopedia, it is much easier for most people to understand and recognize various types of medicines, including herbal and chemical medicines. The result of this research is an application prototype of an E-Health-based drug encyclopedia that runs on web and mobile platforms, tested using the black-box method. Furthermore, the result of this research can serve as a basis for developing larger and more functional E-Health drug encyclopedia modules.

  16. Crossing Cross-Platform: Comparing Skills Preferences and Convergence Attitudes in Strategic Communication and News Disciplines

    Science.gov (United States)

    Hubbard, Glenn T.; Kang, Jin-Ae; Crawford, Elizabeth Crisp

    2016-01-01

    National survey of college mass communication students (N = 247) analyzed attitudes on the teaching of print and electronic media skills, using journalism students as comparison group. Previous research had not explored strategic communication student responses to convergence. Found identity variables within public relations (PR) field related to…

  17. Wrox Cross Platform Android and iOS Mobile Development Three-Pack

    CERN Document Server

    McClure, Wallace B; Croft, John J; Dick, Jonathan; Hardy, Chris; Olson, Scott; Hunter, John; Horgen, Ben; Goers, Kenny; Blyth, Rory; Dunn, Craig; Bowling, Martin

    2012-01-01

    A bundle of 3 best-selling and respected mobile development e-books from Wrox form a complete library on the key tools and techniques for developing apps across the hottest platforms including Android and iOS.  This collection includes the full content of these three books, at a special price:Professional Android Programming with Mono for Android and .NET/C#, ISBN: 9781118026434, by Wallace B. McClure, Nathan Blevins, John J. Croft, IV, Jonathan Dick, and Chris HardyProfessional iPhone Programming with MonoTouch and .NET/C#, ISBN: 9780470637821, by Wallace B. McClure, Rory Blyth, Craig Dunn, C

  18. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Kozacik, Stephen [EM Photonics, Inc., Newark, DE (United States)

    2017-05-15

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  19. The Ontology Lookup Service, a lightweight cross-platform tool for controlled vocabulary queries

    Directory of Open Access Journals (Sweden)

    Apweiler Rolf

    2006-02-01

    Full Text Available Abstract Background With the vast amounts of biomedical data being generated by high-throughput analysis methods, controlled vocabularies and ontologies are becoming increasingly important to annotate units of information for ease of search and retrieval. Each scientific community tends to create its own locally available ontology. The interfaces to query these ontologies tend to vary from group to group. We saw the need for a centralized location to perform controlled vocabulary queries that would offer both a lightweight web-accessible user interface as well as a consistent, unified SOAP interface for automated queries. Results The Ontology Lookup Service (OLS) was created to integrate publicly available biomedical ontologies into a single database. All modified ontologies are updated daily. A list of currently loaded ontologies is available online. The database can be queried to obtain information on a single term or to browse a complete ontology using AJAX. Auto-completion provides a user-friendly search mechanism. An AJAX-based ontology viewer is available to browse a complete ontology or subsets of it. A programmatic interface is available to query the webservice using SOAP. The service is described by a WSDL descriptor file available online. A sample Java client to connect to the webservice using SOAP is available for download from SourceForge. All OLS source code is publicly available under the open source Apache Licence. Conclusion The OLS provides a user-friendly single entry point for publicly available ontologies in the Open Biomedical Ontology (OBO) format. It can be accessed interactively or programmatically at http://www.ebi.ac.uk/ontology-lookup/.

  20. Programming HTML5 Applications Building Powerful Cross-Platform Environments in JavaScript

    CERN Document Server

    Kessin, Zachary

    2011-01-01

    HTML5 is not just a replacement for plugins. It also makes the Web a first-class development environment by giving JavaScript programmers a solid foundation for building industrial-strength applications. This practical guide takes you beyond simple site creation and shows you how to build self-contained HTML5 applications that can run on mobile devices and compete with desktop apps. You'll learn powerful JavaScript tools for exploiting HTML5 elements, and discover new methods for working with data, such as offline storage and multithreaded processing. Complete with code samples, this book is

  1. Mastering CMake a cross-platform build system : version 3.1

    CERN Document Server

    Martin, Ken

    2015-01-01

    CMake is an open-source build tool enabling collaboration among software developers working on distinct platforms by using a common build specification to drive their native build tools. Mastering CMake explains how to use the CMake suite of tools, including CTest and CPack, to develop, build, test, and package software for distribution. It covers use of the command-line and GUI tools on Linux (UNIX), Microsoft Windows, and Mac OS X. This book also contains a guide for converting projects to CMake and writing CMake code to specify build rules to compile sources, create static and shared libraries, link executables, run custom commands, run tests, and install artifacts. It also includes a copy of key portions of the official reference documentation.

  2. Open, Cross Platform Chemistry Application Unifying Structure Manipulation, External Tools, Databases and Visualization

    Science.gov (United States)

    2014-05-30

    dataType ="xsd:double" dictRef="cml:molwt" units...34units:g ">30.0690 </scalar > </property > <property dictRef="cml:monoisotopicwt" title="Monoisotopic weight" > <scalar dataType ="xsd:double" dictRef...34cml:monoisotopicwt" units ="units:g">30.0469502 </scalar > </property > <property dictRef="cml:mp" title="Melting point"> <scalar dataType

  3. Developing a Cross-Platform Web Application for Online EFL Vocabulary Learning Courses

    Science.gov (United States)

    Enokida, Kazumichi; Sakaue, Tatsuya; Morita, Mitsuhiro; Kida, Shusaku; Ohnishi, Akio

    2017-01-01

    In this paper, the development of a web application for self-access English vocabulary courses at a national university in Japan will be reported upon. Whilst the basic concepts are inherited from an old Flash-based online vocabulary learning system that had been long used at the university, the new HTML5-based app comes with several new features…

  4. Sparse canonical methods for biological data integration: application to a cross-platform study

    Directory of Open Access Journals (Sweden)

    Robert-Granié Christèle

    2009-01-01

    Full Text Available Abstract Background In the context of systems biology, few sparse approaches have been proposed so far to integrate several data sets. It is however an important and fundamental issue that will be widely encountered in post genomic studies, when simultaneously analyzing transcriptomics, proteomics and metabolomics data using different platforms, so as to understand the mutual interactions between the different data sets. In this high dimensional setting, variable selection is crucial to give interpretable results. We focus on a sparse Partial Least Squares approach (sPLS) to handle two-block data sets, where the relationship between the two types of variables is known to be symmetric. Sparse PLS has been developed either for a regression or a canonical correlation framework and includes a built-in procedure to select variables while integrating data. To illustrate the canonical mode approach, we analyzed the NCI60 data sets, where two different platforms (cDNA and Affymetrix chips) were used to study the transcriptome of sixty cancer cell lines. Results We compare the results obtained with two other sparse or related canonical correlation approaches: CCA with Elastic Net penalization (CCA-EN) and Co-Inertia Analysis (CIA). The latter does not include a built-in procedure for variable selection and requires a two-step analysis. We stress the lack of statistical criteria to evaluate canonical correlation methods, which makes biological interpretation absolutely necessary to compare the different gene selections. We also propose comprehensive graphical representations of both samples and variables to facilitate the interpretation of the results. Conclusion sPLS and CCA-EN selected highly relevant genes and complementary findings from the two data sets, which enabled a detailed understanding of the molecular characteristics of several groups of cell lines. These two approaches were found to bring similar results, although they highlighted the same phenomena with a different priority. They outperformed CIA, which tended to select redundant information.
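    As a minimal sketch of the core mechanism (an assumption-level re-implementation, not the authors' sPLS code or its tuning), the snippet below computes a first sparse PLS component by alternating soft-thresholded updates on M = X^T Y, so that variables are selected in both blocks while the covariance between blocks is maximized; the relative thresholding rule and the synthetic two-block data are illustrative choices.

```python
# Minimal sketch (an assumption-level re-implementation, not the authors' code):
# first sparse PLS component via soft-thresholded alternating updates on
# M = X^T Y; the relative thresholding rule and synthetic data are illustrative.
import numpy as np

def soft_threshold(z, lam):
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def sparse_pls_component(X, Y, lam_x=0.5, lam_y=0.5, n_iter=100, tol=1e-8):
    """Sparse loading vectors (a, b); lam_* is a fraction of the largest loading."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    M = X.T @ Y
    b = np.linalg.svd(M, full_matrices=False)[2][0]   # init from dominant singular vector
    for _ in range(n_iter):
        z = M @ b
        a = soft_threshold(z, lam_x * np.abs(z).max())
        a /= np.linalg.norm(a)
        z = M.T @ a
        b_new = soft_threshold(z, lam_y * np.abs(z).max())
        b_new /= np.linalg.norm(b_new)
        if np.linalg.norm(b_new - b) < tol:
            b = b_new
            break
        b = b_new
    z = M @ b                                          # refresh a for the final b
    a = soft_threshold(z, lam_x * np.abs(z).max())
    a /= np.linalg.norm(a)
    return a, b

rng = np.random.default_rng(1)
latent = rng.normal(size=(40, 1))                                   # shared signal
X = latent @ rng.normal(size=(1, 20)) + 0.5 * rng.normal(size=(40, 20))
Y = latent @ rng.normal(size=(1, 15)) + 0.5 * rng.normal(size=(40, 15))
a, b = sparse_pls_component(X, Y)
print("selected X variables:", np.flatnonzero(a))
print("selected Y variables:", np.flatnonzero(b))
```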

  5. PhoneGap and AngularJS for cross-platform development

    CERN Document Server

    Liang, Yuxian Eugene

    2014-01-01

    This book is intended for people who are not familiar with AngularJS and who want to take their PhoneGap development skills further by developing apps using different JavaScript libraries. People with some knowledge of PhoneGap, HTML, CSS, and JavaScript will find this book immediately useful.

  6. Considerations for Achieving Cross-Platform Point Cloud Data Fusion across Different Dryland Ecosystem Structural States.

    Science.gov (United States)

    Swetnam, Tyson L; Gillan, Jeffrey K; Sankey, Temuulen T; McClaran, Mitchel P; Nichols, Mary H; Heilman, Philip; McVay, Jason

    2017-01-01

    Remotely sensing recent growth, herbivory, or disturbance of herbaceous and woody vegetation in dryland ecosystems requires high spatial resolution and multi-temporal depth. Three dimensional (3D) remote sensing technologies like lidar, and techniques like structure from motion (SfM) photogrammetry, each have strengths and weaknesses at detecting vegetation volume and extent, given the instrument's ground sample distance and ease of acquisition. Yet, a combination of platforms and techniques might provide solutions that overcome the weakness of a single platform. To explore the potential for combining platforms, we compared detection bias amongst two 3D remote sensing techniques (lidar and SfM) using three different platforms [ground-based, small unmanned aerial systems (sUAS), and manned aircraft]. We found aerial lidar to be more accurate for characterizing the bare earth (ground) in dense herbaceous vegetation than either terrestrial lidar or aerial SfM photogrammetry. Conversely, the manned aerial lidar did not detect grass and fine woody vegetation while the terrestrial lidar and high resolution near-distance (ground and sUAS) SfM photogrammetry detected these and were accurate. UAS SfM photogrammetry at lower spatial resolution under-estimated maximum heights in grass and shrubs. UAS and handheld SfM photogrammetry in near-distance high resolution collections had similar accuracy to terrestrial lidar for vegetation, but difficulty at measuring bare earth elevation beneath dense herbaceous cover. Combining point cloud data and derivatives (i.e., meshes and rasters) from two or more platforms allowed for more accurate measurement of herbaceous and woody vegetation (height and canopy cover) than any single technique alone. Availability and costs of manned aircraft lidar collection preclude high frequency repeatability but this is less limiting for terrestrial lidar, sUAS and handheld SfM. The post-processing of SfM photogrammetry data became the limiting factor at larger spatial scale and temporal repetition. Despite the utility of sUAS and handheld SfM for monitoring vegetation phenology and structure, their spatial extents are small relative to manned aircraft.
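    A minimal sketch of the kind of fusion described, with purely hypothetical synthetic inputs: a bare-earth terrain model (DTM) rasterized from aerial-lidar ground returns, a surface model (DSM) rasterized from near-distance SfM points, and a canopy height model taken as their difference on a shared grid.

```python
# Minimal sketch (hypothetical, already co-registered inputs): fuse a lidar
# bare-earth DTM with an SfM surface DSM and derive a canopy height model
# (CHM = DSM - DTM) on a common grid.
import numpy as np

def rasterize(points, origin, shape, cell, reducer):
    """Grid (x, y, z) points into cells, keeping one value per cell via reducer."""
    ij = np.floor((points[:, :2] - origin) / cell).astype(int)
    grid = np.full(shape, np.nan)
    for (i, j), z in zip(ij, points[:, 2]):
        if 0 <= i < shape[0] and 0 <= j < shape[1]:
            grid[i, j] = z if np.isnan(grid[i, j]) else reducer(grid[i, j], z)
    return grid

rng = np.random.default_rng(3)
# hypothetical point clouds (x, y, z in metres) on the same datum
lidar_ground = np.c_[rng.uniform(0, 50, (4000, 2)), rng.normal(0.0, 0.05, 4000)]
sfm_surface = np.c_[rng.uniform(0, 50, (8000, 2)), rng.uniform(0.0, 1.5, 8000)]

origin, shape, cell = np.zeros(2), (50, 50), 1.0
dtm = rasterize(lidar_ground, origin, shape, cell, min)   # bare earth from aerial lidar
dsm = rasterize(sfm_surface, origin, shape, cell, max)    # vegetation surface from SfM
chm = np.clip(dsm - dtm, 0.0, None)                       # canopy height model
print("mean canopy height (m):", np.nanmean(chm))
```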

  7. Cross-Platform Learning: On the Nature of Children's Learning from Multiple Media Platforms

    Science.gov (United States)

    Fisch, Shalom M.

    2013-01-01

    It is increasingly common for an educational media project to span several media platforms (e.g., TV, Web, hands-on materials), assuming that the benefits of learning from multiple media extend beyond those gained from one medium alone. Yet research typically has investigated learning from a single medium in isolation. This paper reviews several…

  8. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Junho, Choi

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray-tracing algorithm, statistical analysis, tests on real-time system operation, and other technical evaluation processes...

  9. Validation of the Automation Attitude Questionnaire for Airline Pilots ...

    African Journals Online (AJOL)

    AAQ), which assesses airline pilots' perceptions about operating advanced commercial aircraft. A total of 262 airline pilots from a large South African carrier participated in the validation of the instrument. A five-factor measurement model was ...

  10. Validation Aspects of Water Treatment Systems for Pharmaceutical ...

    African Journals Online (AJOL)

    The goal of conducting validation is to demonstrate that a process, when operated within established limits, produces a product of consistent and specified quality with a high degree of assurance. Validation of water treatment systems is necessary to obtain water with all desired quality attributes. This also provides a ...

  11. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and In-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of a PWR, BWR, CANDU and VVER reactors. It also provides an overview of the ex-vessel corium retention (core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories (as done for the phenomena). An experiment synopsis is provided for each test. Along with a test description

  12. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation
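
    As an illustration only (the report defines five specific metrics and a decision tree that are not reproduced here), one simple acceptance measure of the kind described is the fraction of stochastic realizations whose misfit against newly collected validation data stays below a preset threshold; all values below are hypothetical.

      import numpy as np

      def acceptable_fraction(realizations, validation_obs, rmse_threshold):
          # realizations: (n_realizations x n_points) simulated values at the validation locations
          # validation_obs: (n_points,) new field observations; rmse_threshold: assumed acceptance limit
          rmse = np.sqrt(np.mean((realizations - validation_obs) ** 2, axis=1))
          return float(np.mean(rmse <= rmse_threshold)), rmse

      # hypothetical example: 200 realizations evaluated at 15 validation wells
      rng = np.random.default_rng(2)
      observed = rng.normal(50.0, 2.0, 15)
      simulated = observed + rng.normal(0.0, 1.0, (200, 15))
      frac, _ = acceptable_fraction(simulated, observed, rmse_threshold=1.5)
      print(f"{frac:.0%} of realizations conform with the validation data")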

  13. Validation of Serious Games

    Directory of Open Access Journals (Sweden)

    Katinka van der Kooij

    2015-09-01

    Full Text Available The application of games for behavioral change has seen a surge in popularity but evidence on the efficacy of these games is contradictory. Anecdotal findings seem to confirm their motivational value whereas most quantitative findings from randomized controlled trials (RCT are negative or difficult to interpret. One cause for the contradictory evidence could be that the standard RCT validation methods are not sensitive to serious games’ effects. To be able to adapt validation methods to the properties of serious games we need a framework that can connect properties of serious game design to the factors that influence the quality of quantitative research outcomes. The Persuasive Game Design model [1] is particularly suitable for this aim as it encompasses the full circle from game design to behavioral change effects on the user. We therefore use this model to connect game design features, such as the gamification method and the intended transfer effect, to factors that determine the conclusion validity of an RCT. In this paper we will apply this model to develop guidelines for setting up validation methods for serious games. This way, we offer game designers and researchers handles on how to develop tailor-made validation methods.

  14. Checklists for external validity

    DEFF Research Database (Denmark)

    Dyrvig, Anne-Kirstine; Kidholm, Kristian; Gerke, Oke

    2014-01-01

    to an implementation setting. In this paper, currently available checklists on external validity are identified, assessed and used as a basis for proposing a new improved instrument. METHOD: A systematic literature review was carried out in Pubmed, Embase and Cinahl on English-language papers without time restrictions. The retrieved checklist items were assessed for (i) the methodology used in primary literature, justifying inclusion of each item; and (ii) the number of times each item appeared in checklists. RESULTS: Fifteen papers were identified, presenting a total of 21 checklists for external validity, yielding a total of 38 checklist items. Empirical support was considered the most valid methodology for item inclusion. Assessment of methodological justification showed that none of the items were supported empirically. Other kinds of literature justified the inclusion of 22 of the items, and 17 items were included...

  15. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  16. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k-eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed
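
    A schematic of the kind of correlation described above, written out numerically (this reflects common criticality-safety practice in general, not the WSRC procedure itself): calculated k-eff values for a set of benchmark experiments are compared with the experimental values to estimate a bias and its spread. The benchmark numbers, the two-sigma allowance and the administrative margin are illustrative assumptions.

      import numpy as np

      # hypothetical benchmarks: calculated k-eff for experiments that were critical (k-eff = 1)
      k_calc = np.array([0.9985, 1.0012, 0.9978, 0.9991, 1.0005, 0.9969, 0.9994, 1.0001])
      k_exp  = np.ones_like(k_calc)

      bias  = float(np.mean(k_calc - k_exp))          # negative bias: the method under-predicts k-eff
      sigma = float(np.std(k_calc - k_exp, ddof=1))   # spread of the bias over the validation set

      # illustrative one-sided allowance: non-positive bias, twice its spread, plus an assumed margin
      admin_margin = 0.05
      usl = 1.0 + min(bias, 0.0) - 2.0 * sigma - admin_margin
      print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, illustrative upper subcritical limit = {usl:.4f}")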

  17. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    In addition, spectacular events often involve a combination of component failure and human error. The Rasmussen Report and the German Risk Assessment Study in particular show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. In contrast to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  18. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    Full Text Available In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  19. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  20. Your Lung Operation: After Your Operation

    Medline Plus


  1. A Validity Review of the Color Company Competition at the United States Naval Academy

    National Research Council Canada - National Science Library

    Dryden, Derek S

    2006-01-01

    .... Using data obtained through the Institutional Research Department, the Physical Education Department, as well as the Activities and Operations Offices, this study examines the validity of the current...

  2. The dialogic validation

    DEFF Research Database (Denmark)

    Musaeus, Peter

    2005-01-01

    This paper is inspired by dialogism and the title is a paraphrase on Bakhtin's (1981) "The Dialogic Imagination". The paper investigates how dialogism can inform the process of validating inquiry-based qualitative research. The paper stems from a case study on the role of recognition...

  3. A valid licence

    NARCIS (Netherlands)

    Spoolder, H.A.M.; Ingenbleek, P.T.M.

    2010-01-01

    A valid licence Tuesday, April 20, 2010 Dr Hans Spoolder and Dr Paul Ingenbleek, of Wageningen University and Research Centres, share their thoughts on improving farm animal welfare in Europe At the presentation of the European Strategy 2020 on 3rd March, President Barroso emphasised the need for

  4. The Chimera of Validity

    Science.gov (United States)

    Baker, Eva L.

    2013-01-01

    Background/Context: Education policy over the past 40 years has focused on the importance of accountability in school improvement. Although much of the scholarly discourse around testing and assessment is technical and statistical, understanding of validity by a non-specialist audience is essential as long as test results drive our educational…

  5. Validating year 2000 compliance

    NARCIS (Netherlands)

    A. van Deursen (Arie); P. Klint (Paul); M.P.A. Sellink

    1997-01-01

    Validating year 2000 compliance involves the assessment of the correctness and quality of a year 2000 conversion. This entails inspecting both the quality of the conversion process followed, and of the result obtained, i.e., the converted system. This document provides an

  6. Validation and test report

    DEFF Research Database (Denmark)

    Pedersen, Jens Meldgaard; Andersen, T. Bull

    2012-01-01

    . As a consequence of extensive movement artefacts seen during dynamic contractions, the following validation and test report consists of a report that investigates the physiological responses to a static contraction in a standing and a supine position. Eight subjects performed static contractions of the ankle...

  7. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  8. Validity and Fairness

    Science.gov (United States)

    Kane, Michael

    2010-01-01

    This paper presents the author's critique on Xiaoming Xi's article, "How do we go about investigating test fairness?," which lays out a broad framework for studying fairness as comparable validity across groups within the population of interest. Xi proposes to develop a fairness argument that would identify and evaluate potential fairness-based…

  9. DTU PMU Laboratory Development - Testing and Validation

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into the IEEE... standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested in the authors' previous efforts, where the response can be expected to follow known patterns and provide confirmation about the test system to confirm the design and settings... In a nutshell, having 2 PMUs that observe the same signals provides validation of the operation and flags questionable results with more certainty. Moreover, the performance and accuracy of the DTU-PMU is tested, acquiring good and precise results when compared with a commercial phasor measurement device, PMU-1.

  10. WRAP TRUPACT loading systems operational test report

    International Nuclear Information System (INIS)

    DOSRAMOS, E.V.

    1999-01-01

    This Operational Test Report documents the operational testing of the TRUPACT process equipment HNF-3918, Revision 0, TRUPACT Operational Test Procedure. The test accomplished the following: Procedure validation; Facility equipment interface; Facility personnel support; and Subcontractor personnel support interface. Field changes are documented as test exceptions with resolutions. All resolutions are completed or a formal method is identified to track the resolution through to completion

  11. Operational Law Handbook,2007

    National Research Council Canada - National Science Library

    2007-01-01

    ... & SOFAs, legal assistance, combating terrorism, domestic operations, noncombatant evacuation operations, special operations, civil affairs, air, sea, and space law, detainee operations, reserve...

  12. Neutron flux control systems validation

    International Nuclear Information System (INIS)

    Hascik, R.

    2003-01-01

    In nuclear installations, the main requirement is to ensure adequate nuclear safety in all operating conditions. From the nuclear safety point of view, commissioning and start-up after reactor refuelling are appropriate periods for safety system verification. In this paper, the methodology, performance and results of the validation of neutron flux measurement systems are presented. Standard neutron flux measuring chains incorporated into the reactor protection and control system are used. A standard neutron flux measuring chain consists of a detector, a preamplifier, wiring to the data acquisition unit, the data acquisition unit itself, wiring to the control room display, and the display in the control room. During a reactor outage, only the data acquisition unit and the wiring and displays in the reactor control room are verified. It is impossible to verify the detector, preamplifier and wiring to the data acquisition unit during reactor refuelling because of the low power level. Correct adjustment and functionality of these chains is confirmed by start-up rate (SUR) measurement during start-up tests after refuelling of the reactors. This measurement has a direct impact on nuclear safety and increases the operational nuclear safety level. A brief description of each measuring system is given. Results are illustrated with measurements performed at the Bohunice NPP during reactor start-up tests. The main failures and their elimination are described (Authors)
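
    For reference (standard point-reactor kinetics, not a result specific to this paper), the start-up rate follows directly from the stable reactor period T: if the neutron flux grows as n(t) = n_0 e^{t/T}, the number of decades gained per minute is

      \mathrm{SUR} = \frac{60}{T \ln 10} \approx \frac{26.06}{T} \quad \text{(decades per minute, with } T \text{ in seconds),}

    so comparing the SUR inferred from each flux-measuring chain with the expected value during start-up tests provides the kind of consistency check described above.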

  13. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
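
    One widely used code-verification check of the kind surveyed in this literature (a general technique, not an example taken from the paper) is estimating the observed order of accuracy from solutions on three systematically refined grids via Richardson extrapolation; the grid values below are made up.

      import math

      def observed_order(f_coarse, f_medium, f_fine, r):
          # observed order of accuracy from three grids with a constant refinement ratio r
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

      def richardson_extrapolate(f_medium, f_fine, r, p):
          # estimate of the grid-converged value using the two finest grids and observed order p
          return f_fine + (f_fine - f_medium) / (r**p - 1.0)

      # hypothetical grid-convergence data for some integrated quantity (e.g., a drag coefficient)
      f3, f2, f1 = 0.9760, 0.9687, 0.9668   # coarse, medium, fine
      r = 2.0
      p = observed_order(f3, f2, f1, r)
      print(f"observed order ~ {p:.2f}; extrapolated value ~ {richardson_extrapolate(f2, f1, r, p):.4f}")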

  14. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
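
    A sketch of the validation workflow the authors describe, using scikit-learn rather than their code: an L1-penalized (LASSO-type) logistic model for a binary complication endpoint, assessed with an outer cross-validation loop wrapped around the penalty selection (double cross-validation) and with a permutation test on the resulting score. The data, penalty grid and fold counts are hypothetical.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import (GridSearchCV, StratifiedKFold,
                                           cross_val_score, permutation_test_score)

      rng = np.random.default_rng(3)
      X = rng.standard_normal((150, 20))                                     # dose-volume / clinical predictors
      y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 1, 150) > 0).astype(int)  # binary complication outcome

      lasso_logit = LogisticRegression(penalty="l1", solver="liblinear", max_iter=1000)
      inner = GridSearchCV(lasso_logit, {"C": [0.01, 0.1, 1.0, 10.0]},
                           scoring="roc_auc", cv=StratifiedKFold(5))

      # outer loop of the double cross-validation: performance of the whole model-building recipe
      outer_auc = cross_val_score(inner, X, y, scoring="roc_auc", cv=StratifiedKFold(5))
      print(f"cross-validated AUC: {outer_auc.mean():.3f} +/- {outer_auc.std():.3f}")

      # permutation test: is the observed score better than chance?
      score, perm_scores, p_value = permutation_test_score(
          inner, X, y, scoring="roc_auc", cv=StratifiedKFold(5),
          n_permutations=100, random_state=0)
      print(f"permutation p-value: {p_value:.3f}")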

  15. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. An integrated approach for signal validation in nuclear power plants

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Kerlin, T.W.; Gloeckler, O.; Frei, Z.; Qualls, L.; Morgenstern, V.

    1987-08-01

    A signal validation system, based on several parallel signal processing modules, is being developed at the University of Tennessee. The major modules perform (1) general consistency checking (GCC) of a set of redundant measurements, (2) multivariate data-driven modeling of dynamic signal components for maloperation detection, (3) process empirical modeling for prediction and redundancy generation, (4) jump, pulse, noise detection, and (5) an expert system for qualitative signal validation. A central database stores information related to sensors, diagnostics rules, past system performance, subsystem models, etc. We are primarily concerned with signal validation during steady-state operation and slow degradations. In general, the different modules will perform signal validation during all operating conditions. The techniques have been successfully tested using PWR steam generator simulation, and efforts are currently underway in applying the techniques to Millstone-III operational data. These methods could be implemented in advanced reactors, including advanced liquid metal reactors
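
    A toy version of the general consistency checking (GCC) idea for a set of redundant measurements (purely illustrative, not one of the modules described above): each channel is compared with the median of the set, deviant channels are flagged, and a validated estimate is formed from the channels that agree. The threshold and readings are assumed values.

      import numpy as np

      def consistency_check(readings, threshold):
          # readings: simultaneous values from redundant sensors; threshold: allowed deviation from the group
          readings = np.asarray(readings, dtype=float)
          reference = np.median(readings)                    # robust group estimate
          consistent = np.abs(readings - reference) <= threshold
          validated = readings[consistent].mean() if consistent.any() else float("nan")
          return validated, consistent

      # hypothetical redundant steam generator level channels; the third channel is drifting
      value, ok = consistency_check([51.2, 51.0, 54.9, 51.1], threshold=1.0)
      print("validated value:", value, "| channel status:", ok)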

  17. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  18. User Validation of VIIRS Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Don Hillger

    2015-12-01

    Full Text Available Visible/Infrared Imaging Radiometer Suite (VIIRS) Imagery from the Suomi National Polar-orbiting Partnership (S-NPP) satellite is the finest spatial resolution (375 m) multi-spectral imagery of any operational meteorological satellite to date. The Imagery environmental data record (EDR) has been designated as a Key Performance Parameter (KPP) for VIIRS, meaning that its performance is vital to the success of a series of Joint Polar Satellite System (JPSS) satellites that will carry this instrument. Because VIIRS covers the high-latitude and Polar Regions especially well via overlapping swaths from adjacent orbits, the Alaska theatre in particular benefits from VIIRS more than lower-latitude regions. While there are no requirements that specifically address the quality of the EDR Imagery aside from the VIIRS SDR performance requirements, the value of VIIRS Imagery to operational users is an important consideration in the Cal/Val process. As such, engaging a wide diversity of users constitutes a vital part of the Imagery validation strategy. The best possible image quality is of utmost importance. This paper summarizes the Imagery Cal/Val Team’s quality assessment in this context. Since users are a vital component to the validation of VIIRS Imagery, specific examples of VIIRS imagery applied to operational needs are presented as an integral part of the post-checkout Imagery validation.

  19. Flight code validation simulator

    Science.gov (United States)

    Sims, Brent A.

    1996-05-01

    An End-To-End Simulation capability for software development and validation of missile flight software on the actual embedded computer has been developed utilizing a 486 PC, i860 DSP coprocessor, embedded flight computer and custom dual port memory interface hardware. This system allows real-time interrupt driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six degree of freedom real-time dynamic simulation, accurate real-time discrete sensor data and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January of 1995 at the White Sands Missile Range on a two stage attitude controlled sounding rocket.

  20. CIPS Validation Data Plan

    Energy Technology Data Exchange (ETDEWEB)

    Nam Dinh

    2012-03-01

    This report documents the analysis, findings and recommendations resulting from the task 'CIPS Validation Data Plan (VDP)', formulated as a POR4 activity in the CASL VUQ Focus Area (FA), to develop a Validation Data Plan (VDP) for the Crud-Induced Power Shift (CIPS) challenge problem and provide guidance for the CIPS VDP implementation. The main reason and motivation for this task to be carried out at this time in the VUQ FA is to bring together (i) knowledge of the modern view of and capability in VUQ, (ii) knowledge of the physical processes that govern CIPS, and (iii) knowledge of the codes, models, and data available, used, potentially accessible, and/or being developed in CASL for CIPS prediction, to devise a practical VDP that effectively supports CASL's mission in CIPS applications.

  1. CIPS Validation Data Plan

    International Nuclear Information System (INIS)

    Dinh, Nam

    2012-01-01

    This report documents the analysis, findings and recommendations resulting from the task 'CIPS Validation Data Plan (VDP)', formulated as a POR4 activity in the CASL VUQ Focus Area (FA), to develop a Validation Data Plan (VDP) for the Crud-Induced Power Shift (CIPS) challenge problem and provide guidance for the CIPS VDP implementation. The main reason and motivation for this task to be carried out at this time in the VUQ FA is to bring together (i) knowledge of the modern view of and capability in VUQ, (ii) knowledge of the physical processes that govern CIPS, and (iii) knowledge of the codes, models, and data available, used, potentially accessible, and/or being developed in CASL for CIPS prediction, to devise a practical VDP that effectively supports CASL's mission in CIPS applications.

  2. Validating MEDIQUAL Constructs

    Science.gov (United States)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs across different media users in help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, which are based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, there are relatively high correlations between measures of the same construct using different methods and low correlations between measures of constructs that are expected to differ; and 2) the five MEDIQUAL constructs have a statistically significant effect on media users' satisfaction in help desk service, as shown by regression analysis.

  3. CRED Optical Validation Data in the Auau Channel, Hawaii, 2007, to Support Benthic Habitat Mapping

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Optical validation data were collected using a RCV-150 remotely operated vehicle (ROV) operated by the Hawaii Undersea Research Laboratory (HURL). Data were...

  4. DDML Schema Validation

    Science.gov (United States)

    2016-02-08

    XML schema govern DDML instance documents. For information about XML, refer to RCC 125-15, XML Style Guide. Figure 4 provides an XML snippet of a...we have documented three main types of information.  User Stories: A user story describes a specific requirement of the schema in the terms of a...instance document is a schema-valid XML file that completely describes the information in the test case in a manner that satisfies the user story

  5. The ALICE Software Release Validation cluster

    International Nuclear Information System (INIS)

    Berzano, D; Krzewicki, M

    2015-01-01

    One of the most important steps of software lifecycle is Quality Assurance: this process comprehends both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service: in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how the Release Validation Cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, permits to boot any snapshot of the operating system in time: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future. (paper)

  6. Your Lung Operation: After Your Operation

    Medline Plus


  7. Your Lung Operation: After Your Operation

    Medline Plus


  8. Your Lung Operation: After Your Operation

    Medline Plus


  9. Your Lung Operation: After Your Operation

    Medline Plus


  10. Your Lung Operation: After Your Operation

    Medline Plus


  11. Operator theory, operator algebras and applications

    CERN Document Server

    Lebre, Amarino; Samko, Stefan; Spitkovsky, Ilya

    2014-01-01

    This book consists of research papers that cover the scientific areas of the International Workshop on Operator Theory, Operator Algebras and Applications, held in Lisbon in September 2012. The volume particularly focuses on (i) operator theory and harmonic analysis (singular integral operators with shifts; pseudodifferential operators, factorization of almost periodic matrix functions; inequalities; Cauchy type integrals; maximal and singular operators on generalized Orlicz-Morrey spaces; the Riesz potential operator; modification of Hadamard fractional integro-differentiation), (ii) operator algebras (invertibility in groupoid C*-algebras; inner endomorphisms of some semi group, crossed products; C*-algebras generated by mappings which have finite orbits; Folner sequences in operator algebras; arithmetic aspect of C*_r SL(2); C*-algebras of singular integral operators; algebras of operator sequences) and (iii) mathematical physics (operator approach to diffraction from polygonal-conical screens; Poisson geo...

  12. Emergency operation procedure navigation to avoid commission errors

    International Nuclear Information System (INIS)

    Gofuku, Akio; Ito, Koji

    2004-01-01

    New types of operation control systems equipped with a large screen and CRT-based operation panels have been installed in newly constructed nuclear power plants. Operators can share important information about plant conditions via the large screen. The operation control system can track the operations performed by operators through the computers connected to the operation panels. The software switches placed on the CRT-based operation panels have the drawback that operators may mistakenly manipulate a software switch that is irrelevant to their current operation. This study develops an operation procedure navigation technique to avoid this kind of commission error. The system sits between the CRT-based operation panels and the plant control systems and checks whether an operation performed by the operators follows the operation procedure given in the operation manuals. When the operation is the right one, it is executed as if the operation command had been transmitted directly to the control systems. If the operation does not follow the operation procedure, the system warns the operators of the commission error. This paper describes the operation navigation technique, the format of the base operation model, and a prototype operation navigation system for a three-loop pressurized water reactor plant. The validity of the prototype system is demonstrated by operation procedure navigation for a steam generator tube rupture accident. (author)
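
    A highly simplified sketch of the navigation idea (not the authors' prototype): the system tracks the expected next step of the emergency operating procedure and either passes an operator action through to the control system or raises a commission-error warning. The procedure steps and switch names are invented for illustration.

      class ProcedureNavigator:
          # checks operator actions against the expected sequence of an operating procedure
          def __init__(self, steps):
              self.steps = steps            # ordered list of expected switch operations
              self.index = 0                # current position within the procedure

          def operate(self, switch):
              expected = self.steps[self.index] if self.index < len(self.steps) else None
              if switch == expected:
                  self.index += 1
                  return f"EXECUTE: {switch} (step {self.index} of {len(self.steps)})"
              return f"WARNING: '{switch}' does not follow the procedure; expected '{expected}'"

      # hypothetical fragment of a steam generator tube rupture procedure
      navigator = ProcedureNavigator(["trip_reactor", "isolate_faulted_SG", "start_aux_feedwater"])
      print(navigator.operate("trip_reactor"))         # correct step: passed through to the control system
      print(navigator.operate("start_aux_feedwater"))  # out of order: commission-error warning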

  13. Categorizing operational radioactive wastes

    International Nuclear Information System (INIS)

    2007-04-01

    The primary objective of this publication is to improve communications among waste management professionals and Member States relative to the properties and status of radioactive waste. This is accomplished by providing a standardized approach to operational waste categorization using accepted industry practices and experience. It is a secondary objective to draw a distinction between operational waste categorization and waste disposal classification. The approach set forth herein is applicable to waste generation by mature (major, advanced) nuclear programmes, small-to-medium sized nuclear programmes, and programmes with waste from other nuclear applications. It can be used for planning, developing or revising categorization methodologies. For existing categorization programmes, the approach set forth in this publication may be used as a validation and evaluation tool for assessing communication effectiveness among affected organizations or nations. This publication is intended for use by waste management professionals responsible for creating, implementing or communicating effective categorization, processing and disposal strategies. For the users of this publication, it is important to remember that waste categorization is a communication tool. As such, the operational waste categories are not suitable for regulatory purposes nor for use in health and safety evaluations. Following Section 1 (Introduction) Section 2 of this publication defines categorization and its relationship to existing waste classification and management standards, regulations and practices. It also describes the benefits of a comprehensive categorization programme and fundamental record considerations. Section 3 provides an overview of the categorization process, including primary categories and sub-categories. Sections 4 and 5 outline the specific methodology for categorizing unconditioned and conditioned wastes. Finally, Section 6 provides a brief summary of critical considerations that

  14. Content validity and its estimation

    Directory of Open Access Journals (Sweden)

    Yaghmale F

    2003-04-01

    Full Text Available Background: Measuring the content validity of instruments is important. This type of validity can help to ensure construct validity and give confidence to readers and researchers about instruments. Content validity refers to the degree to which the instrument covers the content that it is supposed to measure. For content validity two judgments are necessary: the measurable extent of each item for defining the traits, and the set of items that represents all aspects of the traits. Purpose: To develop a content-valid scale for assessing experience with computer usage. Methods: First, a review of 2 volumes of the International Journal of Nursing Studies was conducted; only 1 article out of 13 that documented content validity did so by a 4-point content validity index (CVI) and the judgment of 3 experts. Then a scale with 38 items was developed. The experts were asked to rate each item for relevance, clarity, simplicity and ambiguity on the four-point scale. The Content Validity Index (CVI) for each item was determined. Result: Of 38 items, those with a CVI over 0.75 remained and the rest were discarded, resulting in a 25-item scale. Conclusion: Although documenting the content validity of an instrument may seem expensive in terms of time and human resources, its importance warrants greater attention when a valid assessment instrument is to be developed. Keywords: Content Validity, Measuring Content Validity
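
    The item-level computation described above takes only a few lines; the ratings matrix below is invented, while the decision rule (count ratings of 3 or 4 on the 4-point scale as endorsement and keep items whose CVI exceeds 0.75) follows the abstract.

      import numpy as np

      def content_validity_index(ratings):
          # ratings: (n_experts x n_items) integer scores in 1..4; CVI = share of experts rating 3 or 4
          return (np.asarray(ratings) >= 3).mean(axis=0)

      # hypothetical panel: 5 experts rating 6 candidate items
      ratings = np.array([[4, 3, 2, 4, 1, 3],
                          [4, 4, 2, 3, 2, 4],
                          [3, 4, 1, 4, 2, 3],
                          [4, 3, 2, 4, 1, 4],
                          [3, 4, 3, 4, 2, 3]])
      cvi = content_validity_index(ratings)
      kept = np.where(cvi > 0.75)[0]
      print("item CVIs:", cvi, "| items retained:", kept)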

  15. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator`s information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  16. Validering av Evolution 220

    OpenAIRE

    Krakeli, Tor-Arne

    2013-01-01

    - A new spectrophotometer (Evolution 220, Thermo Scientific) has been purchased for BioLab Nofima. In that connection, a validation has been carried out involving calibration standards from the manufacturer and a test of normal distribution (t-test) on two methods (total phosphorus, tryptophan). This validation found the Evolution 220 to be an acceptable alternative to the spectrophotometer already in use (Helios Beta). Owing to some instrument limitations, the relevant an...

  17. Operating systems for experimental physics

    International Nuclear Information System (INIS)

    Davies, H.E.

    1976-01-01

    Modern high energy physics experiments are very dependent on the use of computers and present a fairly well defined list of technical demands on them. It is therefore possible to look at the construction of a computer operating system and to see how the design choices should be made in order to make the systems as useful as possible to physics experiments or, more practically, to look at existing operating systems to see which can most easily be used to do the jobs of rapid data acquisition and checking. In these notes, operating systems are looked at from the point of view of the informed user. Emphasis is placed on systems which are intended for single processor microcomputers of the type frequently used for data acquisition applications. The principles described are, of course, equally valid for other kinds of system. (Auth.)

  18. Join Operations in Temporal Databases

    DEFF Research Database (Denmark)

    Gao, D.; Jensen, Christian Søndergaard; Snodgrass, R.T.

    2005-01-01

    Joins are arguably the most important relational operators. Poor implementations are tantamount to computing the Cartesian product of the input relations. In a temporal database, the problem is more acute for two reasons. First, conventional techniques are designed for the evaluation of joins...... with equality predicates rather than the inequality predicates prevalent in valid-time queries. Second, the presence of temporally varying data dramatically increases the size of a database. These factors indicate that specialized techniques are needed to efficiently evaluate temporal joins. We address...... this need for efficient join evaluation in temporal databases. Our purpose is twofold. We first survey all previously proposed temporal join operators. While many temporal join operators have been defined in previous work, this work has been done largely in isolation from competing proposals, with little...
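
    To make the inequality-predicate point concrete, here is a naive valid-time overlap join in Python (illustrative only; the survey is precisely about algorithms that avoid a nested loop like this one). Tuples carry half-open validity intervals [from, to); the relations and attribute names are invented.

      from collections import namedtuple

      Row = namedtuple("Row", "key value valid_from valid_to")   # validity interval is [valid_from, valid_to)

      def temporal_join(r, s):
          # join on key equality AND overlap of the valid-time intervals (an inequality predicate)
          result = []
          for a in r:
              for b in s:
                  start, end = max(a.valid_from, b.valid_from), min(a.valid_to, b.valid_to)
                  if a.key == b.key and start < end:
                      result.append((a.key, a.value, b.value, start, end))
          return result

      # hypothetical employee and department assignments with validity periods (in months)
      emp  = [Row("e1", "Alice", 1, 7), Row("e1", "Alice", 9, 12)]
      dept = [Row("e1", "Sales", 5, 10)]
      print(temporal_join(emp, dept))   # [('e1', 'Alice', 'Sales', 5, 7), ('e1', 'Alice', 'Sales', 9, 10)]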

  19. Simulation Validation for Societal Systems

    National Research Council Canada - National Science Library

    Yahja, Alex

    2006-01-01

    .... There are however, substantial obstacles to validation. The nature of modeling means that there are implicit model assumptions, a complex model space and interactions, emergent behaviors, and uncodified and inoperable simulation and validation knowledge...

  20. Audit Validation Using Ontologies

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2015-01-01

    Full Text Available Requirements for increasing the quality of audit processes in enterprises are defined, and the need to assess and manage audit processes using ontologies is substantiated. Sets of rules, and ways to assess the consistency of rules and behavior within the organization, are defined. Using ontologies, qualifications that assess the organization's audit are obtained. Elaboration of audit reports is an algorithm-based activity characterized by generality, determinism, reproducibility, accuracy and a well-established structure, and the auditors obtain effective levels; through ontologies, the calculated audit level is obtained. Because the audit report is a qualitative structure of information and knowledge, it is very hard to analyze and interpret for different groups of users (shareholders, managers or stakeholders). Developing an ontology for audit report validation will be a useful instrument for both auditors and report users. In this paper we propose an instrument for the validation of audit reports that comprises a set of keywords, an indicator for each keyword, qualitative levels, and an interpreter that builds a table of indicators and compares actual with calculated levels.

  1. Validation of dengue infection severity score

    Directory of Open Access Journals (Sweden)

    Pongpan S

    2014-03-01

    Full Text Available Surangrat Pongpan,1,2 Jayanton Patumanond,3 Apichart Wisitwong,4 Chamaiporn Tawichasri,5 Sirianong Namwongprom1,6 1Clinical Epidemiology Program, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 2Department of Occupational Medicine, Phrae Hospital, Phrae, Thailand; 3Clinical Epidemiology Program, Faculty of Medicine, Thammasat University, Bangkok, Thailand; 4Department of Social Medicine, Sawanpracharak Hospital, Nakorn Sawan, Thailand; 5Clinical Epidemiology Society at Chiang Mai, Chiang Mai, Thailand; 6Department of Radiology, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand Objective: To validate a simple scoring system to classify dengue viral infection severity to patients in different settings. Methods: The developed scoring system derived from 777 patients from three tertiary-care hospitals was applied to 400 patients in the validation data obtained from another three tertiary-care hospitals. Percentage of correct classification, underestimation, and overestimation was compared. The score discriminative performance in the two datasets was compared by analysis of areas under the receiver operating characteristic curves. Results: Patients in the validation data were different from those in the development data in some aspects. In the validation data, classifying patients into three severity levels (dengue fever, dengue hemorrhagic fever, and dengue shock syndrome) yielded 50.8% correct prediction (versus 60.7% in the development data), with clinically acceptable underestimation (18.6% versus 25.7%) and overestimation (30.8% versus 13.5%). Despite the difference in predictive performances between the validation and the development data, the overall prediction of the scoring system is considered high. Conclusion: The developed severity score may be applied to classify patients with dengue viral infection into three severity levels with clinically acceptable under- or overestimation. Its impact when used in routine
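
    The headline numbers above (correct classification, under- and overestimation) reduce to a comparison of predicted and observed ordinal severity levels; a minimal sketch with made-up labels (0 = dengue fever, 1 = dengue hemorrhagic fever, 2 = dengue shock syndrome):

      import numpy as np

      def classification_summary(observed, predicted):
          # shares of patients whose predicted severity level is correct, too low, or too high
          observed, predicted = np.asarray(observed), np.asarray(predicted)
          return {"correct": float(np.mean(predicted == observed)),
                  "underestimated": float(np.mean(predicted < observed)),
                  "overestimated": float(np.mean(predicted > observed))}

      # hypothetical validation sample of 10 patients
      observed  = [0, 0, 1, 2, 1, 0, 2, 1, 0, 2]
      predicted = [0, 1, 1, 1, 1, 0, 2, 2, 0, 2]
      print(classification_summary(observed, predicted))
      # discrimination between datasets would additionally be compared via areas under ROC curves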

  2. A Survey on Operator Monotonicity, Operator Convexity, and Operator Means

    Directory of Open Access Journals (Sweden)

    Pattrawut Chansangiam

    2015-01-01

    Full Text Available This paper is an expository survey devoted to an important class of real-valued functions introduced by Löwner, namely, operator monotone functions. This concept is closely related to operator convex/concave functions. Various characterizations for such functions are given from the viewpoint of differential analysis in terms of the matrix of divided differences. From the viewpoint of operator inequalities, various characterizations and the relationship between operator monotonicity and operator convexity are given by Hansen and Pedersen. From the viewpoint of measure theory, operator monotone functions on the nonnegative reals admit meaningful integral representations with respect to Borel measures on the unit interval. Furthermore, Kubo-Ando theory asserts the correspondence between operator monotone functions and operator means.
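
    For readers unfamiliar with the objects surveyed here, the two central notions can be stated compactly (standard formulations from the general literature, not quotations from this paper): a function f on (0, \infty) is operator monotone if

      0 < A \le B \;\Longrightarrow\; f(A) \le f(B)

    for all positive (bounded, self-adjoint) operators A and B, and the Kubo-Ando theorem associates with every operator mean \sigma a unique positive operator monotone function f with f(1) = 1 through

      A \,\sigma\, B \;=\; A^{1/2}\, f\big(A^{-1/2} B A^{-1/2}\big)\, A^{1/2}, \qquad A, B > 0.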

  3. Overview of SCIAMACHY validation: 2002 2004

    Science.gov (United States)

    Piters, A. J. M.; Bramstedt, K.; Lambert, J.-C.; Kirchhoff, B.

    2005-08-01

    SCIAMACHY, on board Envisat, is now in operation for almost three years. This UV/visible/NIR spectrometer measures the solar irradiance, the earthshine radiance scattered at nadir and from the limb, and the attenuation of solar radiation by the atmosphere during sunrise and sunset, from 240 to 2380 nm and at moderate spectral resolution. Vertical columns and profiles of a variety of atmospheric constituents are inferred from the SCIAMACHY radiometric measurements by dedicated retrieval algorithms. With the support of ESA and several international partners, a methodical SCIAMACHY validation programme has been developed jointly by Germany, the Netherlands and Belgium (the three instrument providing countries) to face complex requirements in terms of measured species, altitude range, spatial and temporal scales, geophysical states and intended scientific applications. This summary paper describes the approach adopted to address those requirements. The actual validation of the operational SCIAMACHY processors established at DLR on behalf of ESA has been hampered by data distribution and processor problems. Since first data releases in summer 2002, operational processors were upgraded regularly and some data products - level-1b spectra, level-2 O3, NO2, BrO and clouds data - have improved significantly. Validation results summarised in this paper conclude that for limited periods and geographical domains they can already be used for atmospheric research. Nevertheless, remaining processor problems cause major errors preventing from scientific usability in other periods and domains. Untied to the constraints of operational processing, seven scientific institutes (BIRA-IASB, IFE, IUP-Heidelberg, KNMI, MPI, SAO and SRON) have developed their own retrieval algorithms and generated SCIAMACHY data products, together addressing nearly all targeted constituents. Most of the UV-visible data products (both columns and profiles) already have acceptable, if not excellent, quality

  4. CTF Validation and Verification Manual

    Energy Technology Data Exchange (ETDEWEB)

    Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Blyth, Taylor S. [Pennsylvania State Univ., University Park, PA (United States); Dances, Christopher A. [Pennsylvania State Univ., University Park, PA (United States); Magedanz, Jeffrey W. [Pennsylvania State Univ., University Park, PA (United States); Jernigan, Caleb [Holtec International, Marlton, NJ (United States); Kelly, Joeseph [U.S. Nuclear Regulatory Commission (NRC), Rockville, MD (United States); Toptan, Aysenur [North Carolina State Univ., Raleigh, NC (United States); Gergar, Marcus [Pennsylvania State Univ., University Park, PA (United States); Gosdin, Chris [Pennsylvania State Univ., University Park, PA (United States); Avramova, Maria [Pennsylvania State Univ., University Park, PA (United States); Palmtag, Scott [Core Physics, Inc., Cary, NC (United States); Gehin, Jess C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-25

    Coolant-Boiling in Rod Arrays - Two Fluids (COBRA-TF) is a Thermal/Hydraulic (T/H) simulation code designed for Light Water Reactor (LWR) analysis. It uses a two-fluid, three-field (i.e. fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of nine conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last several decades. COBRA-TF is also used at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG), where it has been improved and updated, subsequently becoming the PSU RDFMG version of COBRA-TF (CTF). One part of the improvement process is validation of the methods in CTF. This document seeks to establish confidence in the predictive capabilities of the code for the scenarios it was designed to model: rod bundle geometries with operating conditions that are representative of prototypical Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs) in both normal and accident conditions. This is done by modeling a variety of experiments that simulate these scenarios and then presenting a qualitative and quantitative analysis of the results that demonstrates the accuracy with which CTF captures specific quantities of interest.
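
    The quantitative part of such a validation exercise typically reduces to comparing code predictions against measured quantities of interest and summarizing the error. The sketch below shows one common way to do that (bias and RMS error); the data and the choice of metrics are illustrative assumptions, not taken from the CTF manual.

```python
import math

# Hypothetical predicted vs. measured values of a quantity of interest
# (e.g., void fraction at several axial locations); illustrative numbers only.
predicted = [0.12, 0.25, 0.41, 0.55, 0.68]
measured  = [0.10, 0.27, 0.39, 0.58, 0.66]

errors = [p - m for p, m in zip(predicted, measured)]
bias   = sum(errors) / len(errors)                             # mean error
rms    = math.sqrt(sum(e * e for e in errors) / len(errors))   # root-mean-square error

print(f"bias = {bias:+.3f}, RMS error = {rms:.3f}")
```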

  5. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method and cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  6. Operator Arithmetic-Harmonic Mean Inequality on Krein Spaces

    Directory of Open Access Journals (Sweden)

    M. Dehghani

    2014-03-01

    Full Text Available We prove an operator arithmetic-harmonic mean type inequality in the Krein space setting, using some block matrix techniques of indefinite type. We also give an example showing that the operator arithmetic-geometric-harmonic mean inequality for two invertible selfadjoint operators on Krein spaces is not valid in general.
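
    For positive invertible operators on a Hilbert space, the arithmetic-harmonic mean inequality referred to here is usually written as below (a standard formulation, not quoted from the paper); the example in the paper shows that an analogue can fail for selfadjoint operators on a Krein space.

```latex
\[
  \frac{A + B}{2} \;\ge\; 2\,\bigl(A^{-1} + B^{-1}\bigr)^{-1},
  \qquad A, B > 0 .
\]
```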

  7. Validation of the Social Inclusion Scale with Students

    Directory of Open Access Journals (Sweden)

    Ceri Wilson

    2015-07-01

    Full Text Available Interventions (such as participatory arts projects) aimed at increasing social inclusion are increasingly in operation, as social inclusion is proving to play a key role in recovery from mental ill health and the promotion of mental wellbeing. These interventions require evaluation with a systematically developed and validated measure of social inclusion; however, a “gold-standard” measure does not yet exist. The Social Inclusion Scale (SIS) has three subscales measuring social isolation, relations and acceptance. This scale has been partially validated with arts and mental health project users, demonstrating good internal consistency. However, test-retest reliability and construct validity require assessment, along with validation in the general population. The present study aimed to validate the SIS in a sample of university students. Test-retest reliability, internal consistency, and convergent validity (one aspect of construct validity) were assessed by comparing SIS scores with scores on other measures of social inclusion and related concepts. Participants completed the measures at two time-points seven to 14 days apart. The SIS demonstrated high internal consistency and test-retest reliability, although convergent validity was less well-established and possible reasons for this are discussed. This systematic validation of the SIS represents a further step towards the establishment of a “gold-standard” measure of social inclusion.

  8. Validation of a proposal for evaluating hospital infection control programs.

    Science.gov (United States)

    Silva, Cristiane Pavanello Rodrigues; Lacerda, Rúbia Aparecida

    2011-02-01

    To validate the construct and discriminant properties of a hospital infection prevention and control program. The program consisted of four indicators: technical-operational structure; operational prevention and control guidelines; epidemiological surveillance system; and prevention and control activities. These indicators, with previously validated content, were applied to 50 healthcare institutions in the city of São Paulo, Southeastern Brazil, in 2009. Descriptive statistics were used to characterize the hospitals and indicator scores, and Cronbach's α coefficient was used to evaluate the internal consistency. The discriminant validity was analyzed by comparing indicator scores between groups of hospitals: with versus without quality certification. The construct validity analysis was based on exploratory factor analysis with a tetrachoric correlation matrix. The indicators for the technical-operational structure and epidemiological surveillance presented almost 100% conformity in the whole sample. The indicators for the operational prevention and control guidelines and the prevention and control activities presented internal consistency ranging from 0.67 to 0.80. The discriminant validity of these indicators indicated higher and statistically significant mean conformity scores among the group of institutions with healthcare certification or accreditation processes. In the construct validation, two dimensions were identified for the operational prevention and control guidelines: recommendations for preventing hospital infection and recommendations for standardizing prophylaxis procedures, with good correlation between the analysis units that formed the guidelines. The same was found for the prevention and control activities: interfaces with treatment units and support units were identified. Validation of the measurement properties of the hospital infection prevention and control program indicators made it possible to develop a tool for evaluating these programs
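
    Internal consistency of an indicator composed of several items, as reported above, is commonly summarized with Cronbach's α. The sketch below computes α from an item-by-respondent score matrix; the data are made up for illustration and the computation is generic, not the study's analysis code.

```python
def cronbach_alpha(items):
    """items: list of item score lists, all of equal length (one list per item)."""
    k = len(items)        # number of items
    n = len(items[0])     # number of respondents

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Hypothetical binary conformity scores (3 items, 5 institutions)
print(round(cronbach_alpha([[1, 0, 1, 1, 0],
                            [1, 1, 1, 0, 0],
                            [0, 0, 1, 1, 0]]), 2))
```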

  9. Content validation of the nursing diagnosis Nausea

    Directory of Open Access Journals (Sweden)

    Daniele Alcalá Pompeo

    2014-02-01

    Full Text Available This study aimed to evaluate the content validity of the nursing diagnosis of nausea in the immediate post-operative period, considering Fehring's model. Descriptive study with 52 nurse experts who responded to an instrument containing identification data and items for validation of the nausea diagnosis. Most experts considered domain 12 (Comfort), Class 1 (Physical Comfort) and the statement (Nausea) adequate to the diagnosis. Modifications were suggested in the current definition of this nursing diagnosis. Four defining characteristics were considered primary (reported nausea, increased salivation, aversion to food and vomiting sensation) and eight secondary (increased swallowing, sour taste in the mouth, pallor, tachycardia, diaphoresis, sensation of hot and cold, changes in blood pressure and pupil dilation). The total score for the diagnosis of nausea was 0.79. Reports of nausea, vomiting sensation, increased salivation and aversion to food are strong predictors of the nursing diagnosis of nausea.

  10. Experimental validation of prototype high voltage bushing

    Science.gov (United States)

    Shah, Sejal; Tyagi, H.; Sharma, D.; Parmar, D.; M. N., Vishnudev; Joshi, K.; Patel, K.; Yadav, A.; Patel, R.; Bandyopadhyay, M.; Rotti, C.; Chakraborty, A.

    2017-08-01

    Prototype High Voltage Bushing (PHVB) is a scaled-down configuration of the DNB High Voltage Bushing (HVB) of ITER. It is designed for operation at 50 kV DC to ensure operational performance and thereby confirm the design configuration of the DNB HVB. Two concentric insulators, viz. ceramic and fiber-reinforced polymer (FRP) rings, are used as a double-layered vacuum boundary for 50 kV isolation between the grounded and high voltage flanges. Stress shields are designed for smooth electric field distribution. During ceramic-to-Kovar brazing, spilling cannot be controlled, which may lead to high localized electrostatic stress. To understand the spilling phenomenon and allow precise stress calculation, quantitative analysis was performed using Scanning Electron Microscopy (SEM) of a brazed sample, and a similar configuration was modeled in the Finite Element (FE) analysis. FE analysis of the PHVB is performed to determine electrical stresses in different areas of the PHVB, which are maintained similar to those of the DNB HV Bushing. With this configuration, the experiment is performed considering ITER-like vacuum and electrical parameters. The initial HV test is performed with temporary vacuum sealing arrangements using gaskets/O-rings at both ends in order to achieve the desired vacuum and keep the system maintainable. During the validation test, a 50 kV withstand voltage is applied for one hour. A withstand test at 60 kV DC (20% above rated voltage) has also been performed without any breakdown. Successful operation of the PHVB confirms the design of the DNB HV Bushing. In this paper, the configuration of the PHVB with experimental validation data is presented.

  11. Guided exploration of physically valid shapes for furniture design

    KAUST Repository

    Umetani, Nobuyuki

    2012-07-01

    Geometric modeling and the physical validity of shapes are traditionally considered independently. This makes creating aesthetically pleasing yet physically valid models challenging. We propose an interactive design framework for efficient and intuitive exploration of geometrically and physically valid shapes. During any geometric editing operation, the proposed system continuously visualizes the valid range of the parameter being edited. When one or more constraints are violated after an operation, the system generates multiple suggestions involving both discrete and continuous changes to restore validity. Each suggestion also comes with an editing mode that simultaneously adjusts multiple parameters in a coordinated way to maintain validity. Thus, while the user focuses on the aesthetic aspects of the design, our computational design framework helps to achieve physical realizability by providing active guidance to the user. We demonstrate our framework on plank-based furniture design with nail-joint and frictional constraints. We use our system to design a range of examples, conduct a user study, and also fabricate a physical prototype to test the validity and usefulness of the system.
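
    Continuously visualizing the valid range of an edited parameter can be thought of as repeatedly evaluating the physical constraints while sweeping the parameter and locating where a constraint is first violated. The sketch below illustrates that idea with a bisection search against a placeholder constraint check; the `is_valid` predicate and the bracketing values are hypothetical stand-ins, not the authors' solver.

```python
def find_validity_limit(is_valid, p_valid, p_invalid, tol=1e-4):
    """Bisect for the boundary of the valid range of one design parameter.

    is_valid(p): returns True if the design satisfies all physical constraints at p.
    p_valid / p_invalid: parameter values known to be valid / invalid.
    """
    while abs(p_invalid - p_valid) > tol:
        mid = 0.5 * (p_valid + p_invalid)
        if is_valid(mid):
            p_valid = mid
        else:
            p_invalid = mid
    return p_valid

# Hypothetical constraint: a shelf length above 1.25 m makes the plank structure unstable.
limit = find_validity_limit(lambda length: length <= 1.25, p_valid=0.5, p_invalid=3.0)
print(f"valid up to about {limit:.3f} m")
```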

  12. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in the future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided and show how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  13. Your Lung Operation: After Your Operation

    Medline Plus


  14. Your Lung Operation: After Your Operation

    Medline Plus


  15. Signal validation with control-room information-processing computers

    International Nuclear Information System (INIS)

    Belblidia, L.A.; Carlson, R.W.; Russell, J.L. Jr.

    1985-01-01

    One of the 'lessons learned' from the Three Mile Island accident focuses upon the need for a validated source of plant-status information in the control room. The utilization of computer-generated graphics to display the readings of the major plant instrumentation has introduced the capability of validating signals prior to their presentation to the reactor operations staff. The current operations philosophies allow the operator a quick look at the gauges to form an impression of the fraction of full scale as the basis for knowledge of the current plant conditions. After the introduction of a computer-based information-display system such as the Safety Parameter Display System (SPDS), operational decisions can be based upon precise knowledge of the parameters that define the operation of the reactor and auxiliary systems. The principal impact of this system on the operator will be to remove the continuing concern for the validity of the instruments which provide the information that governs the operator's decisions. (author)

  16. Site characterization and validation

    International Nuclear Information System (INIS)

    Olsson, O.; Eriksson, J.; Falk, L.; Sandberg, E.

    1988-04-01

    The borehole radar investigation program of the SCV-site (Site Characterization and Validation) has comprised single hole reflection measurements with centre frequencies of 22, 45, and 60 MHz. The radar range obtained in the single hole reflection measurements was approximately 100 m for the lower frequency (22 MHz) and about 60 m for the centre frequency 45 MHz. In the crosshole measurements transmitter-receiver separations from 60 to 200 m have been used. The radar investigations have given a three dimensional description of the structure at the SCV-site. A generalized model of the site has been produced which includes three major zones, four minor zones and a circular feature. These features are considered to be the most significant at the site. Smaller features than the ones included in the generalized model certainly exist but no additional features comparable to the three major zones are thought to exist. The results indicate that the zones are not homogeneous but rather that they are highly irregular containing parts of considerably increased fracturing and parts where their contrast to the background rock is quite small. The zones appear to be approximately planar at least at the scale of the site. At a smaller scale the zones can appear quite irregular. (authors)

  17. Spare Items validation

    International Nuclear Information System (INIS)

    Fernandez Carratala, L.

    1998-01-01

    There is increasing difficulty in purchasing safety-related spare items with manufacturer certifications that maintain the original qualification of the equipment of destination. The main reasons are, on top of the natural evolution of the technology applied to newly manufactured components, the discontinuation of nuclear-specific production lines and the evolution of manufacturers' quality systems, originally based on nuclear codes and standards, towards conventional industry standards. To face this problem, different dedication processes have been implemented for many years to verify whether a commercial-grade item is acceptable for use in safety-related applications. In the same way, due to its particular position regarding spare part supplies, mainly from markets other than the American one, C.N. Trillo has developed a methodology called Spare Items Validation. This methodology, originally based on dedication processes, is not a single process but a group of coordinated processes involving engineering, quality and management activities. These are performed on the spare item itself, its design control, its fabrication and its supply, allowing its use in destinations with specific requirements. The scope of application covers not only safety-related items, but also complex-design, high-cost or plant-reliability-related components. The implementation at C.N. Trillo has been mainly carried out by merging, modifying and making the most of processes and activities which were already being performed in the company. (Author)

  18. Reprocessing input data validation

    International Nuclear Information System (INIS)

    Persiani, P.J.; Bucher, R.G.; Pond, R.B.; Cornella, R.J.

    1990-01-01

    The Isotope Correlation Technique (ICT), in conjunction with the gravimetric (Pu/U ratio) method for mass determination, provides an independent verification of the input accountancy at the dissolver or accountancy stage of the reprocessing plant. The Isotope Correlation Technique has been applied to many classes of domestic and international reactor systems (light-water, heavy-water, graphite, and liquid-metal) operating in a variety of modes (power, research, production, and breeder), and for a variety of reprocessing fuel cycle management strategies. Analysis of reprocessing operations data, based on isotopic correlations derived for assemblies in a PWR environment and fuel management scheme, yielded differences between the measurement-derived and ICT-derived plutonium mass determinations of (-0.02 ± 0.23)% for the measured U-235 and (+0.50 ± 0.31)% for the measured Pu-239, for a core campaign. ICT analyses have been implemented for the plutonium isotopics in a depleted uranium assembly in a heavy-water, enriched uranium system and for the uranium isotopes in the fuel assemblies in light-water, highly-enriched systems. 7 refs., 5 figs., 4 tabs

  19. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  20. Regional Test Center Operations Manual

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burnham, Laurie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Christian Birk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    The U.S. DOE Regional Test Center for Solar Technologies program was established to validate photovoltaic (PV) technologies installed in a range of different climates. The program is funded by the Energy Department's SunShot Initiative. The initiative seeks to make solar energy cost-competitive with other forms of electricity by the end of the decade. Sandia National Laboratories currently manages four different sites across the country. The National Renewable Energy Laboratory manages a fifth site in Colorado. The entire PV portfolio currently includes 20 industry partners and almost 500 kW of installed systems. The program follows a defined process that outlines tasks, milestones, agreements, and deliverables. The process is broken out into four main parts: 1) planning and design, 2) installation, 3) operations, and 4) decommissioning. This operations manual defines the various elements of each part.

  1. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
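
    The statement that radiochemical results carry an uncertainty that can be statistically compared is commonly implemented as a simple significance test of the difference between two values. The sketch below is a generic two-result comparison assuming independent, approximately normal uncertainties; it is an illustration of the principle, not a prescribed validation procedure.

```python
import math

def differ_significantly(x1, u1, x2, u2, k=2.0):
    """Compare two results x1 +/- u1 and x2 +/- u2 (1-sigma uncertainties).

    Returns True if the difference exceeds k times its combined uncertainty
    (k = 2 corresponds roughly to a 95% confidence criterion).
    """
    combined = math.sqrt(u1**2 + u2**2)
    return abs(x1 - x2) > k * combined

# Hypothetical duplicate analyses of the same sample (Bq/g)
print(differ_significantly(12.4, 0.6, 13.1, 0.5))   # False: consistent within 2 sigma
print(differ_significantly(12.4, 0.2, 13.4, 0.2))   # True: significant difference
```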

  2. Validation of gamma irradiator controls for quality and regulatory compliance

    International Nuclear Information System (INIS)

    Harding, R.B.; Pinteric, F.J.A.

    1995-01-01

    Since 1978 the U.S. Food and Drug Administration (FDA) has had both the legal authority and the Current Good Manufacturing Practice (CGMP) regulations in place to require irradiator owners who process medical devices to produce evidence of Irradiation Process Validation. One of the key components of Irradiation Process Validation is the validation of the irradiator controls. However, it is only recently that FDA audits have focused on this component of the process validation. What is Irradiator Control System Validation? What constitutes evidence of control? How do owners obtain evidence? What is the irradiator supplier's role in validation? How does the ISO 9000 Quality Standard relate to the FDA's CGMP requirement for evidence of Control System Validation? This paper presents answers to these questions based on the recent experiences of Nordion's engineering and product management staff who have worked with several US-based irradiator owners. This topic - Validation of Irradiator Controls - is a significant regulatory compliance and operations issue within the irradiator suppliers' and users' community. (author)

  3. Space station operations management

    Science.gov (United States)

    Cannon, Kathleen V.

    1989-01-01

    Space Station Freedom operations management concepts must be responsive to the unique challenges presented by the permanently manned international laboratory. Space Station Freedom will be assembled over a three-year period during which the operational environment will change as significant capability plateaus are reached. First Element Launch, Man-Tended Capability, and Permanent Manned Capability represent milestones in operational capability that increase toward mature operations capability. Operations management concepts are being developed to accommodate the varying operational capabilities during assembly, as well as the mature operational environment. This paper describes operations management concepts designed to accommodate the uniqueness of Space Station Freedom, utilizing tools and processes that seek to control operations costs.

  4. Validation of human factor engineering integrated system

    International Nuclear Information System (INIS)

    Fang Zhou

    2013-01-01

    Apart from hundreds of thousands of human-machine interface resources, the control room of a nuclear power plant is a complex system integrated with many factors such as procedures, operators, environment, organization and management. In the design stage, these factors are considered by different organizations separately. However, whether the above factors can cooperate with each other well in operation, and whether they have a good human factors engineering (HFE) design to avoid human error, should be answered in validation of the HFE integrated system before delivery of the plant. This paper addresses the research and implementation of integrated system validation (ISV) technology based on a case study. After an introduction to the background, process and methodology of ISV, the results of the test are discussed. At last, lessons learned from this research are summarized. (authors)

  5. Experimental validation of UTDefect

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, A.S. [ABB Tekniska Roentgencentralen AB, Taeby (Sweden); Bostroem, A.; Wirdelius, H. [Chalmers Univ. of Technology, Goeteborg (Sweden). Div. of Mechanics

    1997-01-01

    This study reports on conducted experiments and computer simulations of ultrasonic nondestructive testing (NDT). Experiments and simulations are compared with the purpose of validating the simulation program UTDefect. UTDefect simulates ultrasonic NDT of cracks and some other defects in isotropic and homogeneous materials. Simulations for the detection of surface breaking cracks are compared with experiments in pulse-echo mode on surface breaking cracks in carbon steel plates. The echo dynamics are plotted and compared with the simulations. The experiments are performed on a plate with thickness 36 mm and the crack depths are 7.2 mm and 18 mm. L- and T-probes with frequencies of 1, 2 and 4 MHz and angles of 45, 60 and 70 deg are used. In most cases the probe and the crack are on opposite sides of the plate, but in some cases they are on the same side. Several cracks are scanned from two directions. In total, 53 experiments are reported for 33 different combinations. Generally the simulations agree well with the experiments, and UTDefect is shown to be able, within certain limits, to perform simulations that are close to experiments. It may be concluded that: For corner echoes the eight 45 deg cases and the eight 60 deg cases show good agreement between experiments and UTDefect, especially for the 7.2 mm crack. The amplitudes differ more for some cases where the defect is close to the probe and for the corner of the 18 mm crack. For the two 70 deg cases there are too few experimental values to compare the curve shapes, but the amplitudes do not differ too much. The tip diffraction echoes also agree well in general. For some cases, where the defect is close to the probe, the amplitudes differ more than 10-15 dB, but for all but two cases the difference in amplitude is less than 7 dB. 6 refs.

  6. Signal validation in nuclear power plants: Progress report No. 3

    International Nuclear Information System (INIS)

    Kerlin, T.W.; Upadhyaya, B.R.

    1987-01-01

    This report summarizes the progress in the Signal Validation Project. Specifically, the advances made in several of the modules have been described. Some of these modules are now ready for preliminary implementation using plant operational data. Arrangements with Northeast Utilities Service Company (NUSCO) for transferring several sets of plant operational data from Millstone-3 PWR have been made in preparation for this phase of the project

  7. Heat transfer operators associated with quantum operations

    International Nuclear Information System (INIS)

    Aksak, C; Turgut, S

    2011-01-01

    Any quantum operation applied on a physical system is performed as a unitary transformation on a larger extended system. If the extension used is a heat bath in thermal equilibrium, the concomitant change in the state of the bath necessarily implies a heat exchange with it. The dependence of the average heat transferred to the bath on the initial state of the system can then be found from the expectation value of a Hermitian operator, which is named as the heat transfer operator (HTO). The purpose of this paper is to investigate the relation between the HTOs and the associated quantum operations. Since any given quantum operation on a system can be realized by different baths and unitaries, many different HTOs are possible for each quantum operation. On the other hand, there are also strong restrictions on the HTOs which arise from the unitarity of the transformations. The most important of these is the Landauer erasure principle. This paper is concerned with the question of finding a complete set of restrictions on the HTOs that are associated with a given quantum operation. An answer to this question has been found only for a subset of quantum operations. For erasure operations, these characterizations are equivalent to the generalized Landauer erasure principle. For the case of generic quantum operations, however, it appears that the HTOs obey further restrictions which cannot be obtained from the entropic restrictions of the generalized Landauer erasure principle.
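
    In standard notation for unitary dilations (the paper's own symbols may differ), the statements above can be sketched as follows; the Landauer-type restriction is given in one common form and is an assumption about the intended formulation, not a quotation from the paper.

```latex
% A quantum operation realized by a bath B in thermal state \rho_B and a global unitary U:
\[
  \mathcal{E}(\rho) \;=\; \operatorname{Tr}_B\!\bigl[\,U\,(\rho \otimes \rho_B)\,U^{\dagger}\bigr].
\]
% Average heat transferred to the bath as the expectation value of a Hermitian
% heat transfer operator Q_H in the initial system state \rho:
\[
  \langle Q \rangle \;=\; \operatorname{Tr}\bigl(\rho\, Q_H\bigr).
\]
% Landauer-type restriction at bath temperature T (S = von Neumann entropy):
\[
  \langle Q \rangle \;\ge\; k_B T\,\bigl[\,S(\rho) - S\bigl(\mathcal{E}(\rho)\bigr)\bigr].
\]
```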

  8. Cleaning Validation of Fermentation Tanks

    DEFF Research Database (Denmark)

    Salo, Satu; Friis, Alan; Wirtanen, Gun

    2008-01-01

    Reliable test methods for checking cleanliness are needed to evaluate and validate the cleaning process of fermentation tanks. Pilot scale tanks were used to test the applicability of various methods for this purpose. The methods found to be suitable for validation of the cleanliness were visual

  9. The validation of language tests

    African Journals Online (AJOL)

    KATEVG

    Stellenbosch Papers in Linguistics, Vol. ... validation is necessary because of the major impact which test results can have on the many ... Messick (1989: 20) introduces his much-quoted progressive matrix (cf. table 1), which ... argue that current accounts of validity only superficially address theories of measurement.

  10. Validity in SSM: neglected areas

    NARCIS (Netherlands)

    Pala, O.; Vennix, J.A.M.; Mullekom, T.L. van

    2003-01-01

    Contrary to the prevailing notion in hard OR, in soft system methodology (SSM), validity seems to play a minor role. The primary reason for this is that SSM models are of a different type, they are not would-be descriptions of real-world situations. Therefore, establishing their validity, that is

  11. The Consequences of Consequential Validity.

    Science.gov (United States)

    Mehrens, William A.

    1997-01-01

    There is no agreement at present about the importance or meaning of the term "consequential validity." It is important that the authors of revisions to the "Standards for Educational and Psychological Testing" recognize the debate and relegate discussion of consequences to a context separate from the discussion of validity.…

  12. Current Concerns in Validity Theory.

    Science.gov (United States)

    Kane, Michael

    Validity is concerned with the clarification and justification of the intended interpretations and uses of observed scores. It has not been easy to formulate a general methodology set of principles for validation, but progress has been made, especially as the field has moved from relatively limited criterion-related models to sophisticated…

  13. Open-Source as a strategy for operational software - the case of Enki

    Science.gov (United States)

    Kolberg, Sjur; Bruland, Oddbjørn

    2014-05-01

    Since 2002, SINTEF Energy has been developing what is now known as the Enki modelling system. This development has been financed by Norway's largest hydropower producer Statkraft, motivated by a desire for distributed hydrological models in operational use. As the owner of the source code, Statkraft has recently decided on Open Source as a strategy for further development, and for migration from an R&D context to operational use. A cooperation project is currently being carried out between SINTEF Energy, seven large Norwegian hydropower producers including Statkraft, three universities and one software company. Of course, the most immediate task is that of software maturing. A more important challenge, however, is one of gaining experience within the operational hydropower industry. A transition from lumped to distributed models is likely to also require revision of measurement programs, calibration strategy, and the use of GIS and modern data sources like weather radar and satellite imagery. On the other hand, map-based visualisations enable a richer information exchange between hydrologic forecasters and power market traders. The operating context of a distributed hydrology model within hydropower planning is far from settled. Being both a modelling framework and a library of plugin routines to build models from, Enki supports the flexibility needed in this situation. Recent development has separated the core from the user interface, paving the way for a scripting API, cross-platform compilation, and front-end programs serving different degrees of flexibility, robustness and security. The open source strategy invites anyone to use Enki and to develop and contribute new modules. Once tested, the same modules are available for the operational versions of the program. A core challenge is to offer rigorous testing procedures and mechanisms to reject routines in an operational setting, without limiting experimentation with new modules. The Open Source strategy also has

  14. Animal Feeding Operations

    Science.gov (United States)

    What are Animal Feeding Operations (AFOs)? According to the United States ...

  15. IP validation in remote microelectronics testing

    Science.gov (United States)

    Osseiran, Adam; Eshraghian, Kamran; Lachowicz, Stefan; Zhao, Xiaoli; Jeffery, Roger; Robins, Michael

    2004-03-01

    This paper presents the test and validation of FPGA-based IP using the concept of remote testing. It demonstrates how a virtual tester environment based on a powerful, networked Integrated Circuit testing facility, aimed at complementing the emerging Australian microelectronics-based research and development, can be employed to perform tasks beyond the standard IC test. IC testing in production consists of verifying the tested products and eliminating defective parts. Defects could have a number of different causes, including process defects, process migration and IP design and implementation errors. One of the challenges in semiconductor testing is that while current fault models are used to represent likely faults (stuck-at, delay, etc.) in a global context, they do not account for all possible defects. Research in this field keeps growing, but the high cost of ATE is preventing a large community from accessing test and verification equipment to validate innovative IP designs. For these reasons a world-class networked IC teletest facility has been established in Australia under the support of the Commonwealth government. The facility is based on a state-of-the-art semiconductor tester operating as a virtual centre spanning Australia and accessible internationally. Through a novel approach the teletest network provides virtual access to the tester on which the DUT has previously been placed. The tester software is then accessible as if the designer were sitting next to the tester. This paper presents the approach used to test and validate FPGA-based IPs using this remote test approach.

  16. Nuclear data to support computer code validation

    International Nuclear Information System (INIS)

    Fisher, S.E.; Broadhead, B.L.; DeHart, M.D.; Primm, R.T. III

    1997-04-01

    The rate of plutonium disposition will be a key parameter in determining the degree of success of the Fissile Materials Disposition Program. Estimates of the disposition rate are dependent on neutronics calculations. To ensure that these calculations are accurate, the codes and data should be validated against applicable experimental measurements. Further, before mixed-oxide (MOX) fuel can be fabricated and loaded into a reactor, the fuel vendors, fabricators, fuel transporters, reactor owners and operators, regulatory authorities, and the Department of Energy (DOE) must accept the validity of design calculations. This report presents sources of neutronics measurements that have potential application for validating reactor physics (predicting the power distribution in the reactor core), predicting the spent fuel isotopic content, predicting the decay heat generation rate, certifying criticality safety of fuel cycle facilities, and ensuring adequate radiation protection at the fuel cycle facilities and the reactor. The U.S. in-reactor experience with MOX fuel is first presented, followed by information related to other aspects of MOX fuel performance that is valuable to this program; however, that data base remains largely proprietary, so this information is not reported here. It is expected that the selected consortium will make the necessary arrangements to procure or have access to the requisite information

  17. Verification and validation methodology of training simulators

    International Nuclear Information System (INIS)

    Hassan, M.W.; Khan, N.M.; Ali, S.; Jafri, M.N.

    1997-01-01

    A full scope training simulator comprising 109 plant systems of a 300 MWe PWR plant, contracted by the Pakistan Atomic Energy Commission (PAEC) from China, is near completion. The simulator is distinctive in that it will be ready prior to fuel loading. The models for the full scope training simulator have been developed under the APROS (Advanced PROcess Simulator) environment developed by the Technical Research Center (VTT) and Imatran Voima (IVO) of Finland. The replicated control room of the plant is contracted from the Shanghai Nuclear Engineering Research and Design Institute (SNERDI), China. The development of simulation models to represent all the systems of the target plant that contribute to plant dynamics and are essential for operator training has been indigenously carried out at PAEC. This multifunctional simulator is at present under extensive testing and will be interfaced with the control panels in March 1998 so as to realize a full scope training simulator. The validation of the simulator is a joint venture between PAEC and SNERDI. For the individual components and the individual plant systems, the results have been compared against design data and PSAR results to confirm the faithfulness of the simulator against the physical plant systems. The reactor physics parameters have been validated against experimental results and benchmarks generated using design codes. Verification and validation in the integrated state has been performed against benchmark transients conducted using RELAP5/MOD2 for the complete spectrum of anticipated transients covering the five well-known categories. (author)

  18. Conduct of operations: establishing operational focus and setting operational standards

    International Nuclear Information System (INIS)

    Lane, L.; McGuigan, K.

    1998-01-01

    Due to the nature of our business, we have often tended to focus on the technological aspects of the nuclear industry. The focus of this paper is directed towards the importance of addressing the people skills, attitudes, and 'culture' within, and surrounding, our facilities as key areas of improvement. Within Ontario Hydro Nuclear (OLIN) we have developed the terminology 'event free' operation and 'event free' culture. 'Event Free' recognizes errors as a part of human performance. 'Event Free' takes into account human weaknesses, and provides tools (such as standards) to manage, control, and mitigate errors. In essence, 'Event Free' encompasses two concepts: 1. Prevent errors from occurring; 2. If an error is made, catch it before it can affect safe operation of the facility, learn from the error, and ensure that it does not happen again. In addressing these business realities, Ontario Hydro has identified a number of key support mechanisms and corresponding performance standards that are essential for achieving operating excellence and an 'event free' business culture. This paper will discuss two operational aspects of an 'event free' culture, the first being a set of expectations to enhance the culture, and the second an example of cultural change: 1. Operating Standards - establishing clear expectations for human performance in operating staff; 2. Operational Focus - the understanding that, as a nuclear worker, you should consider every task, activity, in fact everything you do in this business, for the potential to affect safe and reliable operation of a nuclear facility. Note that although the term 'Operational' appears in the title, this concept applies to every individual in the nuclear business, from the cleaner, to the Board of Directors, to the external supplier. (author)

  19. Overview of SCIAMACHY validation: 2002–2004

    Directory of Open Access Journals (Sweden)

    A. J. M. Piters

    2006-01-01

    Full Text Available SCIAMACHY, on board Envisat, has been in operation now for almost three years. This UV/visible/NIR spectrometer measures the solar irradiance, the earthshine radiance scattered at nadir and from the limb, and the attenuation of solar radiation by the atmosphere during sunrise and sunset, from 240 to 2380 nm and at moderate spectral resolution. Vertical columns and profiles of a variety of atmospheric constituents are inferred from the SCIAMACHY radiometric measurements by dedicated retrieval algorithms. With the support of ESA and several international partners, a methodical SCIAMACHY validation programme has been developed jointly by Germany, the Netherlands and Belgium (the three instrument providing countries) to face complex requirements in terms of measured species, altitude range, spatial and temporal scales, geophysical states and intended scientific applications. This summary paper describes the approach adopted to address those requirements. Since provisional releases of limited data sets in summer 2002, operational SCIAMACHY processors established at DLR on behalf of ESA were upgraded regularly and some data products – level-1b spectra, level-2 O3, NO2, BrO and clouds data – have improved significantly. Validation results summarised in this paper and also reported in this special issue conclude that for limited periods and geographical domains they can already be used for atmospheric research. Nevertheless, current processor versions still experience known limitations that hamper scientific usability in other periods and domains. Free from the constraints of operational processing, seven scientific institutes (BIRA-IASB, IFE/IUP-Bremen, IUP-Heidelberg, KNMI, MPI, SAO and SRON) have developed their own retrieval algorithms and generated SCIAMACHY data products, together addressing nearly all targeted constituents. Most of the UV-visible data products – O3, NO2, SO2, H2O total columns; BrO, OClO slant columns; O3, NO2, BrO profiles

  20. Overview of SCIAMACHY validation: 2002-2004

    Science.gov (United States)

    Piters, A. J. M.; Bramstedt, K.; Lambert, J.-C.; Kirchhoff, B.

    2006-01-01

    SCIAMACHY, on board Envisat, has been in operation now for almost three years. This UV/visible/NIR spectrometer measures the solar irradiance, the earthshine radiance scattered at nadir and from the limb, and the attenuation of solar radiation by the atmosphere during sunrise and sunset, from 240 to 2380 nm and at moderate spectral resolution. Vertical columns and profiles of a variety of atmospheric constituents are inferred from the SCIAMACHY radiometric measurements by dedicated retrieval algorithms. With the support of ESA and several international partners, a methodical SCIAMACHY validation programme has been developed jointly by Germany, the Netherlands and Belgium (the three instrument providing countries) to face complex requirements in terms of measured species, altitude range, spatial and temporal scales, geophysical states and intended scientific applications. This summary paper describes the approach adopted to address those requirements. Since provisional releases of limited data sets in summer 2002, operational SCIAMACHY processors established at DLR on behalf of ESA were upgraded regularly and some data products - level-1b spectra, level-2 O3, NO2, BrO and clouds data - have improved significantly. Validation results summarised in this paper and also reported in this special issue conclude that for limited periods and geographical domains they can already be used for atmospheric research. Nevertheless, current processor versions still experience known limitations that hamper scientific usability in other periods and domains. Free from the constraints of operational processing, seven scientific institutes (BIRA-IASB, IFE/IUP-Bremen, IUP-Heidelberg, KNMI, MPI, SAO and SRON) have developed their own retrieval algorithms and generated SCIAMACHY data products, together addressing nearly all targeted constituents. Most of the UV-visible data products - O3, NO2, SO2, H2O total columns; BrO, OClO slant columns; O3, NO2, BrO profiles - already have acceptable

  1. PRA (Probabilistic Risk Assessments) Participation versus Validation

    Science.gov (United States)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems and failure data. This methodology basically validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process where the PRA model can be used to determine if the mitigation technique is effective in reducing risk. This can result in more efficient and cost effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information which can positively influence final system and equipment design, the PRA tool can also participate in design development, helping deliver a safe and cost effective product.
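
    As a toy illustration of how a PRA model lets designers compare alternatives, the sketch below evaluates the failure probability of a function implemented either by a single train or with a redundant component (a minimal fault-tree style AND/OR calculation with made-up failure probabilities; real PRAs use full event/fault trees and uncertainty treatment).

```python
def series_failure(probs):
    """System fails if ANY element fails (OR gate, independent events)."""
    p_ok = 1.0
    for p in probs:
        p_ok *= (1.0 - p)
    return 1.0 - p_ok

def redundant_failure(probs):
    """System fails only if ALL redundant elements fail (AND gate, independent events)."""
    p_fail = 1.0
    for p in probs:
        p_fail *= p
    return p_fail

# Hypothetical component failure probabilities per mission
pump, valve = 1e-3, 5e-4

design_a = series_failure([pump, valve])                              # single train
design_b = series_failure([redundant_failure([pump, pump]), valve])   # redundant pumps

print(f"single train: {design_a:.2e}, redundant pumps: {design_b:.2e}")
```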

  2. The LSST operations simulator

    Science.gov (United States)

    Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen

    2014-08-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project, to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific

  3. Saxton Transportation Operations Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Saxton Transportation Operations Laboratory (Saxton Laboratory) is a state-of-the-art facility for conducting transportation operations research. The laboratory...

  4. The measurement of instrumental ADL: content validity and construct validity

    DEFF Research Database (Denmark)

    Avlund, K; Schultz-Larsen, K; Kreiner, S

    1993-01-01

    The analysis showed that 14 items could be combined into two qualitatively different additive scales. The IADL-measure complies with demands for content validity, distinguishes between what the elderly actually do and what they are capable of doing, and is a good discriminator among the group of elderly persons who do not depend on help. It is also possible to add the items in a valid way. However, to obtain valid IADL-scales, we omitted items that were highly relevant to especially elderly women, such as house-work items. We conclude that the criteria employed for this IADL-measure are somewhat contradictory.

  5. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available ... Surgical Skills for Exposure in Trauma Advanced Trauma Life Support Advanced Trauma Operative Management Basic Endovascular Skills for Trauma Disaster Management and Emergency ...

  6. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  7. Use of fuzzy logic in signal processing and validation

    International Nuclear Information System (INIS)

    Heger, A.S.; Alang-Rashid, N.K.; Holbert, K.E.

    1993-01-01

    The advent of fuzzy logic technology has afforded another opportunity to reexamine the signal processing and validation (SPV) process. The features offered by fuzzy logic can lend themselves to a more reliable and perhaps fault-tolerant approach to SPV. This is particularly attractive for complex system operations, where optimal control for safe operation depends on reliable input data. The reason for using fuzzy logic as the tool for SPV is its ability to transform information from the linguistic domain to a mathematical domain for processing, and then to transform the result back into the linguistic domain for presentation. To ensure the safe and optimal operation of a nuclear plant, for example, reliable and valid data must be available to the human and computer operators. Based on these input data, the operators determine the current state of the power plant and project corrective actions for future states. This determination is based on available data and the conceptual and mathematical models for the plant. A fault-tolerant SPV based on fuzzy logic can help the operators meet the objective of effective, efficient, and safe operation of the nuclear power plant. The ultimate product of this project will be a code that will assist plant operators in making informed decisions under uncertain conditions when conflicting signals may be present
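
    The transformation between the linguistic and mathematical domains mentioned above is the familiar fuzzification/defuzzification cycle. The sketch below is a minimal, generic illustration of that idea applied to the deviation between a sensor and a redundant estimate; triangular membership functions and a crude centroid-style defuzzifier are assumed for simplicity, and it is not the code developed in the project.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(deviation):
    """Map a normalized sensor deviation to linguistic validity terms."""
    return {
        "consistent": triangular(deviation, -0.5, 0.0, 0.5),
        "suspect":    triangular(deviation,  0.3, 0.8, 1.3),
        "faulty":     triangular(deviation,  1.0, 2.0, 3.0),
    }

def validity_score(memberships):
    """Defuzzify the linguistic terms back into a single validity score in [0, 1]."""
    weights = {"consistent": 1.0, "suspect": 0.5, "faulty": 0.0}
    total = sum(memberships.values()) or 1.0
    return sum(weights[t] * m for t, m in memberships.items()) / total

m = fuzzify(0.6)        # hypothetical normalized deviation between two channels
print(m, round(validity_score(m), 2))
```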

  8. Separable quadratic stochastic operators

    International Nuclear Information System (INIS)

    Rozikov, U.A.; Nazir, S.

    2009-04-01

    We consider quadratic stochastic operators which are separable as a product of two linear operators. Depending on the properties of these linear operators, we classify the set of separable quadratic stochastic operators into three classes: constant operators, linear operators, and nonlinear (separable) quadratic stochastic operators. Since the properties of operators from the first and second classes are well known, we mainly study the properties of the operators of the third class. We describe some Lyapunov functions of the operators and apply them to study ω-limit sets of the trajectories generated by the operators. We also compare our results with known results of the theory of quadratic operators and give some open problems. (author)
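
    For orientation, the standard definition of a quadratic stochastic operator on the simplex and the separable form referred to above are sketched below (common notation; the paper's conventions may differ slightly).

```latex
% Quadratic stochastic operator V on the simplex S^{m-1} = \{x \ge 0 : \sum_i x_i = 1\}:
\[
  (Vx)_k \;=\; \sum_{i,j=1}^{m} p_{ij,k}\, x_i x_j,
  \qquad p_{ij,k} \ge 0,\quad p_{ij,k} = p_{ji,k},\quad \sum_{k=1}^{m} p_{ij,k} = 1 .
\]
% Separable case: the heredity coefficients factor through two stochastic matrices,
% so that V acts coordinatewise as a product of two linear operators A and B:
\[
  (Vx)_k \;=\; (Ax)_k\,(Bx)_k .
\]
```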

  9. Selection/licensing of nuclear power plant operators

    International Nuclear Information System (INIS)

    Saari, L.M.

    1983-07-01

    An important aspect of nuclear power plant (NPP) safety is the reactor operator in the control room. The operators are the first individuals to deal with an emergency situation, and thus, effective performance on their part is essential for safe plant operations. Important issues pertaining to NPP reactor operators would fall within the personnel subsystem of our safety system analysis. While there are many potential aspects of the personnel subsystem, a key first step in this focus is the selection of individuals - attempting to choose individuals for the job of reactor operator who will safely perform the job. This requires a valid (job-related) selection process. Some background information on the Nuclear Regulatory Commission (NRC) licensing process used for selecting NPP reactor operators is briefly presented and a description of a research endeavor now underway at Battelle for developing a valid reactor operator licensing examination is included

  10. Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025

    Science.gov (United States)

    Banegas, J. M.; Orué, M. W.

    2016-07-01

    Several documents deal with software validation. Nevertheless, most are too complex to be applied to the validation of spreadsheets - surely the most widely used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to validate spreadsheets. It includes a systematic way to document requirements, operational aspects of validation, and a simple method to keep records of validation results and modification history. This method is currently used in an accredited calibration laboratory, where it has proved practical and efficient.
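
    A minimal sketch of the kind of check such a method implies is shown below: reference cases with independently computed expected results are compared against the spreadsheet's cached formula outputs within an acceptance tolerance. The file name, cell addresses, expected values, and tolerance are assumptions for illustration, and openpyxl is assumed to be available; this is not the procedure from the cited paper.

      from openpyxl import load_workbook

      # Hypothetical validation cases: (output cell, independently computed expected value).
      CASES = [("D2", 25.0), ("D3", 45.0)]
      TOLERANCE = 1e-6

      def validate_spreadsheet(path="calibration.xlsx"):
          # data_only=True returns the values cached by the spreadsheet application,
          # i.e. what the analyst actually sees, rather than the formula strings.
          ws = load_workbook(path, data_only=True).active
          report = []
          for out_cell, expected in CASES:
              obtained = ws[out_cell].value
              passed = obtained is not None and abs(obtained - expected) <= TOLERANCE
              report.append((out_cell, expected, obtained, "PASS" if passed else "FAIL"))
          return report

      if __name__ == "__main__":
          for row in validate_spreadsheet():
              print(row)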

  11. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  12. Mollusc reproductive toxicity tests - Development and validation of test guidelines

    DEFF Research Database (Denmark)

    Ducrot, Virginie; Holbech, Henrik; Kinnberg, Karin Lund

    The Organisation for Economic Cooperation and Development is promoting the development and validation of mollusc toxicity tests within its test guidelines programme, eventually aiming for the standardization of mollusc apical toxicity tests. Through collaborative work between academia, industry and stakeholders, this study aims to develop innovative partial life-cycle tests on the reproduction of the freshwater gastropods Potamopyrgus antipodarum and Lymnaea stagnalis, which are relevant candidate species for the standardization of mollusc apical toxicity tests assessing reprotoxic effects of chemicals. Draft standard operating procedures (SOPs) have been designed based upon literature and expert knowledge from project partners. Pre-validation studies have been implemented to validate the proposed test conditions and identify issues in performing the SOPs and analyzing test results...

  13. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  14. Validation of the TEXSAN thermal-hydraulic analysis program

    International Nuclear Information System (INIS)

    Burns, S.P.; Klein, D.E.

    1992-01-01

    The TEXSAN thermal-hydraulic analysis program has been developed by the University of Texas at Austin (UT) to simulate buoyancy driven fluid flow and heat transfer in spent fuel and high level nuclear waste (HLW) shipping applications. As part of the TEXSAN software quality assurance program, the software has been subjected to a series of test cases intended to validate its capabilities. The validation tests include many physical phenomena which arise in spent fuel and HLW shipping applications. This paper describes some of the principal results of the TEXSAN validation tests and compares them to solutions available in the open literature. The TEXSAN validation effort has shown that the TEXSAN program is stable and consistent under a range of operating conditions and provides accuracy comparable with other heat transfer programs and evaluation techniques. The modeling capabilities and the interactive user interface employed by the TEXSAN program should make it a useful tool in HLW transportation analysis

  15. Dosimetric studies for gamma radiation validation of medical devices

    International Nuclear Information System (INIS)

    Soliman, Y.S.; Beshir, W.B.; Abdel-Fattah, A.A.; Abdel-Rehim, F.

    2013-01-01

    The delivery and validation of a specified dose to medical devices are key concerns to operators of gamma radiation facilities. The objective of the present study was to characterize the industrial gamma radiation facility and map the dose distribution inside the product-loading pattern during the validation and routine control of the sterilization process using radiochromic films. Cardboard phantoms were designed to achieve the homogeneity of absorbed doses. The uncertainty of the dose delivered during validation of the sterilization process was assessed. - Highlights: ► Using γ-rays for sterilization of hollow fiber dialyzers and blood tubing sets according to ISO 11137, 2006. ► Dosimetry studies of validation of the γ-irradiation facility and sterilized medical devices. ► Locations of Dmin and Dmax have been determined using FWT-60 films. ► Determining the target minimum doses required to meet the desired SAL of 10⁻⁶ for the two products.

  16. Convergent validity test, construct validity test and external validity test of the David Liberman algorithm

    Directory of Open Access Journals (Sweden)

    David Maldavsky

    2013-08-01

    Full Text Available The author first presents a complement to a previous test of convergent validity, then a construct validity test, and finally an external validity test of the David Liberman algorithm (DLA). The first part of the paper focuses on a complementary aspect, the differential sensitivity of the DLA (1) in an external comparison (to other methods) and (2) in an internal comparison (between two ways of using the same method, the DLA). The construct validity test presents the concepts underlying the DLA, their operationalization, and some corrections emerging from several empirical studies we carried out. The external validity test examines the possibility of using the investigation of a single case and its relation to the investigation of a more extended sample.

  17. Validation of EAF-2005 data

    International Nuclear Information System (INIS)

    Kopecky, J.

    2005-01-01

    Full text: Validation procedures applied to the EAF-2003 starter file, which led to the production of the EAF-2005 library, are described. The results in terms of reactions with assigned quality scores in EAF-2005 are given. Further, the extensive validation against recent integral data is discussed, together with the status of the final report 'Validation of EASY-2005 using integral measurements'. Finally, the novel 'cross section trend analysis' is presented with some examples of its use. This action will lead to the release of the improved library EAF-2005.1 at the end of 2005, which shall be used as the starter file for EAF-2007. (author)

  18. Validity and validation of expert (Q)SAR systems.

    Science.gov (United States)

    Hulzebos, E; Sijm, D; Traas, T; Posthumus, R; Maslankiewicz, L

    2005-08-01

    At a recent workshop in Setubal (Portugal) principles were drafted to assess the suitability of (quantitative) structure-activity relationships ((Q)SARs) for assessing the hazards and risks of chemicals. In the present study we applied some of the Setubal principles to test the validity of three (Q)SAR expert systems and validate the results. These principles include a mechanistic basis, the availability of a training set and validation. ECOSAR, BIOWIN and DEREK for Windows have a mechanistic or empirical basis. ECOSAR has a training set for each QSAR. For half of the structural fragments the number of chemicals in the training set is >4. Based on structural fragments and log Kow, ECOSAR uses linear regression to predict ecotoxicity. Validating ECOSAR for three 'valid' classes results in a predictivity of ≥64%. BIOWIN uses (non-)linear regressions to predict the probability of biodegradability based on fragments and molecular weight. It has a large training set and predicts non-ready biodegradability well. DEREK for Windows predictions are supported by a mechanistic rationale and literature references. The structural alerts in this program have been developed with a training set of positive and negative toxicity data. However, to support the prediction, only a limited number of chemicals from the training set is presented to the user. DEREK for Windows predicts effects by 'if-then' reasoning. The program predicts best for mutagenicity and carcinogenicity. Each structural fragment in ECOSAR and DEREK for Windows needs to be evaluated and validated separately.
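
    The ECOSAR-style prediction mentioned above is, at its core, a class-specific linear regression of log-toxicity on log Kow. The sketch below fits such a regression with NumPy on synthetic, clearly made-up data points, purely to show the shape of the calculation; the numbers carry no chemical meaning.

      import numpy as np

      # Synthetic (log Kow, log LC50) pairs for one hypothetical chemical class.
      log_kow = np.array([0.5, 1.2, 2.0, 2.8, 3.5, 4.1])
      log_lc50 = np.array([1.8, 1.3, 0.7, 0.1, -0.5, -1.0])

      # Least-squares fit of log LC50 = slope * log Kow + intercept.
      slope, intercept = np.polyfit(log_kow, log_lc50, deg=1)

      def predict_log_lc50(new_log_kow):
          return slope * new_log_kow + intercept

      if __name__ == "__main__":
          print(f"slope={slope:.2f}, intercept={intercept:.2f}")
          print("predicted log LC50 at log Kow = 3.0:", round(float(predict_log_lc50(3.0)), 2))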

  19. International co-operation

    International Nuclear Information System (INIS)

    1998-01-01

    In this part the following are reviewed: co-operation with the IAEA; participation of Slovakia in the 41st session of the General Conference; the Comprehensive Nuclear-Test-Ban Treaty Organization; co-operation with the Organization for Economic Co-operation and Development; co-operation with the European Commission; fulfillment of obligations resulting from the international contracting documents

  20. Biomedical programs operations plans

    Science.gov (United States)

    Walbrecher, H. F.

    1974-01-01

    Operational guidelines for the space shuttle life sciences payloads are presented. An operational assessment of the medical experimental altitude test for Skylab, and Skylab life sciences documentation are discussed along with the operations posture and collection of space shuttle operational planning data.

  1. Composite operators in QCD

    International Nuclear Information System (INIS)

    Sonoda, Hidenori

    1992-01-01

    We give a formula for the derivatives of a correlation function of composite operators with respect to the parameters (i.e. the strong fine structure constant and the quark mass) of QCD in four-dimensional euclidean space. The formula is given as spatial integration of the operator conjugate to a parameter. The operator product of a composite operator and a conjugate operator has an unintegrable part, and the formula requires divergent subtractions. By imposing consistency conditions we derive a relation between the anomalous dimensions of the composite operators and the unintegrable part of the operator product coefficients. (orig.)
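
    Schematically, and hedging on sign and normalization conventions (which are not taken from the paper), a formula of the type described above reads

      $\frac{\partial}{\partial g}\,\langle \mathcal{O}_1(x_1)\cdots\mathcal{O}_n(x_n)\rangle = -\int d^4 y\,\langle \mathcal{O}_g(y)\,\mathcal{O}_1(x_1)\cdots\mathcal{O}_n(x_n)\rangle + \text{(divergent subtractions)},$

    where $g$ is a parameter of the theory (the coupling or a quark mass), $\mathcal{O}_g$ is the operator conjugate to it, and the subtraction terms arise from the unintegrable part of the operator product of $\mathcal{O}_g$ with the composite operators, exactly as the abstract indicates.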

  2. The AECL operator companion

    International Nuclear Information System (INIS)

    Lupton, L.R.; Anderson, L.L.; Basso, R.A.J.

    1989-11-01

    As CANDU plants become more complex, and are operated under tighter constraints and for longer periods between outages, plant operations staff will have to absorb more information to correctly and rapidly respond to upsets. A development program is underway at AECL to use expert systems and interactive media tools to assist operations staff of existing and future CANDU plants. The complete system for plant information access and display, on-line advice and diagnosis, and interactive operating procedures is called the Operator Companion. A prototype, consisting of operator consoles, expert systems and simulation modules in a distributed architecture, is currently being developed to demonstrate the concepts of the Operator Companion

  3. Development of an expert system for signal validation

    International Nuclear Information System (INIS)

    Qualls, A.L.; Uhrig, R.E.; Upadhyaya, B.R.

    1988-01-01

    Diagnosis of malfunctions in power plants has traditionally been in the domain of the process operator, who relies on training, experience, and reasoning ability to diagnose faults. The authors describe a method of signal validation using expert system technology, which detects possible anomalies in an instrument channel's output, similar to the procedure used by an operator. The system can be used to scan quickly over an array of sensor outputs and flag those that are observed to have possible anomalies. This system, when implemented in an operating power plant, could be used for continuous, on-line instrument anomaly detection with a minimum of computational effort

  4. Validation of Autonomous Space Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — System validation addresses the question "Will the system do the right thing?" When system capability includes autonomy, the question becomes more pointed. As NASA...

  5. Magnetic Signature Analysis & Validation System

    National Research Council Canada - National Science Library

    Vliet, Scott

    2001-01-01

    The Magnetic Signature Analysis and Validation (MAGSAV) System is a mobile platform that is used to measure, record, and analyze the perturbations to the earth's ambient magnetic field caused by objects such as armored vehicles...

  6. Mercury and Cyanide Data Validation

    Science.gov (United States)

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the USEPA Contract Laboratory Program (CLP) Statement of Work (SOW) ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  7. ICP-MS Data Validation

    Science.gov (United States)

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the USEPA Contract Laboratory Program Statement of Work (SOW) ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  8. Contextual Validity in Hybrid Logic

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2013-01-01

    interpretations. Moreover, such indexicals give rise to a special kind of validity—contextual validity—that interacts with ordinary logical validity in interesting and often unexpected ways. In this paper we model these interactions by combining standard techniques from hybrid logic with insights from the work of Hans Kamp and David Kaplan. We introduce a simple proof rule, which we call the Kamp Rule, and first we show that it is all we need to take us from logical validities involving now to contextual validities involving now too. We then go on to show that this deductive bridge is strong enough to carry us to contextual validities involving yesterday, today and tomorrow as well...

  9. The Management Advisory Committee of the Inspection Validation Centre - fifth report

    International Nuclear Information System (INIS)

    1988-07-01

    The Management Advisory Committee of the Inspection Validation Centre (IVC/MAC) was set up by the Chairman of the UKAEA early in 1983 with terms of reference to review the policy, scope, procedure and operation of the Inspection Validation Centre, to supervise its operation and to advise and report to the UKAEA appropriately. The Inspection Validation Centre (IVC) has been established at the UKAEA Northern Research Laboratories, Risley for the purpose of validating the procedures, equipment and personnel proposed by the CEGB for use in the ultrasonic inspection at various stages of the fabrication, erection and operation of the CEGB's PWR reactor pressure vessel and such other components as are identified by the CEGB. This report, for 1987/8, states that the IVC has continued to make progress in the provision of the validation services as specified. (author)

  10. Elementary operators on self-adjoint operators

    Science.gov (United States)

    Molnar, Lajos; Semrl, Peter

    2007-03-01

    Let H be a Hilbert space and let 𝒜 and ℬ be standard *-operator algebras on H. Denote by 𝒜ₛ and ℬₛ the sets of all self-adjoint operators in 𝒜 and ℬ, respectively. Assume that M: 𝒜ₛ → ℬₛ and M*: ℬₛ → 𝒜ₛ are surjective maps such that M(AM*(B)A) = M(A)BM(A) and M*(BM(A)B) = M*(B)AM*(B) for every pair A ∈ 𝒜ₛ, B ∈ ℬₛ. Then there exist an invertible bounded linear or conjugate-linear operator T on H and a constant c ∈ {-1, 1} such that M(A) = cTAT* for all A ∈ 𝒜ₛ, and M*(B) = cT*BT for all B ∈ ℬₛ.

  11. Validation: an overview of definitions

    International Nuclear Information System (INIS)

    Pescatore, C.

    1995-01-01

    The term validation is featured prominently in the literature on radioactive high-level waste disposal and is generally understood to be related to model testing using experiments. In a first class of definitions, validation is linked to the goal of predicting the physical world as faithfully as possible; this goal, however, is unattainable and unsuitable for setting goals for the safety analyses. In a second class, validation is associated with split-sampling or blind-test predictions. In a third class of definitions, validation focuses on the quality of the decision-making process. Most prominent in the present review is the observed lack of use of the term validation in the field of low-level radioactive waste disposal. The continued informal use of the term validation in the field of high-level waste disposal can become a cause for misperceptions and endless speculation. The paper proposes either abandoning the use of this term or agreeing on a definition common to all. (J.S.). 29 refs

  12. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between the training and validation cohorts for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC for validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training and validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
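
    The cohort differences model described above can be prototyped in a few lines: label each patient by cohort (training vs. validation), fit a classifier on the shared covariates, and read the AUC as a measure of how distinguishable the cohorts are. The sketch below is an illustration with scikit-learn and randomly generated stand-in covariates, not the authors' code.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)

      # Stand-in covariate matrices for the two cohorts (rows: patients, cols: features).
      training_cohort = rng.normal(loc=0.0, scale=1.0, size=(200, 5))
      validation_cohort = rng.normal(loc=0.4, scale=1.0, size=(154, 5))

      X = np.vstack([training_cohort, validation_cohort])
      y = np.concatenate([np.zeros(len(training_cohort)), np.ones(len(validation_cohort))])

      # Out-of-fold predicted probabilities of belonging to the validation cohort.
      proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=5,
                                method="predict_proba")[:, 1]
      auc = roc_auc_score(y, proba)

      # An AUC near 0.5 suggests similar cohorts (validation of reproducibility);
      # an AUC near 1.0 suggests very different cohorts (validation of transferability).
      print(f"Cohort differences AUC: {auc:.2f}")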

  13. Development of an operations evaluation system for sinking EDM

    NARCIS (Netherlands)

    Lauwers, B.; Oosterling, J.A.J.; Vanderauwera, W.

    2010-01-01

    This paper describes the development and validation of an operations evaluation system for sinking EDM operations. Based on a given workpiece geometry (e.g. mould), regions to be EDM'ed are automatically identified. For a given electrode configuration, consisting of one or more regions, EDM

  14. Transformation of covariant quark Wigner operator to noncovariant one

    International Nuclear Information System (INIS)

    Selikhov, A.V.

    1989-01-01

    The gauge in which the covariant and noncovariant quark Wigner operators coincide has been found. In this gauge the representation of the vector potential via the field strength tensor is valid. The system of equations for the coefficients of the expansion of the covariant Wigner operator in the basis of the γ-matrix algebra is obtained. 12 refs.; 3 figs.

  15. Operation experience with elevated ammonia

    International Nuclear Information System (INIS)

    Vankova, Katerina; Kysela, Jan; Malac, Miroslav; Petrecky, Igor; Svarc, Vladimir

    2011-01-01

    The 10 VVER units in the Czech and Slovak Republics are all in very good water chemistry and radiation condition, yet questions have arisen regarding the optimization of cycle chemistry and improved operation in these units. To address these issues, a comprehensive experimental program for different water chemistries of the primary circuit was carried out at the Rez Nuclear Research Institute, Czech Republic, with the goal of judging the influence of various water chemistries on radiation build-up. Four types of water chemistries were compared: standard VVER water chemistry (in common use), direct hydrogen dosing without ammonia, standard VVER water chemistry with elevated ammonia levels, and zinc dosing to standard VVER water chemistry. The test results showed that the types of water chemistry other than the common one have benefits for the operation of the nuclear power plant (NPP) primary circuit. Operation experience with elevated ammonia at NPP Dukovany Units 3 and 4 is presented which validates the experimental results, demonstrating improved corrosion product volume activity. (orig.)

  16. Ada Compiler Validation Summary Report: Certificate Number: 940325S1.11352 DDC-I DACS Sun SPARC/Solaris to Pentium PM Bare Ada Cross Compiler System, Version 4.6.4 Sun SPARCclassic => Intel Pentium (Operated as Bare Machine) Based in Xpress Desktop (Intel Product Number: XBASE6E4F-B)

    Science.gov (United States)

    1994-03-25

    Scanned report; the OCR text is largely illegible. Recoverable details: certificate 940325S1.11352 for the DACS Sun SPARC/Solaris to Pentium compiler; the validation references the Reference Manual for the Ada Programming Language, ANSI/MIL-STD-1815A, February 1983 and ISO 8652-1987, and the Ada Compiler Validation Procedures, Version 3.1, and mentions test objectives found to be irrelevant for the given Ada implementation (ISO: International Organization for Standardization; LRM: the Ada standard).

  17. Operation planning device

    International Nuclear Information System (INIS)

    Watanabe, Takashi; Odakawa, Naoto; Erikuchi, Makoto; Okada, Masayuki; Koizumi, Atsuhiko.

    1996-01-01

    The device of the present invention is suitable for monitoring the reactor core state and for operation replanning in terms of reactor operation. Namely, (1) an operation result difference judging means judges that replanning is necessary when the operation results deviate from the operation planning, (2) an operation replanning rule database storing means stores a deviation key, which indicates the various kinds of states in which the results deviate from the planning, and a replanning rule for returning to the operation planning for every deviation key, (3) an operation replanning means forms a new operation planning in accordance with the rule retrieved on the basis of the deviation key, (4) an operation planning optimizing rule database storing means evaluates the reformed planning and stores it for every evaluation item, (5) an operation planning optimization means corrects the operation planning data so as to optimize it when the evaluation from means (4) is less than a reference value, and (6) an operation planning display means edits the adapted operation planning data and the result of the evaluation and displays them. (I.S.)

  18. An Analysis of the Twenty-Nine Capabilities of the Marine Corps Expeditionary Unit (Special Operations Capable)

    National Research Council Canada - National Science Library

    Love, John

    1998-01-01

    ... (Special Operations Capable) (MEU (SOC)) to determine their relative validity. The methodology utilizes a multiple criteria decision-making model to determine the relative validity of each MEU (SOC) capability...

  19. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  20. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  1. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available ... for after the operation including review of attached equipment and ways for you to actively participate to ...

  2. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  3. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  4. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  5. Nonlocal Operational Calculi for Dunkl Operators

    Directory of Open Access Journals (Sweden)

    Ivan H. Dimovski

    2009-03-01

    Full Text Available The one-dimensional Dunkl operator $D_k$ with a non-negative parameter $k$ is considered under an arbitrary nonlocal boundary value condition. The right inverse operator of $D_k$ satisfying this condition is studied. An operational calculus of Mikusinski type is developed. Within the framework of this operational calculus, an extension of the Heaviside algorithm for the solution of nonlocal Cauchy boundary value problems for Dunkl functional-differential equations $P(D_k)u = f$ with a given polynomial $P$ is proposed. The solution of these equations in mean-periodic functions reduces to such problems. A necessary and sufficient condition for the existence of a unique solution in mean-periodic functions is found.
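
    For orientation (a standard definition, hedged on normalization conventions rather than taken from the cited paper), the one-dimensional Dunkl operator associated with the reflection $x \mapsto -x$ acts on suitable functions as

      $(D_k f)(x) = f'(x) + \frac{k}{x}\,\bigl(f(x) - f(-x)\bigr), \qquad k \ge 0,$

    which reduces to ordinary differentiation when $k = 0$; the operational calculus in the record concerns a right inverse of $D_k$ fixed by a nonlocal boundary condition.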

  6. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  7. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  8. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available ... for after the operation including review of attached equipment and ways for you to actively participate to ...

  9. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  10. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  11. Your Lung Operation: After Your Operation

    Medline Plus

    Full Text Available

  12. Site characterization and validation - validation drift fracture data, stage 4

    International Nuclear Information System (INIS)

    Bursey, G.; Gale, J.; MacLeod, R.; Straahle, A.; Tiren, S.

    1991-08-01

    This report describes the mapping procedures and the data collected during fracture mapping in the validation drift. Fracture characteristics examined include orientation, trace length, termination mode, and fracture minerals. These data have been compared and analysed together with fracture data from the D-boreholes to determine the adequacy of the borehole mapping procedures and to assess the nature and degree of orientation bias in the borehole data. The analysis of the validation drift data also includes a series of corrections to account for orientation, truncation, and censoring biases. This analysis has identified at least 4 geologically significant fracture sets in the rock mass defined by the validation drift. An analysis of the fracture orientations in both the good rock and the H-zone has defined groups of 7 clusters and 4 clusters, respectively. Subsequent analysis of the fracture patterns in five consecutive sections along the validation drift further identified heterogeneity through the rock mass, with respect to fracture orientations. These results are in stark contrast to the results from the D-borehole analysis, where a strong orientation bias resulted in a consistent pattern of measured fracture orientations through the rock. In the validation drift, fractures in the good rock also display a greater mean variance in length than those in the H-zone. These results provide strong support for a distinction being made between fractures in the good rock and the H-zone, and possibly between different areas of the good rock itself, for discrete modelling purposes. (au) (20 refs.)

  13. Improving operating room safety

    Directory of Open Access Journals (Sweden)

    Garrett Jill

    2009-11-01

    Full Text Available Despite the introduction of the Universal Protocol, patient safety in surgery remains a daily challenge in the operating room. The present study describes one community health system's efforts to improve operating room safety through human factors training and, ultimately, the development of a surgical checklist. Using a combination of formal training, local studies documenting operating room safety issues, and peer-to-peer mentoring, we were able to substantially change the culture of our operating room. Our efforts have prepared us to successfully implement a standardized checklist to improve operating room safety throughout our entire system. Based on these findings, we recommend a multimodal approach to improving operating room safety.

  14. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
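
    A small illustration of the first capability (manipulating expressions in an algebra generated by noncommuting operators) is given below using SymPy's noncommutative symbols; it is a generic sketch, not the system discussed in the record.

      from sympy import Symbol, expand

      # Declare two noncommuting operator symbols.
      A = Symbol("A", commutative=False)
      B = Symbol("B", commutative=False)

      # Expanding (A + B)**2 keeps A*B and B*A distinct, unlike the commutative case.
      expr = expand((A + B)**2)
      print(expr)                       # A**2 + A*B + B*A + B**2

      # The commutator [A, B] = A*B - B*A does not simplify to zero.
      commutator = expand(A*B - B*A)
      print(commutator)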

  15. Operating System Security

    CERN Document Server

    Jaeger, Trent

    2008-01-01

    Operating systems provide the fundamental mechanisms for securing computer processing. Since the 1960s, operating systems designers have explored how to build "secure" operating systems - operating systems whose mechanisms protect the system against a motivated adversary. Recently, the importance of ensuring such security has become a mainstream issue for all operating systems. In this book, we examine past research that outlines the requirements for a secure operating system and research that implements example systems that aim for such requirements. For system designs that aimed to

  16. Rodent Research-1 (RR1) NASA Validation Flight: Mouse liver transcriptomic proteomic and epigenomic data

    Data.gov (United States)

    National Aeronautics and Space Administration — RR-1 is a validation flight to evaluate the hardware operational and science capabilities of the Rodent Research Project on the ISS. RNA DNA and protein were...

  17. Quantum Fisher information on its own is not a valid measure of the coherence

    Science.gov (United States)

    Kwon, Hyukjoon; Tan, Kok Chuan; Choi, Seongjeon; Jeong, Hyunseok

    2018-06-01

    We show that, contrary to the claim in Feng and Wei (2017), the quantum Fisher information itself is not a valid measure of coherence in the resource-theory sense, because it can increase via an incoherent operation.

  18. Feature selection for anomaly–based network intrusion detection using cluster validity indices

    CSIR Research Space (South Africa)

    Naidoo, Tyrone

    2015-09-01

    Full Text Available data, which is rarely available in operational networks. It uses normalized cluster validity indices as an objective function that is optimized over the search space of candidate feature subsets via a genetic algorithm. Feature sets produced...
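
    The wrapper described above (cluster validity index as fitness, genetic algorithm as search) can be sketched as follows. This is a simplified, generic illustration using scikit-learn's KMeans and the silhouette score as the normalized validity index; the GA operators and parameters are assumptions and do not reproduce the cited system.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import silhouette_score

      rng = np.random.default_rng(1)

      def fitness(X, mask, n_clusters=2):
          # Silhouette score of a clustering restricted to the selected features;
          # it lies in [-1, 1], so it is already normalized.
          if mask.sum() == 0:
              return -1.0
          labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X[:, mask])
          return silhouette_score(X[:, mask], labels)

      def genetic_feature_selection(X, pop_size=20, generations=15, mutation_rate=0.1):
          n_features = X.shape[1]
          population = rng.integers(0, 2, size=(pop_size, n_features)).astype(bool)
          for _ in range(generations):
              scores = np.array([fitness(X, ind) for ind in population])
              order = np.argsort(scores)[::-1]
              parents = population[order[: pop_size // 2]]          # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = parents[rng.integers(len(parents), size=2)]
                  cut = rng.integers(1, n_features)                  # one-point crossover
                  child = np.concatenate([a[:cut], b[cut:]])
                  flip = rng.random(n_features) < mutation_rate      # bit-flip mutation
                  children.append(child ^ flip)
              population = np.vstack([parents, children])
          scores = np.array([fitness(X, ind) for ind in population])
          return population[int(np.argmax(scores))]

      if __name__ == "__main__":
          X = rng.normal(size=(100, 8))          # stand-in feature matrix
          X[:50, 0] += 3.0                       # make feature 0 genuinely informative
          print("selected features:", np.flatnonzero(genetic_feature_selection(X)))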

  19. Validation of multi-channel scanning microwave radiometer onboard OCEANSAT - 1

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Pankajakshan, T.; Harikrishnan, M.

    IRS-P4 (OCEANSAT-1) was the first operational oceanographic satellite that India has launched. An extensive validation campaign was unleashed immediately after its launch in May 1999. Various platforms (Ship, Moored buoy, Drifting buoy, Autonomous...

  20. Validation of sea surface temperature, wind speed and integrated water vapour from MSMR measurements. Project report

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.

    IRS-P4 (OCEANSAT-1) is the first operational oceanographic satellite that India has launched. An extensive validation campaign was unleashed immediately after its launch in May 1999. Various platforms (viz., ship, moored buoy, drifting buoy...