WorldWideScience

Sample records for validation procedure due

  1. Procedural Due Process Rights in Student Discipline.

    Science.gov (United States)

    Pressman, Robert; Weinstein, Susan

    To assist administrators in understanding procedural due process rights in student discipline, this manual draws together hundreds of citations and case summaries of federal and state court decisions and provides detailed commentary as well. Chapter 1 outlines the general principles of procedural due process rights in student discipline, such as…

  2. In-Trail Procedure Air Traffic Control Procedures Validation Simulation Study

    Science.gov (United States)

    Chartrand, Ryan C.; Hewitt, Katrin P.; Sweeney, Peter B.; Graff, Thomas J.; Jones, Kenneth M.

    2012-01-01

In August 2007, Airservices Australia (Airservices) and the United States National Aeronautics and Space Administration (NASA) conducted a validation experiment of the air traffic control (ATC) procedures associated with the Automatic Dependent Surveillance-Broadcast (ADS-B) In-Trail Procedure (ITP). ITP is an Airborne Traffic Situation Awareness (ATSA) application designed for near-term use in procedural airspace in which ADS-B data are used to facilitate climb and descent maneuvers. NASA and Airservices conducted the experiment in Airservices' simulator in Melbourne, Australia. Twelve current operational air traffic controllers participated in the experiment, which identified aspects of the ITP that could be improved (mainly in the communication and controller approval process). Results showed that controllers viewed the ITP as valid and acceptable. This paper describes the experiment design and results.

  3. Procedure for Validation of Aggregators Providing Demand Response

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Gehrke, Oliver; Thavlov, Anders

    2016-01-01

…of small heterogeneous resources that are geographically distributed. Therefore, a new test procedure must be designed for aggregator validation. This work proposes such a procedure and exemplifies it with a study case. The validation of aggregators is essential if aggregators are to be integrated… successfully into the power system…

  4. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

Carvedilol is a nonselective beta blocker/alpha-1 blocker used for the treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe a carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting up a mathematical model that describes the PK parameters. It also includes variables of particular importance for the drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, concomitant disease, or the presence of a specific polymorphism in the isoenzyme important for biotransformation of the drug. One of the most frequently used approaches in population PK analysis is nonlinear mixed effects modelling (NONMEM). The analytical method used in the data collection period is of great importance for the implementation of a population PK analysis of carvedilol, in order to obtain reliable data that can be useful in clinical practice. High performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, provide quantitative results and monitor the efficacy of therapy. Analytical procedures used in other studies could not be fully implemented in our research, so certain modifications and a validation of the method were necessary so that the obtained results could be used for the population pharmacokinetic analysis. The validation process is the logical terminal phase of analytical procedure development that ensures the applicability of the procedure itself. The goal of validation is to ensure consistency of the method and accuracy of results, or to confirm the selection of the analytical method for a given sample…

  5. Occurrence of pneumomediastinum due to dental procedures.

    Science.gov (United States)

    Aslaner, Mehmet Ali; Kasap, Gül Nihal; Demir, Cihat; Akkaş, Meltem; Aksu, Nalan M

    2015-01-01

The occurrence of pneumomediastinum and massive subcutaneous emphysema due to dental procedures is quite rare. We present a case of pneumomediastinum and massive subcutaneous emphysema that occurred during a third molar extraction performed with an air-turbine handpiece.

  6. Procedural Due Process and Fairness in Student Discipline. A Legal Memorandum.

    Science.gov (United States)

    Johnson, T. Page

    When the Supreme Court decided that the Constitution requires public school principals to follow procedural due process in suspension and expulsion cases, the Justices recognized a link between procedural due process and the fairness of effective discipline. This report reviews the constitutional due process required when public school officials…

  7. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

The objective of this work is to identify and evaluate improvements to the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism in residual stress assumptions. The study focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - an improved procedure for heat source calibration based on the use of analytical solutions; - use of an isotropic hardening model where mixed hardening data are not available; - use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and those from the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data are available, a mixed hardening model should be used.

  8. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

The objective of this work is to identify and evaluate improvements to the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism in residual stress assumptions. The study focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - an improved procedure for heat source calibration based on the use of analytical solutions; - use of an isotropic hardening model where mixed hardening data are not available; - use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and those from the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data are available, a mixed hardening model should be used.

  9. Practical procedure for method validation in INAA- A tutorial

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2015-01-01

This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. In accordance with the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, and which procedures, strategies and acceptance criteria were adopted for the results; that is, how to carry out a method validation in INAA. To exemplify the methodology, results are presented for the validation of the determination of the mass fractions of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)
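
    The performance characteristics listed above reduce to a handful of simple statistics once replicate results for blanks and a reference material are available. The sketch below is a generic illustration of one common convention (blank-based detection limits, relative bias against a certified value, repeatability as a relative standard deviation); it is not the exact formulation used by LAN, IPEN - CNEN/SP, and all numbers are placeholders.

```python
import statistics

# Placeholder replicate data (hypothetical): blank results and results for a
# reference material, both already converted to mass fraction (mg/kg).
blanks = [0.012, 0.015, 0.010, 0.013, 0.011, 0.014, 0.012]
reference_results = [5.02, 4.87, 5.11, 4.95, 5.06, 4.90]
certified_value = 5.00  # certified mass fraction of the reference material

mean_blank = statistics.mean(blanks)
sd_blank = statistics.stdev(blanks)

# One common convention: LOD = mean blank + 3*sd, LOQ = mean blank + 10*sd.
lod = mean_blank + 3 * sd_blank
loq = mean_blank + 10 * sd_blank

# Trueness expressed as relative bias against the certified value (%).
mean_result = statistics.mean(reference_results)
relative_bias = 100 * (mean_result - certified_value) / certified_value

# Repeatability expressed as relative standard deviation (%).
rsd = 100 * statistics.stdev(reference_results) / mean_result

print(f"LOD = {lod:.3f} mg/kg, LOQ = {loq:.3f} mg/kg")
print(f"relative bias = {relative_bias:+.1f} %, repeatability RSD = {rsd:.1f} %")
```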

  10. Practical procedure for method validation in INAA- A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: robsonpetroni@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. In accordance with the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, and which procedures, strategies and acceptance criteria were adopted for the results; that is, how to carry out a method validation in INAA. To exemplify the methodology, results are presented for the validation of the determination of the mass fractions of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)

  11. A Neural Networks Based Operation Guidance System for Procedure Presentation and Validation

    International Nuclear Information System (INIS)

    Seung, Kun Mo; Lee, Seung Jun; Seong, Poong Hyun

    2006-01-01

In this paper, a neural network based operator support system is proposed to reduce operators' errors in abnormal situations in nuclear power plants (NPPs). There are many complicated situations in which operators must carry out the appropriate operations. In order to regulate and validate operators' actions, it is necessary to develop an operator support system that includes computer-based procedures with functions for operation validation. Many computerized procedure systems (CPSs) have recently been developed. Focusing on human-machine interface (HMI) design and the computerization of procedures, most CPSs use various methodologies to enhance convenience, reliability and accessibility. Rather than only presenting procedures, the proposed system integrates a simple CPS and an operation validation system (OVS) that uses an artificial neural network (ANN) for operation permission and quantitative evaluation.

  12. Verification and validation of a numeric procedure for flow simulation of a 2x2 PWR rod bundle

    International Nuclear Information System (INIS)

    Santos, Andre A.C.; Barros Filho, Jose Afonso; Navarro, Moyses A.

    2011-01-01

Before Computational Fluid Dynamics (CFD) can be considered a reliable tool for the analysis of flow through rod bundles, the credibility of the numerical results must be established. Procedures must be defined to evaluate the error and uncertainty due to aspects such as mesh refinement, turbulence model, wall treatment and the appropriate definition of boundary conditions. These procedures are referred to as Verification and Validation (V and V) processes. In 2009 the American Society of Mechanical Engineers (ASME) published a standard establishing detailed procedures for the V and V of CFD simulations. This paper presents a V and V evaluation, based on ASME's standard, of a numerical methodology applied to the simulation of a PWR rod bundle segment with a split-vane spacer grid. In this study six progressively refined meshes were generated to evaluate the numerical uncertainty through the verification procedure. Experimental and analytical results available in the literature were used for validation purposes. The results show that the ASME verification procedure can give highly variable predictions of uncertainty depending on the mesh triplet used for the evaluation. However, the procedure gives good insight towards optimization of the mesh size and overall result quality. Although the experimental results used for the validation were not ideal, the validation procedure allowed the deficiencies and strengths of the presented modelling to be detected and reasonably evaluated. Even though it is difficult to obtain reliable estimates of the uncertainty of flow quantities in turbulent flow, this study shows that the V and V process is a necessary step in a CFD analysis of a spacer grid design. (author)
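
    The mesh-triplet uncertainty estimate referred to above is commonly implemented with Richardson extrapolation and a Grid Convergence Index (GCI), as in the widely published ASME recipe. The sketch below follows that standard recipe with made-up solution values on three meshes; it is not the authors' actual evaluation.

```python
import math

def gci_fine(f1, f2, f3, r21, r32, Fs=1.25, iters=50):
    """Observed order of accuracy and fine-grid GCI for a mesh triplet
    (f1 on the finest mesh), via Richardson extrapolation."""
    e21, e32 = f2 - f1, f3 - f2
    s = 1.0 if e32 / e21 > 0 else -1.0
    p = 2.0  # initial guess for the observed order
    for _ in range(iters):  # fixed-point iteration for p
        q = math.log((r21**p - s) / (r32**p - s))
        p = abs(math.log(abs(e32 / e21)) + q) / math.log(r21)
    f_ext = (r21**p * f1 - f2) / (r21**p - 1.0)   # extrapolated value
    ea21 = abs((f1 - f2) / f1)                     # approximate relative error
    gci = Fs * ea21 / (r21**p - 1.0)               # fine-grid convergence index
    return p, f_ext, gci

# Hypothetical pressure-drop results on three progressively coarser meshes.
p, f_ext, gci = gci_fine(f1=10.35, f2=10.21, f3=9.90, r21=1.3, r32=1.3)
print(f"observed order = {p:.2f}, extrapolated value = {f_ext:.3f}, "
      f"fine-grid GCI = {100 * gci:.2f} %")
```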

  13. DMM assessments of attachment and adaptation: Procedures, validity and utility.

    Science.gov (United States)

    Farnfield, Steve; Hautamäki, Airi; Nørbech, Peder; Sahhar, Nicola

    2010-07-01

This article gives a brief overview of the Dynamic-Maturational Model of attachment and adaptation (DMM; Crittenden, 2008) together with the various DMM assessments of attachment that have been developed for specific stages of development. Each assessment is discussed in terms of procedure, outcomes, validity, advantages and limitations, comparable procedures and areas for further research and validation. The aims are twofold: to provide an introduction to DMM theory and its application, which underlie the articles in this issue of CCPP; and to provide researchers and clinicians with a guide to DMM assessments.

  14. Validity of diagnoses, procedures, and laboratory data in Japanese administrative data.

    Science.gov (United States)

    Yamana, Hayato; Moriwaki, Mutsuko; Horiguchi, Hiromasa; Kodan, Mariko; Fushimi, Kiyohide; Yasunaga, Hideo

    2017-10-01

Validation of recorded data is a prerequisite for studies that utilize administrative databases. The present study evaluated the validity of diagnoses and procedure records in the Japanese Diagnosis Procedure Combination (DPC) data, along with laboratory test results in the newly introduced Standardized Structured Medical Record Information Exchange (SS-MIX) data. Between November 2015 and February 2016, we conducted chart reviews of 315 patients hospitalized between April 2014 and March 2015 in four middle-sized acute-care hospitals in Shizuoka, Kochi, Fukuoka, and Saga Prefectures and used them as the reference standard. The sensitivity and specificity of the DPC data in identifying 16 diseases and 10 common procedures were calculated. The accuracy of the SS-MIX data for 13 laboratory test results was also examined. The specificity of diagnoses in the DPC data exceeded 96%, while the sensitivity was below 50% for seven diseases and variable across diseases. When limited to primary diagnoses, the sensitivity and specificity were 78.9% and 93.2%, respectively. The sensitivity of procedure records exceeded 90% for six procedures, and the specificity exceeded 90% for nine procedures. Agreement between the SS-MIX data and the chart reviews was above 95% for all 13 items. The validity of diagnoses and procedure records in the DPC data and of laboratory results in the SS-MIX data was generally high, supporting their use in future studies. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
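
    For readers less familiar with the figures quoted above, sensitivity and specificity follow directly from a 2x2 comparison of the database codes against the chart-review reference standard. The sketch below uses invented counts purely to show the arithmetic; it does not reproduce the study data.

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity and specificity of a recorded code versus a chart-review
    reference standard (true condition status)."""
    sensitivity = tp / (tp + fn)  # recorded positive among truly positive
    specificity = tn / (tn + fp)  # recorded negative among truly negative
    return sensitivity, specificity

# Hypothetical counts for one diagnosis among 315 reviewed charts.
sens, spec = sensitivity_specificity(tp=30, fp=5, fn=8, tn=272)
print(f"sensitivity = {100 * sens:.1f} %, specificity = {100 * spec:.1f} %")
```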

  15. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

The present work describes the methodology followed for the application of the ISO 17025 standard in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to fulfil the management part. For the technical requirements, internal method validation was carried out through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty; participation in IAEA proficiency tests ensures the external method validation, especially since the gamma spectroscopy laboratory is a member of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). Some of these results are presented in this paper. (author)

  16. Calculation of piping loads due to filling procedures

    International Nuclear Information System (INIS)

    Swidersky, Harald; Thiele, Thomas

    2012-01-01

Filling procedures in piping systems are usually not load cases that are studied by fluid-dynamic and structure-dynamic analyses with respect to the integrity of pipes and supports. However, their frequency is higher than that of postulated accidental transients, and they therefore have to be considered in fatigue analyses. The piping and support loads due to filling procedures are caused by the density differences of the transported fluids, for instance in flows with the transport of gas bubbles. The duration of the momentum forces is defined by the flow velocity and the length of discontinuities in the piping segments. Filling procedures very often end with a shock pressure, caused by the impact and deceleration of the fluid front at smaller cross sections. The suitability of the thermal-hydraulics program RELAP5/MOD3.3 for the calculation of realistic loads from filling procedures was studied and the results were compared with experimental data. It is shown that, depending on the discretization level, the loads are in part significantly underestimated.
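
    As a rough orientation to the magnitudes involved, the unbalanced momentum force while a gas/liquid density front traverses a pipe segment, and the shock pressure when the liquid front is decelerated, can be estimated with textbook relations (including the Joukowsky formula). The sketch below is a hand calculation with assumed numbers; it does not reproduce the RELAP5 analysis of the paper.

```python
import math

# Assumed filling scenario (all values are placeholders).
rho_liquid = 998.0    # kg/m^3, water
rho_gas = 1.2         # kg/m^3, air being displaced
velocity = 3.0        # m/s, velocity of the filling front
diameter = 0.10       # m, pipe inner diameter
segment_length = 2.5  # m, straight length between two bends
wave_speed = 1200.0   # m/s, assumed pressure wave speed in the filled pipe

area = math.pi * diameter**2 / 4.0

# Unbalanced momentum force on the segment while the density front passes
# (difference in momentum flux between liquid-filled and gas-filled flow).
force = (rho_liquid - rho_gas) * area * velocity**2

# Duration of that load: time for the front to traverse the segment.
duration = segment_length / velocity

# Joukowsky estimate of the shock pressure when the liquid column is
# suddenly decelerated at a reduced cross section or closed end.
delta_p = rho_liquid * wave_speed * velocity

print(f"momentum force ~ {force:.1f} N acting for ~ {duration:.2f} s")
print(f"Joukowsky surge pressure ~ {delta_p / 1e5:.1f} bar")
```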

  17. Validation of EOPs/FRGs Procedures Using LOHS Scenario

    International Nuclear Information System (INIS)

    Bajs, T.; Konjarek, D.; Vukovic, J.

    2012-01-01

Validation of EOPs (Emergency Operating Procedures) and FRGs (Function Restoration Guidelines) can be achieved either through a plant full-scope simulator or through desktop exercises. A desktop exercise is conducted when the plant full-scope simulator is not suitable for the given scenario. In either case, the predefined scenario should be evaluated and possible branching foreseen. The scenario presented is a LOHS (loss of heat sink), with the bleed-and-feed procedure initiated. The best-estimate light water reactor transient analysis code RELAP5/MOD3.3 was used in the calculation. A standardized detailed plant model was used. Operator actions were modelled from the beginning of the scenario to its termination. (author)

  18. Additional radiation dose to population due to X-ray diagnostic procedures

    International Nuclear Information System (INIS)

    Chougule, A.

    2006-01-01

…entrance skin dose (mGy) received by the patient during the radiological procedure will be presented in the paper. From the population of the region, the frequency of radiological procedures per person per year was estimated; in the present study it worked out to be 10,500 procedures/year/100,000 population. From the data collected, the additional contribution of X-ray diagnostic procedures to the population radiation dose was estimated. In the present study it was found that the contribution of radiological procedures to the population dose is 0.22 mSv/year/person. UNSCEAR (2000) reported 1.2 mSv as the mean effective dose per capita due to medical X-ray examinations in developed countries, where the frequency of X-ray procedures is much higher than in India. The results are discussed in detail in this communication. (author)

  19. Validity, reliability and support for implementation of independence-scaled procedural assessment in laparoscopic surgery.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2016-06-01

There is no widely used method to evaluate procedure-specific laparoscopic skills. The first aim of this study was to develop a procedure-based assessment method. The second aim was to compare its validity, reliability and feasibility with currently available global rating scales (GRSs). An independence-scaled procedural assessment was created by linking the procedural key steps of the laparoscopic cholecystectomy to an independence scale. Subtitled and blinded videos of a novice, an intermediate and an almost competent trainee were evaluated with GRSs (OSATS and GOALS) and the independence-scaled procedural assessment by seven surgeons, three senior trainees and six scrub nurses. Participants received a short introduction to the GRSs and the independence-scaled procedural assessment before assessment. Validity was estimated with the Friedman and Wilcoxon tests and reliability with the intra-class correlation coefficient (ICC). A questionnaire was used to evaluate user opinion. Independence-scaled procedural assessment and GRS scores improved significantly with surgical experience (OSATS p = 0.001, GOALS p < 0.001, independence-scaled procedural assessment p < 0.001). The ICCs of the OSATS, GOALS and independence-scaled procedural assessment were 0.78, 0.74 and 0.84, respectively, among surgeons. The ICCs increased when the ratings of scrub nurses were added to those of the surgeons. The independence-scaled procedural assessment was not considered more of an administrative burden than the GRSs (p = 0.692). A procedural assessment created by combining procedural key steps with an independence scale is a valid, reliable and acceptable assessment instrument in surgery. In contrast to the GRSs, the reliability of the independence-scaled procedural assessment exceeded the threshold of 0.8, indicating that it can also be used for summative assessment. It furthermore seems that scrub nurses can assess the operative competence of surgical trainees.
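
    The ICC values quoted above are two-way agreement coefficients. For readers who want to reproduce such a figure from a subjects-by-raters score matrix, the sketch below computes the single-rater ICC(2,1) of Shrout and Fleiss from two-way ANOVA mean squares, using invented scores rather than the study data.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1) (two-way random effects, absolute agreement, single rater)
    computed from an n_subjects x n_raters score matrix."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # subjects (videos)
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # raters
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical checklist scores: 3 videos (rows) rated by 7 surgeons (columns).
scores = [[18, 20, 17, 19, 21, 18, 20],
          [31, 33, 30, 34, 32, 31, 33],
          [45, 47, 44, 48, 46, 45, 47]]
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```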

  20. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

In this work, an advanced analytical procedure was applied to calculate the radioactivity in spiked water samples measured in close geometry by gamma spectrometry. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and a good agreement between the two codes was found. Finally, the validity of the developed procedure was confirmed by a proficiency test in which the activities of various radionuclides were calculated. The radioactivity measurements with both detectors using the advanced analytical procedure received the 'Accepted' status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Analytical validation of an ultraviolet-visible procedure for determining lutein concentration and application to lutein-loaded nanoparticles.

    Science.gov (United States)

    Silva, Jéssica Thaís do Prado; Silva, Anderson Clayton da; Geiss, Julia Maria Tonin; de Araújo, Pedro Henrique Hermes; Becker, Daniela; Bracht, Lívia; Leimann, Fernanda Vitória; Bona, Evandro; Guerra, Gustavo Petri; Gonçalves, Odinei Hess

    2017-09-01

Lutein is a carotenoid with known anti-inflammatory and antioxidant properties. Lutein-rich diets have been associated with neurological improvement as well as a reduction of the risk of vision loss due to Age-Related Macular Degeneration (AMD). Micro- and nanoencapsulation have been demonstrated to be effective techniques for protecting lutein against degradation and also for improving its bioavailability. However, the actual lutein concentration inside the capsules and the encapsulation efficiency are key parameters that must be precisely known when designing in vitro and in vivo tests. In this work an analytical procedure was validated for the determination of the actual lutein content in zein nanoparticles using ultraviolet-visible spectroscopy. Method validation followed the International Conference on Harmonisation (ICH) guidelines, which evaluate linearity, detection limit, quantification limit, accuracy and precision. The validated methodology was applied to characterize lutein-loaded nanoparticles. Copyright © 2017 Elsevier Ltd. All rights reserved.
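
    The ICH Q2 characteristics mentioned above are usually derived from an ordinary least-squares calibration line, with LOD = 3.3 sigma/S and LOQ = 10 sigma/S, where S is the slope and sigma the residual standard deviation of the line. The sketch below shows that calculation on an invented absorbance-versus-concentration data set; it is not the validated method itself.

```python
import numpy as np

# Hypothetical calibration data: lutein standards (ug/mL) vs absorbance.
conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
absorbance = np.array([0.052, 0.101, 0.205, 0.309, 0.401, 0.512])

# Ordinary least-squares calibration line: A = slope * C + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)
predicted = slope * conc + intercept
residual_sd = np.sqrt(((absorbance - predicted) ** 2).sum() / (len(conc) - 2))

# Coefficient of determination as a simple linearity check.
ss_res = ((absorbance - predicted) ** 2).sum()
ss_tot = ((absorbance - absorbance.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot

# ICH Q2 convention based on the residual standard deviation of the line.
lod = 3.3 * residual_sd / slope
loq = 10 * residual_sd / slope

print(f"slope = {slope:.4f} AU per ug/mL, R^2 = {r2:.4f}")
print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```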

  2. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection… …is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating…

  3. Integrated System Validation Usability Questionnaire: Computerized Procedures

    International Nuclear Information System (INIS)

    Garcés, Ma. I.; Torralba, B.

    2015-01-01

The Research and Development (R&D) project on “Theoretical and Methodological Approaches to Integrated System Validation of Control Rooms, 2014-2015”, in which the research activities described in this report are framed, has two main objectives: to develop the items for a usability methodology conceived as part of the measurement framework for performance-based control room evaluation that the OECD Halden Reactor Project will test in the experiments planned for 2015; and to statistically analyse the data generated with previous usability questionnaires in the experimental activities of the Halden Man-Machine Laboratory (HAMMLAB) facility in 2010 and 2011. In this report, the procedure designed to meet the first goal of the project is described, in particular the process followed to identify the items related to operating procedures, both computer- and paper-based, one of the elements to be included in the usability questionnaire. Three phases are performed: in the first, the approaches developed by the United States Nuclear Regulatory Commission (NRC) are reviewed, the models used by the nuclear industry and its technical support organizations, mainly the Electric Power Research Institute (EPRI), are analyzed, and scientific advances are also explored. In the remaining stages, general and specific guidelines for computerized and paper-based procedures are compared and criteria for the preliminary selection of the items that should be incorporated into the usability questionnaire are defined. This proposal will be reviewed and adapted by the Halden Reactor Project to the design of the specific experiments performed in HAMMLAB.

  4. The Analytical Pragmatic Structure of Procedural Due Process: A Framework for Inquiry in Administrative Decision Making.

    Science.gov (United States)

    Fisher, James E.; Sealey, Ronald W.

    The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…

  5. Generalized peritonitis due to perforated diverticulitis: Hartmann's procedure or primary anastomosis?

    Science.gov (United States)

    Trenti, Loris; Biondo, Sebastiano; Golda, Thomas; Monica, Millan; Kreisler, Esther; Fraccalvieri, Domenico; Frago, Ricardo; Jaurrieta, Eduardo

    2011-03-01

Hartmann's procedure (HP) still remains the most frequently performed operation for diffuse peritonitis due to perforated diverticulitis. The aims of this study were to assess the feasibility and safety of resection with primary anastomosis (RPA) in patients with purulent or fecal diverticular peritonitis, and to review the morbidity and mortality after the single-stage procedure and after Hartmann's procedure in our experience. From January 1995 through December 2008, patients operated on for generalized diverticular peritonitis were studied. Patients were classified into two main groups: RPA and HP. A total of 87 patients underwent emergency surgery for diverticulitis complicated by purulent or diffuse fecal peritonitis. Sixty (69%) underwent HP, while RPA was performed in 27 patients (31%). In the multivariate analysis, RPA was associated with fewer post-operative complications (P …). … clinical anastomotic leakage and needed re-operation. RPA can be safely performed without adding morbidity and mortality in cases of diffuse diverticular peritonitis. HP should be reserved only for hemodynamically unstable or high-risk patients. Specialization in colorectal surgery improves mortality and raises the percentage of one-stage procedures.

  6. Operator competence in fetoscopic laser surgery for twin-twin transfusion syndrome: validation of a procedure-specific evaluation tool.

    Science.gov (United States)

    Peeters, S H P; Akkermans, J; Bustraan, J; Middeldorp, J M; Lopriore, E; Devlieger, R; Lewi, L; Deprest, J; Oepkes, D

    2016-03-01

Fetoscopic laser surgery for twin-twin transfusion syndrome is a procedure for which no objective tools exist to assess technical skills. To ensure that future fetal surgeons reach competence prior to performing the procedure unsupervised, we developed a performance assessment tool. The aim of this study was to validate this assessment tool for reliability and construct validity. We made use of a procedure-specific evaluation instrument containing all essential steps of the fetoscopic laser procedure, which was previously created using Delphi methodology. Eleven experts and 13 novices from three fetal medicine centers performed the procedure on the same simulator. Two independent observers assessed each surgery using the instrument (maximum score: 52). Interobserver reliability was assessed using Spearman correlation. We compared the performance of novices and experts to assess construct validity. The interobserver reliability was high (Rs = 0.974, P …). … performed by experts and in 9/13 (69%) procedures performed by novices (P = 0.005). Multivariable analysis showed that the checklist score, independent of age and gender, predicted competence. The procedure-specific assessment tool for fetoscopic laser surgery shows good interobserver reliability and discriminates experts from novices. This instrument may therefore be a useful tool in the training curriculum for fetal surgeons. Further intervention studies with reassessment before and after training may increase the construct validity of the tool. Copyright © 2015 ISUOG. Published by John Wiley & Sons Ltd.

  7. Experimental Testing Procedures and Dynamic Model Validation for Vanadium Redox Flow Battery Storage System

    DEFF Research Database (Denmark)

    Baccino, Francesco; Marinelli, Mattia; Nørgård, Per Bromand

    2013-01-01

    The paper aims at characterizing the electrochemical and thermal parameters of a 15 kW/320 kWh vanadium redox flow battery (VRB) installed in the SYSLAB test facility of the DTU Risø Campus and experimentally validating the proposed dynamic model realized in Matlab-Simulink. The adopted testing...... efficiency of the battery system. The test procedure has general validity and could also be used for other storage technologies. The storage model proposed and described is suitable for electrical studies and can represent a general model in terms of validity. Finally, the model simulation outputs...

  8. Due date assignment procedures with dynamically updated coefficients for multi-level assembly job shops

    NARCIS (Netherlands)

    Adam, N.R.; Bertrand, J.W.M.; Morehead, D.C.; Surkis, J.

    1993-01-01

    This paper presents a study of due date assignment procedures in job shop environments where multi-level assembly jobs are processed and due dates are internally assigned. Most of the reported studies in the literature have focused on string type jobs. We propose a dynamic update approach (which

  9. Development and validation of a screening procedure to identify speech-language delay in toddlers with cleft palate

    DEFF Research Database (Denmark)

    Jørgensen, Line Dahl; Willadsen, Elisabeth

    2017-01-01

The purpose of this study was to develop and validate a clinically useful speech-language screening procedure for young children with cleft palate +/- cleft lip (CP) to identify those in need of speech-language intervention. Twenty-two children with CP were assigned to a +/- need for intervention condition based on assessment of consonant inventory using a real-time listening procedure in combination with parent-reported expressive vocabulary. These measures allowed evaluation of early speech-language skills found to correlate significantly with later speech-language difficulties in longitudinal studies of children with CP. The external validity of this screening procedure was evaluated by comparing the +/- need for intervention assignment determined by the screening procedure to experienced speech-language pathologists’ (SLPs’) clinical judgment of whether or not a child needed early…

  10. Technical Note: Procedure for the calibration and validation of kilo-voltage cone-beam CT models

    Energy Technology Data Exchange (ETDEWEB)

    Vilches-Freixas, Gloria; Létang, Jean Michel; Rit, Simon, E-mail: simon.rit@creatis.insa-lyon.fr [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1206, INSA-Lyon, Université Lyon 1, Centre Léon Bérard, Lyon 69373 Cedex 08 (France); Brousmiche, Sébastien [Ion Beam Application, Louvain-la-Neuve 1348 (Belgium); Romero, Edward; Vila Oliva, Marc [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1206, INSA-Lyon, Université Lyon 1, Centre Léon Bérard, Lyon 69373 Cedex 08, France and Ion Beam Application, Louvain-la-Neuve 1348 (Belgium); Kellner, Daniel; Deutschmann, Heinz; Keuschnigg, Peter; Steininger, Philipp [Institute for Research and Development on Advanced Radiation Technologies, Paracelsus Medical University, Salzburg 5020 (Austria)

    2016-09-15

Purpose: The aim of this work is to propose a general and simple procedure for the calibration and validation of kilo-voltage cone-beam CT (kV CBCT) models against experimental data. Methods: The calibration and validation of the CT model is a two-step procedure: first the source model, then the detector model. The source is described by the direction-dependent photon energy spectrum at each voltage, while the detector is described by the pixel intensity value as a function of the direction and energy of incident photons. The measurements for the source consist of a series of dose measurements in air performed at each voltage with varying filter thicknesses and materials in front of the x-ray tube. The measurements for the detector are acquisitions of projection images using the same filters and several tube voltages. The proposed procedure has been applied to calibrate and assess the accuracy of simple models of the source and the detector of three commercial kV CBCT units. If the CBCT system models had been calibrated differently, the current procedure would have been used exclusively to validate the models. Several high-purity attenuation filters of aluminum, copper, and silver, combined with a dosimeter sensitive to the range of voltages of interest, were used. A sensitivity analysis of the model has also been conducted for each parameter of the source and detector models. Results: Average deviations between experimental and theoretical dose values are below 1.5% after calibration for the three x-ray sources. The predicted energy deposited in the detector agrees with experimental data within 4% for all imaging systems. Conclusions: The authors developed and applied an experimental procedure to calibrate and validate any model of the source and the detector of a CBCT unit. The present protocol has been successfully applied to three x-ray imaging systems. The minimum requirements in terms of material and equipment would make its implementation suitable in…
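
    The core of the source-model calibration described above is the comparison of the dose measured in air behind each filter with the dose predicted from a candidate energy spectrum via exponential attenuation. A minimal sketch of that forward prediction is given below; the spectrum, attenuation coefficients and filter thicknesses are invented placeholders, not the values of the commercial kV CBCT units studied.

```python
import numpy as np

# Hypothetical coarse energy grid (keV) with a toy 100 kV photon spectrum.
energy = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
fluence = np.array([0.05, 0.35, 0.40, 0.15, 0.05])        # relative photon fluence

# Placeholder coefficients (cm^-1 for Al, cm^2/g for air); real values would
# be taken from tabulated data such as NIST XCOM.
mu_al = np.array([8.0, 1.5, 0.75, 0.55, 0.45])             # linear attenuation of aluminum
muen_rho_air = np.array([0.50, 0.07, 0.03, 0.024, 0.023])  # mass energy-absorption of air

def relative_air_dose(filter_thickness_cm):
    """Air dose predicted from the candidate spectrum behind an Al filter,
    up to a constant: sum of fluence * transmission * E * (muen/rho)_air."""
    transmission = np.exp(-mu_al * filter_thickness_cm)
    return float((fluence * transmission * energy * muen_rho_air).sum())

# Calibration compares these predictions (normalized to the unfiltered beam)
# with the measured dose ratios for each filter thickness.
d0 = relative_air_dose(0.0)
for t in (0.0, 0.1, 0.2, 0.4):
    print(f"Al {t:.1f} cm: predicted relative dose = {relative_air_dose(t) / d0:.3f}")
```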

  11. Validation of a densimeter calibration procedure for a secondary calibration laboratory

    International Nuclear Information System (INIS)

    Alpizar Herrera, Juan Carlos

    2014-01-01

A survey was conducted to quantify the need for calibration of density measurement instruments at the research units of the Sede Rodrigo Facio of the Universidad de Costa Rica. A calibration procedure was documented for the instrument for which the survey showed the highest demand for calibration services. The INTE-ISO/IEC 17025:2005 standard, and specifically its section 5.4, was studied in order to document the densimeter calibration procedure. Densimeter calibration procedures and standards were sought from different national and international sources. The hydrostatic weighing (Cuckow) method was the basis of the defined procedure. The calibration procedure was documented and supporting documents were created: a data acquisition log, an intermediate calculation log and a calibration certificate copy. A trueness test was performed, using a national secondary calibration laboratory as the reference laboratory, as part of the validation of the documented procedure. E_n values of 0.41, 0.34 and 0.46 were obtained for the 90%, 50% and 10% calibration points of the densimeter scale, respectively. A reproducibility analysis of the method was performed with satisfactory results. Different suppliers were contacted to estimate the economic cost of the equipment and materials needed to implement the documented densimeter calibration method. The acquisition of an analytical balance, instead of a precision scale, was recommended in order to improve the results obtained with the documented method.
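
    The E_n scores reported above are the standard proficiency-testing statistic, E_n = (x_lab - x_ref) / sqrt(U_lab^2 + U_ref^2), computed with expanded uncertainties, where |E_n| <= 1 is considered satisfactory. The sketch below shows the calculation with invented density values and uncertainties, not the laboratory's actual data.

```python
import math

def en_score(x_lab, u_lab, x_ref, u_ref):
    """E_n statistic from laboratory and reference results and their
    expanded uncertainties (typically k = 2); |E_n| <= 1 is satisfactory."""
    return (x_lab - x_ref) / math.sqrt(u_lab**2 + u_ref**2)

# Hypothetical densimeter calibration points (g/cm^3) for the 90%, 50% and
# 10% positions of the scale; uncertainties are expanded (k = 2).
points = {
    "90%": (0.9984, 0.0005, 0.9981, 0.0004),
    "50%": (1.0502, 0.0006, 1.0499, 0.0004),
    "10%": (1.0998, 0.0006, 1.0993, 0.0005),
}

for label, (x_lab, u_lab, x_ref, u_ref) in points.items():
    en = en_score(x_lab, u_lab, x_ref, u_ref)
    verdict = "satisfactory" if abs(en) <= 1 else "questionable"
    print(f"{label}: E_n = {en:+.2f} ({verdict})")
```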

  12. Stepwise Procedure for Development and Validation of a Multipesticide Method

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

The stepwise procedure for the development and validation of so-called multi-pesticide methods is described. Principles, preliminary actions, and criteria for the selection of chromatographic separation, detection and performance verification of multi-pesticide methods are outlined. The long-term repeatability and reproducibility, as well as the necessity of documenting laboratory work, are also highlighted. Appendix I hereof describes in detail the calculation of calibration parameters, whereas Appendix II focuses on the calculation of the significance of differences between concentrations obtained on two different separation columns. (author)
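
    One of the checks mentioned above, the significance of the differences between concentrations obtained on two different separation columns (Appendix II), can be illustrated with a simple paired comparison. The sketch below applies a generic paired t-test to invented results; it is not necessarily the exact statistic prescribed in the Appendix.

```python
from scipy import stats

# Hypothetical pesticide concentrations (mg/kg) for the same extracts
# quantified on two different separation columns.
column_a = [0.112, 0.087, 0.145, 0.203, 0.076, 0.131, 0.098, 0.154]
column_b = [0.109, 0.091, 0.139, 0.210, 0.079, 0.127, 0.101, 0.149]

# Paired t-test on the column-to-column differences.
t_stat, p_value = stats.ttest_rel(column_a, column_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("The two columns give significantly different concentrations.")
else:
    print("No significant difference between the two columns detected.")
```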

  13. Validation procedures used in the Background Soil Characterization Project on the Oak Ridge Reservation, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1993-12-01

    The purpose of this report is (1) to document the data validation process developed for the Background Soil Characterization Project (BSCP); (2) to offer members of other project teams and potential data users the benefit of the experience gained in the BSCP in the area of developing project-specific data validation criteria and procedures based on best available guidance and technical information; and (3) to provide input and guidance to the efforts under way within Martin Marietta Energy Systems, Inc., to develop standard operating procedures to streamline and optimize the analytical laboratory data validation process for general use by making it more technically rigorous, consistent, and cost effective. Lessons learned from the BSCP are also provided to meet this end (Sect. 1.3)

  14. The validity and reliability of value-added and target-setting procedures with special reference to Key Stage 3

    OpenAIRE

    Moody, Ian Robin

    2003-01-01

    The validity of value-added systems of measurement is crucially dependent upon there being a demonstrably unambiguous relationship between the so-called baseline, or intake measures, and any subsequent measure of performance at a later stage. The reliability of such procedures is dependent on the relationships between these two measures being relatively stable over time. A number of questions arise with regard to both the validity and reliability of value-added procedures at any level in educ...

  15. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Science.gov (United States)

    2010-07-01

    ... EMPLOYMENT OPPORTUNITY, DEPARTMENT OF LABOR 3-UNIFORM GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... techniques contemplated by these guidelines usually should be followed if technically feasible. Where the...

  16. Spatial Distribution of Cosmetic-Procedure Businesses in Two U.S. Cities: A Pilot Mapping and Validation Study

    Science.gov (United States)

    Austin, S. Bryn; Gordon, Allegra R.; Kennedy, Grace A.; Sonneville, Kendrin R.; Blossom, Jeffrey; Blood, Emily A.

    2013-01-01

    Cosmetic procedures have proliferated rapidly over the past few decades, with over $11 billion spent on cosmetic surgeries and other minimally invasive procedures and another $2.9 billion spent on U.V. indoor tanning in 2012 in the United States alone. While research interest is increasing in tandem with the growth of the industry, methods have yet to be developed to identify and geographically locate the myriad types of businesses purveying cosmetic procedures. Geographic location of cosmetic-procedure businesses is a critical element in understanding the public health impact of this industry; however no studies we are aware of have developed valid and feasible methods for spatial analyses of these types of businesses. The aim of this pilot validation study was to establish the feasibility of identifying businesses offering surgical and minimally invasive cosmetic procedures and to characterize the spatial distribution of these businesses. We developed and tested three methods for creating a geocoded list of cosmetic-procedure businesses in Boston (MA) and Seattle (WA), USA, comparing each method on sensitivity and staff time required per confirmed cosmetic-procedure business. Methods varied substantially. Our findings represent an important step toward enabling rigorous health-linked spatial analyses of the health implications of this little-understood industry. PMID:24322394

  17. Spatial Distribution of Cosmetic-Procedure Businesses in Two U.S. Cities: A Pilot Mapping and Validation Study

    Directory of Open Access Journals (Sweden)

    S. Bryn Austin

    2013-12-01

Cosmetic procedures have proliferated rapidly over the past few decades, with over $11 billion spent on cosmetic surgeries and other minimally invasive procedures and another $2.9 billion spent on U.V. indoor tanning in 2012 in the United States alone. While research interest is increasing in tandem with the growth of the industry, methods have yet to be developed to identify and geographically locate the myriad types of businesses purveying cosmetic procedures. Geographic location of cosmetic-procedure businesses is a critical element in understanding the public health impact of this industry; however, no studies we are aware of have developed valid and feasible methods for spatial analyses of these types of businesses. The aim of this pilot validation study was to establish the feasibility of identifying businesses offering surgical and minimally invasive cosmetic procedures and to characterize the spatial distribution of these businesses. We developed and tested three methods for creating a geocoded list of cosmetic-procedure businesses in Boston (MA) and Seattle (WA), USA, comparing each method on sensitivity and staff time required per confirmed cosmetic-procedure business. Methods varied substantially. Our findings represent an important step toward enabling rigorous health-linked spatial analyses of the health implications of this little-understood industry.

  18. Calculation of piping loads due to filling procedures; Berechnung von Rohrleitungsbelastungen durch Fuellvorgaenge

    Energy Technology Data Exchange (ETDEWEB)

    Swidersky, Harald; Thiele, Thomas [TUeV Sued Industrie Service GmbH, Muenchen (Germany)

    2012-11-01

Filling procedures in piping systems are usually not load cases that are studied by fluid-dynamic and structure-dynamic analyses with respect to the integrity of pipes and supports. However, their frequency is higher than that of postulated accidental transients, and they therefore have to be considered in fatigue analyses. The piping and support loads due to filling procedures are caused by the density differences of the transported fluids, for instance in flows with the transport of gas bubbles. The duration of the momentum forces is defined by the flow velocity and the length of discontinuities in the piping segments. Filling procedures very often end with a shock pressure, caused by the impact and deceleration of the fluid front at smaller cross sections. The suitability of the thermal-hydraulics program RELAP5/MOD3.3 for the calculation of realistic loads from filling procedures was studied and the results were compared with experimental data. It is shown that, depending on the discretization level, the loads are in part significantly underestimated.

  19. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified the failure modes above the 90th percentile of RPN values as failure modes needing urgent corrective action; failure modes falling between the 75th and 90th percentiles of RPN values were identified as failure modes needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
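
    The prioritization rule described above (RPN = S x O x D, with the 90th and 75th percentiles of RPN values as action thresholds) is straightforward to reproduce. The sketch below applies it to a small invented list of failure modes; the scores are placeholders, not those of the two teams.

```python
import numpy as np

# Hypothetical failure modes with severity (S), occurrence (O) and
# detection (D) scores assigned by one FMEA team.
failure_modes = {
    "wrong mobile phase composition": (7, 4, 3),
    "column degradation":             (6, 5, 4),
    "autosampler carry-over":         (5, 3, 6),
    "detector wavelength drift":      (8, 2, 5),
    "integration parameters changed": (4, 6, 2),
    "sample mix-up":                  (9, 2, 7),
}

# Risk Priority Number for each failure mode and the percentile thresholds.
rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
values = np.array(list(rpn.values()))
p90, p75 = np.percentile(values, 90), np.percentile(values, 75)

for name, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
    if value >= p90:
        action = "urgent corrective action"
    elif value >= p75:
        action = "necessary corrective action"
    else:
        action = "no immediate action"
    print(f"RPN {value:3d}  {name}: {action}")
```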

  20. Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure Validation Simulation Study

    Science.gov (United States)

    Murdoch, Jennifer L.; Bussink, Frank J. L.; Chamberlain, James P.; Chartrand, Ryan C.; Palmer, Michael T.; Palmer, Susan O.

    2008-01-01

    The Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure (ITP) Validation Simulation Study investigated the viability of an ITP designed to enable oceanic flight level changes that would not otherwise be possible. Twelve commercial airline pilots with current oceanic experience flew a series of simulated scenarios involving either standard or ITP flight level change maneuvers and provided subjective workload ratings, assessments of ITP validity and acceptability, and objective performance measures associated with the appropriate selection, request, and execution of ITP flight level change maneuvers. In the majority of scenarios, subject pilots correctly assessed the traffic situation, selected an appropriate response (i.e., either a standard flight level change request, an ITP request, or no request), and executed their selected flight level change procedure, if any, without error. Workload ratings for ITP maneuvers were acceptable and not substantially higher than for standard flight level change maneuvers, and, for the majority of scenarios and subject pilots, subjective acceptability ratings and comments for ITP were generally high and positive. Qualitatively, the ITP was found to be valid and acceptable. However, the error rates for ITP maneuvers were higher than for standard flight level changes, and these errors may have design implications for both the ITP and the study's prototype traffic display. These errors and their implications are discussed.

  1. Assessing Women's Responses to Sexual Threat: Validity of a Virtual Role-Play Procedure

    Science.gov (United States)

    Jouriles, Ernest N.; Rowe, Lorelei Simpson; McDonald, Renee; Platt, Cora G.; Gomez, Gabriella S.

    2011-01-01

    This study evaluated the validity of a role-play procedure that uses virtual reality technology to assess women's responses to sexual threat. Forty-eight female undergraduate students were randomly assigned to either a standard, face-to-face role-play (RP) or a virtual role-play (VRP) of a sexually coercive situation. A multimethod assessment…

  2. Handling of radiopharmaceuticals drugs in hot cell: Implementation and validation of new hygiene procedures

    International Nuclear Information System (INIS)

    Levigoureux, E.; Hoffman, A.; Bolot, C.; Aulagner, G.; Brun, J.

    2012-01-01

Exigencies associated with radiopharmaceutical drugs require the validation of hygiene procedures. Different bio-cleaning processes were applied. For each, samples were collected by contact agar from the work surface, the preparation field and the cap of the multi-dose vial. Twenty-two radiopharmaceutical preparations were inoculated into different liquid media. The results show that modification of the bio-cleaning process enables a reduction of microbiological contamination (p = 0.0196). All preparations passed the sterility tests. These results thus enabled the validation of the new hygiene process in the radiopharmacy unit. (authors)

  3. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability.

    Science.gov (United States)

    Vermeulen, Margit I; Tromp, Fred; Zuithoff, Nicolaas P A; Pieters, Ron H M; Damoiseaux, Roger A M J; Kuyvenhoven, Marijke M

    2014-12-01

Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study on a newly designed competency-based selection procedure that assesses whether candidates have the competencies required to complete GP training. The objective was to explore reliability and validity aspects of the instruments developed. The new selection procedure, comprising the National GP Knowledge Test (LHK), a situational judgement test (SJT), a patterned behaviour descriptive interview (PBDI) and a simulated encounter (SIM), was piloted alongside the current procedure. Forty-seven candidates volunteered in both procedures. The admission decision was based on the results of the current procedure. Study participants hardly differed from the other candidates. The mean scores of the candidates on the LHK and SJT were 21.9% (SD 8.7) and 83.8% (SD 3.1), respectively. The mean self-reported competency scores (PBDI) were higher than the observed competencies (SIM): 3.7 (SD 0.5) and 2.9 (SD 0.6), respectively. Content-related competencies showed low correlations with one another when measured with different instruments, whereas more diverse competencies measured by a single instrument showed strong to moderate correlations. Moreover, a moderate correlation between LHK and SJT was found. The internal consistencies (intraclass correlation, ICC) of the LHK and SJT were poor, while the ICCs of the PBDI and SIM showed acceptable levels of reliability. Findings on the content validity and reliability of these new instruments are promising for realizing a competency-based procedure. Further development of the instruments and research on predictive validity should be pursued.

  4. Application of the neo-deterministic seismic microzonation procedure in Bulgaria and validation of the seismic input against Eurocode 8

    International Nuclear Information System (INIS)

    Paskaleva, I.; Kouteva, M.; Vaccari, F.; Panza, G.F.

    2008-03-01

The earthquake record and the Code for design and construction in seismic regions in Bulgaria have shown that the territory of the Republic of Bulgaria is exposed to a high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong motion database is quite limited, and therefore not at all representative of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure to two main Bulgarian cities has supplied a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking simultaneously and correctly into consideration the contributions to the earthquake ground motion at the target sites of the seismic source and of the seismic wave propagation in the crossed media. In this study we discuss the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modelled seismic input against Eurocode 8 and the few available records at these sites is discussed. (author)

  5. Validity of the Draw-a-Person: Screening Procedure for Emotional Disturbance (DAP:SPED) in Strengths-Based Assessment

    Science.gov (United States)

    Matto, Holly C.; Naglieri, Jack A.; Clausen, Cinny

    2005-01-01

    Objective: This is the first validity study to date to examine the relationship between the Draw-A-Person: Screening Procedure for Emotional Disturbance (DAP:SPED) and strengths-based emotional and behavioral measures. The incremental predictive validity of the DAP:SPED relative to the Behavioral and Emotional Rating Scale was examined. Method:…

  6. Measuring production loss due to health and work environment problems: construct validity and implications.

    Science.gov (United States)

    Karlsson, Malin Lohela; Bergström, Gunnar; Björklund, Christina; Hagberg, Jan; Jensen, Irene

    2013-12-01

    The aim was to validate two measures of production loss, health-related and work environment-related production loss, concerning their associations with health status and work environment factors. Validity was assessed by evaluating the construct validity. Health-related problems and work environment-related problems (or factors) were included in separate analyses and evaluated regarding the significant difference in the proportion of explained variation (R²) of production loss. Health-related production loss was not found to fulfill the criteria for convergent validity in this study; however, the measure of work environment-related production loss did fulfill the criteria that were set up. The measure of work environment-related production loss can be used to screen for production loss due to work environment problems, as well as serve as an outcome measure when evaluating the effect of organizational interventions.

  7. Monte Carlo evaluation of hand and finger doses due to exposure to 18F in PET procedures

    International Nuclear Information System (INIS)

    Pessanha, Paula R.; Queiroz Filho, Pedro P.; Santos, Denison S.; Mauricio, Claudia L.P.

    2011-01-01

    The increasing number of PET procedures performed in nuclear medicine, and, consequently, of workers handling radiopharmaceuticals, is a potential radiation protection hazard. It is therefore necessary to evaluate the doses received by workers employed in PET practice. In this work, the Geant4 Monte Carlo code was used to evaluate doses to the fingers and hands of those workers. A geometric phantom representing the hand of the professional involved in the clinical procedure was implemented in the simulation code, with the dimensions of a standard man's forearm, in order to assess the exposure of the extremities. The geometric phantom is designed so that a simple definition of joint angles configures the fingers, allowing investigations into alternative configurations. Thus, it was possible to position the phantom fingers so as to simulate all forms of syringe manipulation, and subsequently obtain exposure data relating to the administration of the PET radiopharmaceutical to the patient. The simulation was validated by the irradiation of a REMAB® hand phantom, consisting of a human skeleton hand covered by a Tenite II shell, which can be filled with water. Air kerma values were obtained from the beam dosimetry, which was done with a calibrated ionization chamber. The readings of TLDs placed at certain points on the surface of the phantom were compared with the values obtained in the Monte Carlo simulation. After validation of the program, dose values were obtained for the PET procedure, simulating syringes with and without shielding. (author)

  8. 34 CFR Appendix C to Part 682 - Procedures for Curing Violations of the Due Diligence in Collection and Timely Filing of Claims...

    Science.gov (United States)

    2010-07-01

    Title 34 (Education), 2010-07-01 edition. Appendix C to Part 682—Procedures for Curing Violations of the Due Diligence in Collection and Timely Filing of Claims… The appendix sets out procedures for lenders to use (1) to cure violations of the requirements for due diligence in collection (“due diligence…

  9. Validity, reliability, feasibility, acceptability and educational impact of direct observation of procedural skills (DOPS).

    Science.gov (United States)

    Naeem, Naghma

    2013-01-01

    Direct observation of procedural skills (DOPS) is a new workplace assessment tool. The aim of this narrative review of the literature is to summarize the available evidence about the validity, reliability, feasibility, acceptability and educational impact of DOPS. A PubMed and Google search of the literature on DOPS published from January 2000 to January 2012 was conducted, which yielded 30 articles. Thirteen articles were selected for full-text reading and review. In the reviewed literature, DOPS was found to be a useful tool for the assessment of procedural skills, but further research is required to prove its utility as a workplace-based assessment instrument.

  10. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    Science.gov (United States)

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  11. Mediation and Due Process Procedures in Special Education: An Analysis of State Policies. Final Report. Project FORUM.

    Science.gov (United States)

    Ahearn, Eileen M.

    This survey of 50 states and 3 of 10 non-state U.S. jurisdictions concerning state due process procedures focuses mainly on the use of mediation as a form of dispute resolution that offers an alternative to due process hearings in special education. A background section discusses the definition of mediation and the mediation process. Survey…

  12. A formal language to describe a wide class of failure detection and signal validation procedures

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Atomic Energy Research Inst.

    1996-01-01

    In the present article we take the first step towards the implementation of a user-friendly, object-oriented system devoted to failure detection and signal validation purposes. After reviewing different signal modelling, residual generation and hypothesis testing procedures, a mathematical tool is suggested to describe a general failure detection problem. Three different levels of abstraction are distinguished: direct examination, a preliminary decision support mechanism and indirect examination. Possible scenarios are introduced, depending both on the objective properties of the investigated signal and on the particular requirements prescribed by the expert. Finally, it is shown how to systematically build up a complete, general failure detection procedure. (author).

  13. A validity generalization procedure to test relations between intrinsic and extrinsic motivation and influence tactics.

    Science.gov (United States)

    Barbuto, John E; Moss, Jennifer A

    2006-08-01

    The relations of intrinsic and extrinsic motivation with the use of consultative, legitimating, and pressure influence tactics were examined using validity generalization procedures. Five to seven field studies with cumulative samples exceeding 800 were used to test each relationship. Significant relations were found between agents' intrinsic motivation and their use of consultative influence tactics, and between agents' extrinsic motivation and their use of legitimating influence tactics.
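    The core arithmetic behind such a validity generalization (bare-bones meta-analysis) step can be illustrated with a short sketch. The study counts and correlations below are hypothetical, not the values analyzed in the abstract; the sketch only shows the sample-size-weighted mean correlation and the removal of sampling-error variance.

```python
import numpy as np

# Hypothetical per-study results (sample size, observed correlation) -- illustrative only.
studies = [(120, 0.18), (160, 0.25), (95, 0.10), (210, 0.22), (230, 0.19)]

n = np.array([s[0] for s in studies], dtype=float)
r = np.array([s[1] for s in studies])

# Sample-size-weighted mean correlation (Hunter-Schmidt "bare bones" meta-analysis).
r_bar = np.sum(n * r) / np.sum(n)

# Observed variance of correlations and expected sampling-error variance.
var_obs = np.sum(n * (r - r_bar) ** 2) / np.sum(n)
var_err = (1.0 - r_bar ** 2) ** 2 / (np.mean(n) - 1.0)

# Residual ("true") variance after removing sampling error; negatives are set to zero.
var_rho = max(var_obs - var_err, 0.0)

print(f"weighted mean r = {r_bar:.3f}")
print(f"observed variance = {var_obs:.4f}, sampling-error variance = {var_err:.4f}")
print(f"estimated true variance = {var_rho:.4f}")
```

    If the residual variance is close to zero, sampling error accounts for most of the variation across studies, which is the usual validity generalization argument.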

  14. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    International Nuclear Information System (INIS)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves

    2017-01-01

    In this study, optimization of procedures and standardization of the Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)
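    To make the 2^k design concrete, the sketch below computes main and two-factor interaction effects for a 2^3 full factorial on the three counting variables mentioned in the abstract (decay time, counting time, detector distance). The run order and the mass-fraction responses are invented for illustration; the authors' actual data and acceptance criteria are not reproduced.

```python
import itertools
import numpy as np

# Coded levels (-1/+1) for three variables of the counting procedure:
# decay time, counting time, sample-to-detector distance (all hypothetical).
runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical mass-fraction results (mg/kg) for the eight runs, in standard order.
y = np.array([10.2, 10.4, 10.1, 10.5, 10.3, 10.6, 10.2, 10.7])

labels = ["decay", "count", "dist"]
effects = {}

# Main effects: mean response at the +1 level minus mean response at the -1 level.
for j, name in enumerate(labels):
    effects[name] = y[runs[:, j] > 0].mean() - y[runs[:, j] < 0].mean()

# Two-factor interaction effects: contrast of the product column.
for a, b in itertools.combinations(range(3), 2):
    contrast = runs[:, a] * runs[:, b]
    effects[f"{labels[a]}x{labels[b]}"] = y[contrast > 0].mean() - y[contrast < 0].mean()

for name, eff in effects.items():
    print(f"effect of {name}: {eff:+.3f}")
```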

  15. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves, E-mail: uandapaula@gmail.com, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In this study, optimization of procedures and standardization of the Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)

  16. URANS simulations of the tip-leakage cavitating flow with verification and validation procedures

    Science.gov (United States)

    Cheng, Huai-yu; Long, Xin-ping; Liang, Yun-zhi; Long, Yun; Ji, Bin

    2018-04-01

    In the present paper, the Vortex Identified Zwart-Gerber-Belamri (VIZGB) cavitation model coupled with the SST-CC turbulence model is used to investigate the unsteady tip-leakage cavitating flow induced by a NACA0009 hydrofoil. A qualitative comparison between the numerical and experimental results is made. In order to quantitatively evaluate the reliability of the numerical data, the verification and validation (V&V) procedures are used in the present paper. Errors of numerical results are estimated with seven error estimators based on the Richardson extrapolation method. It is shown that though a strict validation cannot be achieved, a reasonable prediction of the gross characteristics of the tip-leakage cavitating flow can be obtained. Based on the numerical results, the influence of the cavitation on the tip-leakage vortex (TLV) is discussed, which indicates that the cavitation accelerates the fusion of the TLV and the tip-separation vortex (TSV). Moreover, the trajectory of the TLV, when the cavitation occurs, is close to the side wall.
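    Richardson-extrapolation-based error estimation of the kind mentioned above can be sketched in a few lines. The three grid solutions, refinement ratio and safety factor below are assumed values chosen for illustration; the seven estimators used in the paper are not reproduced here, only the common observed-order/grid-convergence-index variant.

```python
import math

# Solutions of an integral quantity on three systematically refined grids
# (coarse -> fine), with a constant refinement ratio; values are hypothetical.
f_coarse, f_medium, f_fine = 0.870, 0.842, 0.831
ratio = 2.0          # grid refinement ratio
safety_factor = 1.25 # typical GCI safety factor for three-grid studies

# Observed order of accuracy from the three solutions.
p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(ratio)

# Richardson-extrapolated estimate of the grid-independent solution.
f_exact = f_fine + (f_fine - f_medium) / (ratio ** p - 1.0)

# Grid convergence index (relative numerical uncertainty) on the fine grid.
gci_fine = safety_factor * abs((f_fine - f_medium) / f_fine) / (ratio ** p - 1.0)

print(f"observed order p = {p:.2f}")
print(f"extrapolated value = {f_exact:.4f}")
print(f"GCI (fine grid) = {100 * gci_fine:.2f}%")
```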

  17. Reliability and validity of procedure-based assessments in otolaryngology training.

    Science.gov (United States)

    Awad, Zaid; Hayden, Lindsay; Robson, Andrew K; Muthuswamy, Keerthini; Tolley, Neil S

    2015-06-01

    To investigate the reliability and construct validity of procedure-based assessment (PBA) in assessing performance and progress in otolaryngology training. Retrospective analysis using a national electronic database. We analyzed PBAs of otolaryngology trainees in North London, from core trainees (CTs) to specialty trainees (STs). The tool contains six multi-item domains: consent, planning, preparation, exposure/closure, technique, and postoperative care, rated as "satisfactory" or "development required", in addition to an overall performance rating (pS) of 1 to 4. The individual domain score, overall calculated score (cS), and number of "development-required" items were calculated for each PBA. Receiver operating characteristic analysis helped determine sensitivity and specificity. There were 3,152 otolaryngology PBAs from 46 otolaryngology trainees analyzed. PBA reliability was high (Cronbach's α 0.899), and sensitivity approached 99%. cS correlated positively with pS and level in training (rs: +0.681 and +0.324, respectively). STs had higher cS and pS than CTs (93% ± 0.6 and 3.2 ± 0.03 vs. 71% ± 3.1 and 2.3 ± 0.08, respectively). The PBA is reliable and valid for assessing otolaryngology trainees' performance and progress at all levels. It is highly sensitive in identifying competent trainees. The tool is used in a formative and feedback capacity. The technical domain is the best predictor and should be given close attention. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  18. Procedure for the Selection and Validation of a Calibration Model I-Description and Application.

    Science.gov (United States)

    Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D

    2017-05-01

    Calibration model selection is required for all quantitative methods in toxicology and, more broadly, in bioanalysis. This typically involves selecting the equation order (quadratic or linear) and the weighting factor correctly modeling the data. A mis-selection of the calibration model will generate lower quality control (QC) accuracy, with errors of up to 154%. Unfortunately, simple tools to perform this selection and tests to validate the resulting model are lacking. We present a stepwise, analyst-independent scheme for the selection and validation of calibration models. The success rate of this scheme is on average 40% higher than a traditional "fit and check the QC accuracy" method of selecting the calibration model. Moreover, the process was completely automated through a script (available in Supplemental Data 3) running in RStudio (free, open-source software). The need for weighting was assessed through an F-test using the variances of the upper limit of quantification and lower limit of quantification replicate measurements. When weighting was required, the choice between 1/x and 1/x² was determined by calculating which option generated the smallest spread of weighted normalized variances. Finally, the model order was selected through a partial F-test. The chosen calibration model was validated through Cramér-von Mises or Kolmogorov-Smirnov normality testing of the standardized residuals. The performance of the different tests was assessed using 50 simulated data sets per possible calibration model (e.g., linear-no weight, quadratic-no weight, linear-1/x, etc.). This first of two papers describes the tests, procedures and outcomes of the developed procedure using real LC-MS-MS results for the quantification of cocaine and naltrexone. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
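    The statistical core of the scheme described above can be sketched as follows. The replicate responses and calibration points are invented, and the exact acceptance thresholds of the published procedure are not reproduced; the sketch only shows the general form of the variance F-test for weighting and the partial F-test for model order.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate responses at the lower and upper limits of quantification.
lloq = np.array([0.101, 0.097, 0.104, 0.099, 0.102])
uloq = np.array([9.60, 10.30, 9.10, 10.80, 9.50])

# F-test on the variances of the ULOQ and LLOQ replicates: a significant ratio
# suggests heteroscedasticity, i.e. a weighted calibration model is needed.
f_ratio = np.var(uloq, ddof=1) / np.var(lloq, ddof=1)
df = len(uloq) - 1, len(lloq) - 1
p_value = 1.0 - stats.f.cdf(f_ratio, *df)
print(f"variance ratio F = {f_ratio:.1f}, p = {p_value:.4f}")

# Partial F-test for the model order: does adding a quadratic term to the
# calibration curve significantly reduce the residual sum of squares?
x = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])                # concentrations
y = np.array([0.012, 0.058, 0.119, 0.241, 0.612, 1.245])     # instrument response

def rss(degree):
    coeffs = np.polyfit(x, y, degree)
    return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

rss_lin, rss_quad = rss(1), rss(2)
f_partial = (rss_lin - rss_quad) / (rss_quad / (len(x) - 3))
p_partial = 1.0 - stats.f.cdf(f_partial, 1, len(x) - 3)
print(f"partial F = {f_partial:.2f}, p = {p_partial:.4f} "
      f"({'quadratic' if p_partial < 0.05 else 'linear'} model suggested)")
```

    A normality test of the standardized residuals (e.g., scipy.stats.kstest or, in recent SciPy versions, scipy.stats.cramervonmises) would complete the validation step described in the abstract.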

  19. Experimental Validation Of An Innovative Procedure For The Rolling Noise Correction

    Directory of Open Access Journals (Sweden)

    Viscardi Massimo

    2017-01-01

    Full Text Available Among the wide context of train vehicle rolling noise evaluation, the aim of the paper is the development, implementation and experimental testing of a new method for roughness calculation according to FprCEN/TR 16891:2015 and the subsequent evaluation of the correction parameters of the measured rolling noise due to the presence of non-compliant rail roughness. Rolling noise tests are, in fact, very often performed on standard in-operation rails that are characterized by roughness profiles very different from the standard ones, such as those prescribed within the ISO 3095 procedure. Very often, this difference leads to the presence of an excess noise that needs to be evaluated and corrected for a proper definition of the phenomena. Within the paper, the implementation of the procedure is presented and then verified in an operational experimental context; forecast and measured data are compared and subsequently commented upon.

  20. A procedure validation for high conversion reactors fuel elements calculation

    International Nuclear Information System (INIS)

    Ishida, V.N.; Patino, N.E.; Abbate, M.J.; Sbaffoni, M.M.

    1990-01-01

    The present work includes the validation of the procedure for cross-section generation starting from nuclear data and of the calculation system currently used at the Bariloche Atomic Center Reactor and Neutrons Division, for its application to the fuel element calculations of a high conversion reactor (HCR). For this purpose, the fuel element calculation of a High Conversion Boiling Water Reactor (HCBWR) was chosen as the reference problem, employing the Monte Carlo method. Various cases were considered: with and without control rods, cold or hot, at different void fractions. Multiplication factors, reaction rates, power maps and peak factors were compared. A sensitivity analysis was performed on the typical cells used, the approximations employed to solve the transport equation (Sn or diffusion), the 1-D or 2-D representation and the refinement of the spatial mesh, with the aim of evaluating their influence on the parameters studied and arriving at an optimum combination to be used in future design calculations. (Author) [es

  1. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in the future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided, and it is shown how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  2. Are we really measuring what we say we're measuring? Using video techniques to supplement traditional construct validation procedures.

    Science.gov (United States)

    Podsakoff, Nathan P; Podsakoff, Philip M; Mackenzie, Scott B; Klinger, Ryan L

    2013-01-01

    Several researchers have persuasively argued that the most important evidence to consider when assessing construct validity is whether variations in the construct of interest cause corresponding variations in the measures of the focal construct. Unfortunately, the literature provides little practical guidance on how researchers can go about testing this. Therefore, the purpose of this article is to describe how researchers can use video techniques to test whether their scales measure what they purport to measure. First, we discuss how researchers can develop valid manipulations of the focal construct that they hope to measure. Next, we explain how to design a study to use this manipulation to test the validity of the scale. Finally, comparing and contrasting traditional and contemporary perspectives on validation, we discuss the advantages and limitations of video-based validation procedures. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  3. Validation of standard operating procedures in a multicenter retrospective study to identify -omics biomarkers for chronic low back pain.

    Directory of Open Access Journals (Sweden)

    Concetta Dagostino

    Full Text Available Chronic low back pain (CLBP) is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for huge societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine). Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for the recruitment of large numbers of participants among different centres (clinical and laboratories) to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs) for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres) retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1) blood collection, (2) sample processing and storage, (3) shipping details and (4) cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicentre study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large -omics-based multicentre studies.

  4. Validation of the in-flight calibration procedures for the MICROSCOPE space mission

    Science.gov (United States)

    Hardy, Émilie; Levy, Agnès; Rodrigues, Manuel; Touboul, Pierre; Métris, Gilles

    2013-11-01

    The MICROSCOPE space mission aims to test the Equivalence Principle with an accuracy of 10⁻¹⁵. The drag-free micro-satellite will orbit around the Earth and embark a differential electrostatic accelerometer including two cylindrical test masses submitted to the same gravitational field and made of different materials. The experiment consists in testing the equality of the electrostatic accelerations applied to the masses to maintain them relatively motionless. The accuracy of the measurements exploited for the test of the Equivalence Principle is limited by our a priori knowledge of several physical parameters of the instrument. These parameters are partially estimated on-ground, but with insufficient accuracy, and an in-orbit calibration is therefore required to correct the measurements. The calibration procedures have been defined and their analytical performances have been evaluated. In addition, a simulator software including the dynamics model of the instrument, the satellite drag-free system and the perturbing environment has been developed to numerically validate the analytical results. After an overall presentation of the MICROSCOPE mission, this paper will describe the calibration procedures and focus on the simulator. Such an in-flight calibration is mandatory for similar space missions taking advantage of a drag-free system.

  5. Estimation of the collective dose in the Portuguese population due to medical procedures in 2010

    International Nuclear Information System (INIS)

    Teles, Pedro; Vaz, Pedro; Sousa, M. Carmen de; Paulo, Graciano; Santos, Joana; Pascoal, Ana; Cardoso, Gabriela; Santos, Ana Isabel; Lanca, Isabel; Matela, Nuno; Janeiro, Luis; Sousa, Patrick; Carvoeiras, Pedro; Parafita, Rui; Simaozinho, Paula

    2013-01-01

    In a wide range of medical fields, technological advancements have led to an increase in the average collective dose in national populations worldwide. Periodic estimation of the average collective population dose due to medical exposure is therefore of utmost importance, and is now mandatory in countries within the European Union (article 12 of EURATOM directive 97/43). Presented in this work is a report on the estimation of the collective dose in the Portuguese population due to nuclear medicine diagnostic procedures and the Top 20 diagnostic radiology examinations, which represent the 20 exams that contribute the most to the total collective dose in diagnostic radiology and interventional procedures in Europe. This work involved the collaboration of a multidisciplinary taskforce comprising representatives of all major Portuguese stakeholders (universities, research institutions, public and private health care providers, administrative services of the National Healthcare System, scientific and professional associations and private service providers). This allowed us to gather the comprehensive amount of data necessary for a robust estimation of the collective effective dose to the Portuguese population. The methodology used for data collection and dose estimation was based on European Commission recommendations, as this work was performed in the framework of the Europe-wide Dose Datamed II project. This is the first study estimating the collective dose for the population in Portugal considering such a wide national coverage and range of procedures, and it constitutes important baseline reference data. The taskforce intends to continue developing periodic collective dose estimations in the future. The estimated annual average effective dose for the Portuguese population was 0.080±0.017 mSv caput⁻¹ for nuclear medicine exams and 0.96±0.68 mSv caput⁻¹ for the Top 20 diagnostic radiology exams. (authors)
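    The per-caput figures quoted above follow from a simple aggregation: the collective dose is the sum over examination types of frequency times typical effective dose, and dividing by the population gives the per-caput value. The sketch below illustrates this arithmetic with invented examination frequencies and doses, not the Portuguese survey data.

```python
# Hypothetical examination frequencies and typical effective doses per exam (mSv);
# these are illustrative values, not the Portuguese survey data.
exams = {
    "chest CT":          {"count": 250_000,   "dose_mSv": 7.0},
    "abdominal CT":      {"count": 180_000,   "dose_mSv": 10.0},
    "chest X-ray":       {"count": 2_000_000, "dose_mSv": 0.1},
    "PET/CT ([18F]FDG)": {"count": 20_000,    "dose_mSv": 14.0},
}

population = 10_500_000  # approximate population, hypothetical

# Collective effective dose in man-Sv: sum of (frequency x typical dose per exam).
collective_man_sv = sum(e["count"] * e["dose_mSv"] for e in exams.values()) / 1000.0

# Annual average effective dose per caput in mSv.
per_caput_msv = collective_man_sv * 1000.0 / population

print(f"collective dose ~ {collective_man_sv:.0f} man-Sv")
print(f"average per-caput dose ~ {per_caput_msv:.2f} mSv/caput")
```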

  6. Validation of a new analytical procedure for determination of residual solvents in [18F]FDG by gas chromatography

    International Nuclear Information System (INIS)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D.

    2017-01-01

    Fludeoxyglucose F 18 ([18F]FDG) is the most used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [18F]FDG; however, they might not be completely removed during the purification steps. The determination of residual solvents in [18F]FDG is required by the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work, a simple and fast (4-minute) analytical procedure was developed and validated for the determination of residual solvents in [18F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53 mm x 30 m fused-silica column. Validation included the evaluation of various parameters, such as: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating that the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for the routine quality control of [18F]FDG. (author)
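    Several of the validation parameters listed above (linearity, detection limit, quantitation limit) follow standard calculations that can be sketched briefly. The calibration data below are invented, and the ICH-style 3.3σ/S and 10σ/S formulas are used as a generic illustration rather than the authors' exact acceptance criteria.

```python
import numpy as np

# Hypothetical calibration data for a residual solvent (e.g. ethanol):
# concentration in ppm vs. GC-FID peak area (arbitrary units).
conc = np.array([50, 100, 250, 500, 1000, 2000], dtype=float)
area = np.array([1020, 2110, 5180, 10350, 20600, 41100], dtype=float)

# Least-squares regression line: slope S and intercept, plus r^2 for linearity.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residuals = area - pred
r2 = 1.0 - np.sum(residuals**2) / np.sum((area - area.mean())**2)

# Residual standard deviation of the regression (sigma).
sigma = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

# ICH-style detection and quantitation limits.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope

print(f"slope = {slope:.3f}, r^2 = {r2:.5f}")
print(f"LOD ~ {lod:.1f} ppm, LOQ ~ {loq:.1f} ppm")
```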

  7. Procedural Due Process for Students at Public Colleges and Universities.

    Science.gov (United States)

    Golden, Edward J.

    1982-01-01

    Reports the findings of a study to determine what procedural protections are afforded students at public colleges and universities who are faced with disciplinary or academic dismissal. The data are from 62 of the 85 public postsecondary institutions asked to provide published procedural guidelines. (Author/MLF)

  8. Automatic segmentation of rotational x-ray images for anatomic intra-procedural surface generation in atrial fibrillation ablation procedures.

    Science.gov (United States)

    Manzke, Robert; Meyer, Carsten; Ecabert, Olivier; Peters, Jochen; Noordhoek, Niels J; Thiagalingam, Aravinda; Reddy, Vivek Y; Chan, Raymond C; Weese, Jürgen

    2010-02-01

    Since the introduction of 3-D rotational X-ray imaging, protocols for 3-D rotational coronary artery imaging have become widely available in routine clinical practice. Intra-procedural cardiac imaging in a computed tomography (CT)-like fashion has been particularly compelling due to the reduction of clinical overhead and ability to characterize anatomy at the time of intervention. We previously introduced a clinically feasible approach for imaging the left atrium and pulmonary veins (LAPVs) with short contrast bolus injections and scan times of approximately 4-10 s. The resulting data have sufficient image quality for intra-procedural use during electro-anatomic mapping (EAM) and interventional guidance in atrial fibrillation (AF) ablation procedures. In this paper, we present a novel technique for intra-procedural surface generation which integrates fully-automated segmentation of the LAPVs for guidance in AF ablation interventions. Contrast-enhanced rotational X-ray angiography (3-D RA) acquisitions in combination with filtered-back-projection-based reconstruction allow for volumetric interrogation of LAPV anatomy in near-real-time. An automatic model-based segmentation algorithm allows for fast and accurate LAPV mesh generation despite the challenges posed by image quality; relative to pre-procedural cardiac CT/MR, 3-D RA images suffer from more artifacts and reduced signal-to-noise. We validate our integrated method by comparing 1) automatic and manual segmentations of intra-procedural 3-D RA data, 2) automatic segmentations of intra-procedural 3-D RA and pre-procedural CT/MR data, and 3) intra-procedural EAM point cloud data with automatic segmentations of 3-D RA and CT/MR data. Our validation results for automatically segmented intra-procedural 3-D RA data show average segmentation errors of 1) approximately 1.3 mm compared with manual 3-D RA segmentations, 2) approximately 2.3 mm compared with automatic segmentation of pre-procedural CT/MR data and 3

  9. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities by calculations of appropriate experiments done by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which lead to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results are still valid with succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test case under consideration. In the case of post-calculations of experiments, a comparison with experimental data is also carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high-quality modelling of thermal-hydraulics in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results also show very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test case of the THAI IOD-11 experiment. Within this test case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing
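    The regression-testing idea described above (same input deck, two code versions, comparison of characteristic parameters) can be sketched as a small script. The file names, result format and 2% tolerance are assumptions made for illustration; they are not the GRS implementation.

```python
import csv

TOLERANCE = 0.02  # 2 % relative deviation allowed between code versions (assumed)

def load_results(path):
    """Read 'parameter,value' pairs from a CSV result file (hypothetical format)."""
    with open(path, newline="") as f:
        return {row["parameter"]: float(row["value"]) for row in csv.DictReader(f)}

def regression_check(reference_file, candidate_file):
    ref = load_results(reference_file)    # results from the preceding code version
    cand = load_results(candidate_file)   # results from the up-to-date version
    failures = []
    for name, ref_value in ref.items():
        new_value = cand.get(name)
        if new_value is None:
            failures.append(f"{name}: missing in candidate results")
            continue
        rel_dev = abs(new_value - ref_value) / max(abs(ref_value), 1e-12)
        if rel_dev > TOLERANCE:
            failures.append(f"{name}: {ref_value:g} -> {new_value:g} "
                            f"({100 * rel_dev:.1f}% deviation)")
    return failures

if __name__ == "__main__":
    # Hypothetical result files exported from two successive code versions.
    problems = regression_check("v2.0r1_results.csv", "v2.0r2_results.csv")
    print("PASS" if not problems else "FAIL:\n" + "\n".join(problems))
```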

  10. A procedure for safety assessment of components with cracks - Handbook

    International Nuclear Information System (INIS)

    Andersson, P.; Bergman, M.; Brickstad, B.; Dahlberg, L.; Nilsson, F.; Sattari-Far, I.

    1996-01-01

    In this handbook, a procedure is described which can be used both for the assessment of detected cracks or crack-like defects and for defect tolerance analysis. The procedure can be used to calculate possible crack growth due to fatigue or stress corrosion and to calculate the reserve margin against failure due to fracture and plastic collapse. For ductile materials, the procedure gives the reserve margin against initiation of stable crack growth. Thus, an extra reserve margin, of unknown size, exists against failure in components made of ductile materials. The procedure was developed for operative use with the following objectives in mind: The procedure should be able to handle both linear and non-linear problems without any a priori division; The procedure shall ensure uniqueness of the safety assessment; The procedure should be well defined and easy to use; The conservatism of the procedure should be well validated; The handbook that documents the procedure should be so complete that for most assessments access to any other fracture mechanics literature should not be necessary. The method utilized is based on the R6 method developed at Nuclear Electric plc. This method can in principle be used for all metallic materials. It is, however, more extensively verified for steel alloys only. The method is not intended for use at temperatures where creep deformation is of importance. The first edition of the handbook was released in 1990 and the second in 1991. This third edition has been extensively revised. A Windows-based program (SACC) has been developed which can perform the assessments described in the book, including the calculation of crack growth due to stress corrosion and fatigue. 52 refs., 27 figs., 35 tabs
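    A minimal sketch of the kind of assessment-point evaluation used in R6-type procedures is shown below, using the commonly published Option-1 failure assessment curve. The input values, the simple Lr cut-off and the absence of partial safety factors are illustrative assumptions; the handbook's own curves, categories and margins are not reproduced.

```python
import math

def assessment_curve(lr, lr_max=1.2):
    """Commonly published Option-1 failure assessment curve (R6-type procedures)."""
    if lr > lr_max:  # simple cut-off; real procedures use a material-dependent limit
        return 0.0
    return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def check_point(k_i, k_mat, sigma_ref, sigma_y, lr_max=1.2):
    """
    Evaluate a single assessment point.
    k_i       : applied stress intensity factor [MPa*sqrt(m)]
    k_mat     : material fracture toughness     [MPa*sqrt(m)]
    sigma_ref : reference (net-section) stress  [MPa]
    sigma_y   : yield strength                  [MPa]
    """
    kr = k_i / k_mat          # brittle-fracture axis
    lr = sigma_ref / sigma_y  # plastic-collapse axis
    acceptable = kr <= assessment_curve(lr, lr_max)
    return kr, lr, acceptable

if __name__ == "__main__":
    # Hypothetical numbers for a postulated crack in a pipe weld.
    kr, lr, ok = check_point(k_i=45.0, k_mat=120.0, sigma_ref=180.0, sigma_y=300.0)
    print(f"Kr = {kr:.2f}, Lr = {lr:.2f} -> {'acceptable' if ok else 'not acceptable'}")
```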

  11. Delphi Method Validation of a Procedural Performance Checklist for Insertion of an Ultrasound-Guided Internal Jugular Central Line.

    Science.gov (United States)

    Hartman, Nicholas; Wittler, Mary; Askew, Kim; Manthey, David

    2016-01-01

    Placement of ultrasound-guided central lines is a critical skill for physicians in several specialties. Improving the quality of care delivered surrounding this procedure demands rigorous measurement of competency, and validated tools to assess performance are essential. Using the iterative, modified Delphi technique and experts in multiple disciplines across the United States, the study team created a 30-item checklist designed to assess competency in the placement of ultrasound-guided internal jugular central lines. Cronbach α was .94, indicating an excellent degree of internal consistency. Further validation of this checklist will require its implementation in simulated and clinical environments. © The Author(s) 2014.

  12. "INTRODUCING A FULL VALIDATED ANALYTICAL PROCEDURE AS AN OFFICIAL COMPENDIAL METHOD FOR FENTANYL TRANSDERMAL PATCHES"

    Directory of Open Access Journals (Sweden)

    Amir Mehdizadeh

    2005-04-01

    Full Text Available A simple, sensitive and specific HPLC method and also a simple and fast extraction procedure were developed for the quantitative analysis of fentanyl transdermal patches. Chloroform, methanol and ethanol were used as extracting solvents, with recoveries of 92.1, 94.3 and 99.4%, respectively. Fentanyl was extracted with ethanol and the eluted fentanyl through the C18 column was monitored by UV detection at 230 nm. The linearity was in the range of 0.5-10 µg/mL with a correlation coefficient (r²) of 0.9992. Both intra- and inter-day accuracy and precision were within acceptable limits. The detection limit (DL) and quantitation limit (QL) were 0.15 and 0.5 µg/mL, respectively. Other validation characteristics such as selectivity, robustness and ruggedness were evaluated. Following method validation, a system suitability test (SST) including capacity factor (k′), plate number (N), tailing factor (T), and RSD was defined for routine testing.

  13. Design of the measurements validation procedure and the expert system architecture for a cogeneration internal combustion engine

    International Nuclear Information System (INIS)

    Barelli, L.; Bidini, G.

    2005-01-01

    A research activity has been initiated to study the development of a diagnostic methodology for the optimization of energy efficiency and the maximization of operational time, based on artificial intelligence (AI) techniques such as artificial neural networks (ANN) and fuzzy logic. The diagnostic procedure, developed specifically for the cogeneration plant located at the Engineering Department of the University of Perugia, must be characterized by a modular architecture in order to obtain a flexible architecture applicable to different systems. The first part of the study deals with identifying the principal modules and the corresponding variables necessary to evaluate the module 'health state'. The consequent upgrade of the monitoring system is also described in this paper. Moreover, it describes the structure proposed for the diagnostic procedure, consisting of a procedure for measurement validation and a fuzzy logic-based inference system. The first reveals the presence of abnormal conditions and localizes their source, distinguishing between system failures and instrumentation malfunctions. The second provides an evaluation of the module health state and the classification of the failures which have possibly occurred. The procedure was implemented in C++
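    A fuzzy-logic inference step of the general kind described above can be sketched with triangular membership functions mapping a validated measurement residual to a module health state. The membership breakpoints, rule base and state labels are invented for illustration and are not those of the Perugia plant procedure.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def health_state(residual):
    """Map a normalized measurement residual to fuzzy health-state memberships."""
    # Fuzzy sets on the residual magnitude (illustrative breakpoints).
    small  = tri(residual, -0.1, 0.0, 0.3)
    medium = tri(residual,  0.1, 0.4, 0.7)
    large  = tri(residual,  0.5, 1.0, 1.5)

    # Simple rule base: small residual -> healthy, medium -> degraded, large -> faulty.
    memberships = {"healthy": small, "degraded": medium, "faulty": large}
    # Defuzzify by picking the state with maximum membership.
    state = max(memberships, key=memberships.get)
    return state, memberships

if __name__ == "__main__":
    for r in (0.05, 0.45, 1.1):
        state, mu = health_state(r)
        print(f"residual {r:.2f} -> {state}  {mu}")
```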

  14. Validation of multivariate classification methods using analytical fingerprints – concept and case study on organic feed for laying hens

    NARCIS (Netherlands)

    Alewijn, Martin; Voet, van der Hilko; Ruth, van Saskia

    2016-01-01

    Multivariate classification methods based on analytical fingerprints have found many applications in the food and feed area, but practical applications are still scarce due to a lack of a generally accepted validation procedure. This paper proposes a new approach for validation of this type of

  15. Validation of a new analytical procedure for determination of residual solvents in [{sup 18}F]FDG by gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D., E-mail: flaviabiomedica@yahoo.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (UPPR/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Unidade de Pesquisa e Produção de Radiofármacos

    2017-07-01

    Fludeoxyglucose F 18 ([18F]FDG) is the most used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [18F]FDG; however, they might not be completely removed during the purification steps. The determination of residual solvents in [18F]FDG is required by the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work, a simple and fast (4-minute) analytical procedure was developed and validated for the determination of residual solvents in [18F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53 mm x 30 m fused-silica column. Validation included the evaluation of various parameters, such as: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating that the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for the routine quality control of [18F]FDG. (author)

  16. Scope and limitations of due process in administrative proceedings

    Directory of Open Access Journals (Sweden)

    Bernardo Carvajal Sánchez

    2010-12-01

    Full Text Available In order to better explain the scope of Due Process in Administrative Law as a legal norm whose respect is essential to all government agencies, three points of view (formal, structural and material) are proposed. Those items seem useful to understand “Administrative Due Process” in all its dimensions: as a constitutional norm developed by the enactment of laws and decrees; as a principle inspiring some conducts and new norms; and as an objective and subjective fundamental right. On the other hand, it is shown that Administrative Due Process is not an absolute rule, because in some cases its full application is subject to normative relativism. Two opposite trends can be perceived at this point: in the first place, government agencies usually do not act the same way judges do, so Administrative Due Process should be distinguished from Judicial Due Process; therefore, it could actually have a more restricted scope. In the second place, some administrative authorities are nowadays playing a role more or less similar to what judges do. This means that new procedural guarantees will be claimed. In any case, admitting valid limitations to Administrative Due Process leads to the quest for the limits of these limitations. The application of the rule of Due Process cannot be totally suppressed; its scope cannot be completely reduced. This is the result of its fundamental nature as a legal norm that ensures justice and equity in all administrative procedures and proscribes arbitrary decisions.

  17. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)

  18. Combined CT- and fluoroscopy-guided nephrostomy in patients with non-obstructive uropathy due to urine leaks in cases of failed ultrasound-guided procedures

    International Nuclear Information System (INIS)

    Sommer, C.M.; Huber, J.; Radeleff, B.A.; Hosch, W.; Stampfl, U.; Loenard, B.M.; Hallscheidt, P.; Haferkamp, A.; Kauczor, H.U.; Richter, G.M.

    2011-01-01

    Aim: To report our experience with combined CT- and fluoroscopy-guided nephrostomy in patients with non-obstructive uropathy due to urine leaks in cases of failed ultrasound-guided procedures. Patients and methods: Eighteen patients (23 kidneys) with non-obstructive uropathy due to urine leaks underwent combined CT- and fluoroscopy-guided nephrostomy. All procedures were indicated as second-line interventions after failed ultrasound-guided nephrostomy. Thirteen males and five females with an age of 62.3 ± 8.7 (40–84) years were treated. Urine leaks developed in the majority after open surgery, e.g. postoperative insufficiency of ureteroneocystostomy (5 kidneys). The main reasons for failed ultrasound-guided nephrostomy included anatomic obstacles in the puncture tract (7 kidneys) and inability to identify pelvic structures (7 kidneys). CT-guided guidewire placement into the collecting system was followed by fluoroscopy-guided nephrostomy tube positioning. Procedural success rate, major and minor complication rates, CT views and needle passes, duration of the procedure and radiation dose were analyzed. Results: Procedural success was 91%. Major and minor complication rates were 9% (one septic shock and one perirenal abscess) and 9% (one perirenal haematoma and one urinoma), respectively. The 30-day mortality rate was 6%. The numbers of CT views and needle passes were 9.3 ± 6.1 and 3.6 ± 2.6, respectively. The duration of the complete procedure was 87 ± 32 min. The dose-length product and dose-area product were 1.8 ± 1.4 Gy·cm and 3.9 ± 4.3 Gy·cm², respectively. Conclusions: Combined CT- and fluoroscopy-guided nephrostomy in patients with non-obstructive uropathy due to urine leaks in cases of failed ultrasound-guided procedures was feasible, with high technical success and a tolerable complication rate.

  19. User's guide for signal validation software: Final report

    International Nuclear Information System (INIS)

    Swisher, V.I.

    1987-09-01

    Northeast Utilities has implemented a real-time signal validation program into the safety parameter display systems (SPDS) at Millstone Units 2 and 3. Signal validation has been incorporated to improve the reliability of the information being used in the SPDS. Signal validation uses Parity Space Vector Analysis to process SPDS sensor data. The Parity Space algorithm determines consistency among independent, redundant input measurements. This information is then used to calculate a validated estimate of that parameter. Additional logic is incorporated to compare partially redundant measurement data. In both plants the SPDS has been designed to monitor the status of critical safety functions (CSFs) and provide information that can be used with plant-specific emergency operating procedures (EOPs). However, the CSF logic, EOPs, and complement of plant sensors vary for these plants due to their different design characteristics (MP2 - 870 MWe Combustion Engineering PWR, MP3 - 1150 MWe Westinghouse PWR). These differences in plant design and information requirements result in a variety of signal validation applications.
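    The consistency-checking idea behind Parity Space Vector Analysis (agreement among redundant measurements followed by a validated estimate) can be illustrated with a simplified sketch. The agreement band, majority rule and example readings below are assumptions for illustration only; they are not the Millstone SPDS implementation.

```python
import numpy as np

def validate_redundant(measurements, uncertainties, k=3.0):
    """
    Simplified parity-space-style consistency check for redundant sensors.

    measurements  : readings of the same process variable from redundant sensors
    uncertainties : 1-sigma uncertainty of each sensor
    k             : consistency band in multiples of the combined uncertainty
    """
    m = np.asarray(measurements, dtype=float)
    s = np.asarray(uncertainties, dtype=float)
    n = len(m)

    # Pairwise parity check: count, for each sensor, how many other sensors agree
    # with it within k times the combined uncertainty.
    agreements = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and abs(m[i] - m[j]) <= k * np.hypot(s[i], s[j]):
                agreements[i] += 1

    # A sensor is considered valid if it agrees with at least half of the others.
    valid = agreements >= n // 2
    if not valid.any():
        valid = np.ones(n, dtype=bool)  # no consensus: fall back to all channels

    # Validated estimate: inverse-variance weighted mean of the consistent sensors.
    w = valid / s**2
    estimate = np.sum(w * m) / np.sum(w)
    return estimate, valid

if __name__ == "__main__":
    readings = [152.1, 151.8, 152.4, 139.0]   # fourth channel drifting (hypothetical)
    sigmas   = [0.5, 0.5, 0.5, 0.5]
    est, ok = validate_redundant(readings, sigmas)
    print(f"validated estimate = {est:.2f}, channel status = {ok}")
```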

  20. A study on a systematic approach of verification and validation of a computerized procedure system: ImPRO

    International Nuclear Information System (INIS)

    Qin, Wei; Seong, Poong Hyun

    2003-01-01

    Paper Based Procedures (PBP) and Computerized Procedure Systems (CPS) are studied to demonstrate that it is necessary to develop CPS in NPP I&C systems. A computerized procedure system is essentially a software system. All the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities can be categorized into product quality and process quality. In order to achieve product quality, the process quality of a software system should also be considered and achieved. Characteristics of CPS will be described to analyse the product and process of an example CPS: ImPRO. At the same time, several main product and process issues will be analysed from the Verification and Validation (V&V) point of view. It is concluded and suggested that V&V activities can also be regarded as part of the software development process; this point of view is then applied to the V&V activities of ImPRO as a systematic approach to the V&V of ImPRO. To support and realize this approach, suitable testing technologies and testing strategies are suggested.

  1. Pain-related impairment of daily activities after thoracic surgery: a questionnaire validation.

    Science.gov (United States)

    Ringsted, Thomas K; Wildgaard, Kim; Kreiner, Svend; Kehlet, Henrik

    2013-09-01

    Persistent postoperative pain is an acknowledged entity that reduces daily activities. Evaluation of the post-thoracotomy pain syndrome (PTPS) is often measured using traditional pain scales without in-depth questions on pain impairment. Thus, the purpose was to create a procedure-specific questionnaire for the assessment of functional impairment due to PTPS. Activities were obtained from the literature, supplemented by interviews with patients and surgeons. The questionnaire was validated using the Rasch model in order to describe an underlying pain impairment scale. Four of 17 questions were redundant. The remaining 13 questions, from low to intensive activity, described functional impairment following persistent pain from thoracotomy and video-assisted thoracic surgery (VATS). No evidence of differential item functioning for gender, age or differences between open surgery and VATS was found. A generalized log-linear Rasch model including local dependence was constructed. Though local dependence influenced reliability, the test-retest reliability estimated under the log-linear Rasch model was high (0.88-0.96). Correlation with items from the Disability of the Arm, Shoulder and Hand (quick) questionnaire supported validity (γ = 0.46). The impairment questionnaire measured two qualitatively different pain dimensions, although these were highly correlated (γ = 0.76). This study presents the method, results and validation of a new unidimensional scale measuring procedure-specific functional impairment due to PTPS following open surgery and VATS. Procedure-specific tools such as this could provide important outcome measures for future trials on persistent postsurgical pain states, allowing better assessment of interventions.

  2. A procedure for estimation of pipe break probabilities due to IGSCC

    International Nuclear Information System (INIS)

    Bergman, M.; Brickstad, B.; Nilsson, F.

    1998-06-01

    A procedure has been developed for estimation of the failure probability of weld joints in nuclear piping susceptible to intergranular stress corrosion cracking. The procedure aims at a robust and rapid estimate of the failure probability for a specific weld with a known stress state. The random properties of the crack initiation rate, the initial crack length, the in-service inspection efficiency and the leak rate are taken into account. A computer realization of the procedure has been developed for user-friendly applications by design engineers. Some examples are considered to investigate the sensitivity of the failure probability to different input quantities. (au)
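    The structure of such a probabilistic estimate can be illustrated with a greatly simplified Monte Carlo sketch in which initiation time, initial crack depth, growth rate and inspection outcome are sampled, and a failure is counted when the crack penetrates the wall. All distributions, parameter values and the through-wall failure criterion are illustrative assumptions, not those of the report.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000                 # Monte Carlo samples
service_time = 40.0         # years of operation considered
wall_thickness = 20.0       # mm, hypothetical pipe wall

# Random inputs (all distributions and parameters are illustrative assumptions).
t_init   = rng.exponential(scale=60.0, size=N)                  # crack initiation time [yr]
a0       = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=N)   # initial depth [mm]
growth   = rng.lognormal(mean=np.log(0.4), sigma=0.6, size=N)   # growth rate [mm/yr]
detected = rng.random(N) < 0.7                                  # 70 % inspection efficiency

# Crack depth at end of service for cracks that initiated and were not repaired.
growth_time = np.clip(service_time - t_init, 0.0, None)
depth = a0 + growth * growth_time
failed = (t_init < service_time) & (~detected) & (depth >= wall_thickness)

p_fail = failed.mean()
print(f"estimated failure probability over {service_time:.0f} yr: {p_fail:.2e}")
```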

  3. Validation of an efficiency calibration procedure for a coaxial n-type and a well-type HPGe detector used for the measurement of environmental radioactivity

    Energy Technology Data Exchange (ETDEWEB)

    Morera-Gómez, Yasser, E-mail: ymore24@gamail.com [Centro de Estudios Ambientales de Cienfuegos, AP 5. Ciudad Nuclear, CP 59350 Cienfuegos (Cuba); Departamento de Química y Edafología, Universidad de Navarra, Irunlarrea No 1, Pamplona 31009, Navarra (Spain); Cartas-Aguila, Héctor A.; Alonso-Hernández, Carlos M.; Nuñez-Duartes, Carlos [Centro de Estudios Ambientales de Cienfuegos, AP 5. Ciudad Nuclear, CP 59350 Cienfuegos (Cuba)

    2016-05-11

    To obtain reliable measurements of environmental radionuclide activity using HPGe (High Purity Germanium) detectors, knowledge of the absolute peak efficiency is required. This work presents a practical procedure for the efficiency calibration of a coaxial n-type and a well-type HPGe detector using experimental and Monte Carlo simulation methods. The method was performed in an energy range from 40 to 1460 keV and it can be used for both solid and liquid environmental samples. The calibration was initially verified by measuring several reference materials provided by the IAEA (International Atomic Energy Agency). Finally, through participation in two Proficiency Tests organized by the IAEA for the members of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity), the validity of the developed procedure was confirmed. The validation also showed that the measurement of ²²⁶Ra should be conducted using the coaxial n-type HPGe detector in order to minimize the true coincidence summing effect. - Highlights: • An efficiency calibration for a coaxial and a well-type HPGe detector was performed. • The calibration was made using experimental and Monte Carlo simulation methods. • The procedure was verified by measuring several reference materials provided by the IAEA. • Calibrations were validated through participation in 2 ALMERA Proficiency Tests.
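    A typical experimental step in such a calibration is fitting a full-energy-peak efficiency curve to points measured with a multi-gamma standard, often as a polynomial in log-log space. The energies and efficiencies below are invented for illustration; the detector-specific curves and the Monte Carlo part of the published procedure are not reproduced.

```python
import numpy as np

# Hypothetical full-energy-peak efficiencies measured with a multi-gamma standard.
energy_keV = np.array([59.5, 88.0, 122.1, 165.9, 391.7, 661.7, 898.0, 1173.2, 1332.5, 1460.8])
efficiency = np.array([0.062, 0.081, 0.085, 0.074, 0.041, 0.028, 0.022, 0.018, 0.016, 0.015])

# Fit ln(eff) as a polynomial in ln(E), a common empirical form for HPGe detectors.
log_e, log_eff = np.log(energy_keV), np.log(efficiency)
coeffs = np.polyfit(log_e, log_eff, deg=3)

def eff_curve(energy):
    """Interpolated full-energy-peak efficiency at an arbitrary energy (keV)."""
    return np.exp(np.polyval(coeffs, np.log(energy)))

# Example: efficiency at the 351.9 keV line of 214Pb (within the calibrated range).
print(f"efficiency at 351.9 keV ~ {eff_curve(351.9):.4f}")

# Quick check of the fit quality on the calibration points.
resid = (eff_curve(energy_keV) - efficiency) / efficiency
print(f"max relative residual: {100 * np.abs(resid).max():.1f}%")
```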

  4. Procedures For Microbial-Ecology Laboratory

    Science.gov (United States)

    Huff, Timothy L.

    1993-01-01

    Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in microbiological laboratory to ensure safety, analytical control, and validity of results.

  5. Virtual reality simulation training in a high-fidelity procedure suite

    DEFF Research Database (Denmark)

    Lönn, Lars; Edmond, John J; Marco, Jean

    2012-01-01

    To assess the face and content validity of a novel, full physics, full procedural, virtual reality simulation housed in a hybrid procedure suite.

  6. External Validation and Update of a Prediction Rule for the Duration of Sickness Absence Due to Common Mental Disorders

    NARCIS (Netherlands)

    Norder, Giny; Roelen, Corne A. M.; van der Klink, Jac J. L.; Bultmann, Ute; Sluiter, J. K.; Nieuwenhuijsen, K.

    Purpose The objective of the present study was to validate an existing prediction rule (including age, education, depressive/anxiety symptoms, and recovery expectations) for predictions of the duration of sickness absence due to common mental disorders (CMDs) and investigate the added value of

  7. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming the validation of analytical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: rpetroni@ipen.br, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
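
    For readers unfamiliar with the 2^k approach, the sketch below shows how main effects and two-factor interactions are computed from a 2^3 full factorial over the three variables named above (decay time, counting time, sample-to-detector distance); the coded design is standard, but the response values are invented for illustration only.

        import itertools
        import numpy as np

        # Coded levels (-1/+1) for decay time, counting time and detector distance;
        # the responses (e.g. measured mass fractions) are made up for the example.
        runs = np.array(list(itertools.product([-1, 1], repeat=3)))
        response = np.array([10.1, 10.4, 9.8, 10.0, 10.9, 11.3, 10.5, 10.8])

        labels = ["decay", "count", "dist"]
        # Main effect: mean response at the +1 level minus mean response at the -1 level
        for j, name in enumerate(labels):
            effect = response[runs[:, j] == 1].mean() - response[runs[:, j] == -1].mean()
            print(f"main effect {name:>5}: {effect:+.3f}")

        # Two-factor interactions: use the product of the coded columns as the contrast
        for j, k in itertools.combinations(range(3), 2):
            contrast = runs[:, j] * runs[:, k]
            effect = response[contrast == 1].mean() - response[contrast == -1].mean()
            print(f"interaction {labels[j]}x{labels[k]}: {effect:+.3f}")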

  8. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  9. Validation procedures of software applied in nuclear instruments. Proceedings of a technical meeting

    International Nuclear Information System (INIS)

    2007-09-01

    The IAEA has supported the availability of well functioning nuclear instruments in Member States for more than three decades. Some older or aged instruments are still being used and are still in good working condition. However, those instruments may not in all cases meet modern software requirements for the end-user. Therefore, Member States, mostly those with emerging economies, modernize/refurbish such instruments to meet the end-user demands. New advanced software is applied not only in new instrumentation, but often also for new and improved applications of modernized and/or refurbished instruments in many Member States, for which in a few cases the IAEA also provided support. Modern software applied in nuclear instrumentation plays a key role in safe operation and in the execution of commands in a user-friendly manner. Correct data handling and transfer has to be ensured. Additional features such as data visualization and interfacing to a PC for control and data storage are often included. To finalize the task, where new instrumentation which is not commercially available is used, or aged instruments are modernized/refurbished, the applied software has to be verified and validated. A Technical Meeting on 'Validation Procedures of Software Applied in Nuclear Instruments' was organized in Vienna, 20-23 November 2006, to discuss the verification and validation process of software applied to the operation and use of nuclear instruments. The presentations at the technical meeting included valuable information, which has been compiled and summarized in this publication and should be useful for technical staff in Member States when modernizing/refurbishing nuclear instruments. 22 experts in the field of modernization/refurbishment of nuclear instruments as well as users of applied software presented their latest results. Discussion sessions followed the presentations. This publication is the outcome of deliberations during the meeting

  10. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for the analysis of raw and upgraded pyrolysis bio-oils. These standard procedures have been validated and allow for reliable bio-oil analysis.

  11. Ultrasonic techniques validation on shell

    International Nuclear Information System (INIS)

    Navarro, J.; Gonzalez, E.

    1998-01-01

    Due to the results obtained in several international RRTs during the 1980s, it has been necessary to prove the effectiveness of the NDT techniques. For this reason it has been imperative to verify the soundness of the inspection procedure on different mock-ups, representative of the inspection area and containing real defects. Prior to the revision of the inspection procedure, and with the aim of updating the techniques used, it is good practice to perform different scans on the mock-ups until validation is achieved. It is at this point that all the parameters of the inspection at hand are defined (transducer, step, scan direction, ...) and, more importantly, that the technique to be used for the area requiring inspection is demonstrated to be suitable for evaluating the degradation phenomena that could appear. (Author)

  12. A Procedure for Identification of Appropriate State Space and ARIMA Models Based on Time-Series Cross-Validation

    Directory of Open Access Journals (Sweden)

    Patrícia Ramos

    2016-11-01

    Full Text Available In this work, a cross-validation procedure is used to identify an appropriate Autoregressive Integrated Moving Average model and an appropriate state space model for a time series. A minimum size for the training set is specified. The procedure is based on one-step forecasts and uses different training sets, each containing one more observation than the previous one. All possible state space models and all ARIMA models where the orders are allowed to range reasonably are fitted considering raw data and log-transformed data with regular differencing (up to second order differences and, if the time series is seasonal, seasonal differencing (up to first order differences. The value of root mean squared error for each model is calculated averaging the one-step forecasts obtained. The model which has the lowest root mean squared error value and passes the Ljung–Box test using all of the available data with a reasonable significance level is selected among all the ARIMA and state space models considered. The procedure is exemplified in this paper with a case study of retail sales of different categories of women’s footwear from a Portuguese retailer, and its accuracy is compared with three reliable forecasting approaches. The results show that our procedure consistently forecasts more accurately than the other approaches and the improvements in the accuracy are significant.
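
    A minimal sketch of the rolling one-step-ahead cross-validation described above, assuming a statsmodels ARIMA fit on a synthetic series; the candidate orders, the minimum training length and the data are illustrative, and the paper's full procedure additionally searches state space models, log transformation and differencing, and applies the Ljung-Box test.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        def one_step_cv_rmse(y, order, min_train=30):
            # Expanding-window cross-validation: refit on each training set
            # and score only the one-step-ahead forecast.
            errors = []
            for i in range(min_train, len(y)):
                fit = ARIMA(y[:i], order=order).fit()
                errors.append(y[i] - fit.forecast(steps=1)[0])
            return float(np.sqrt(np.mean(np.square(errors))))

        # Illustrative series (the paper uses monthly retail sales of women's footwear)
        rng = np.random.default_rng(1)
        y = np.cumsum(rng.normal(size=120)) + 50

        candidates = [(1, 1, 0), (0, 1, 1), (1, 1, 1)]
        scores = {order: one_step_cv_rmse(y, order) for order in candidates}
        print(min(scores, key=scores.get), scores)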

  13. Enhancing Title Ix Due Process Standards in Campus Sexual Assault Adjudication: Considering the Roles of Distributive, Procedural, and Restorative Justice

    Science.gov (United States)

    Harper, Shannon; Maskaly, Jon; Kirkner, Anne; Lorenz, Katherine

    2017-01-01

    Title IX prohibits sex discrimination--including sexual assault--in higher education. The Department of Education Office for Civil Rights' 2011 "Dear Colleague Letter" outlines recommendations for campus sexual assault adjudication allowing a variety of procedures that fail to protect accused students' due process rights and victims'…

  14. The Management Advisory Committee of the Inspection Validation Centre seventh report

    International Nuclear Information System (INIS)

    1990-07-01

    The Management Advisory Committee of the Inspection Validation Centre (IVC/MAC) was set up to review the policy, scope, procedure and operation of the Inspection Validation Centre (IVC), to supervise its operation and to advise and report to the United Kingdom Atomic Energy Authority (UKAEA) appropriately. The IVC was established at the UKAEA Risley Laboratory, to validate the procedures, personnel and equipment proposed by Nuclear Electric for use in the ultrasonic inspection at various stages of the fabrication, erection and operation of the Sizewell 'B' Pressurized Water Reactor (PWR) reactor pressure vessel (RPV) and such other components as are identified by the utility. It is operated by the UKAEA to work as an independent organisation under contract to Nuclear Electric, and results are reported to Nuclear Electric together with the conclusions of the Centre in relation to the validation of individual techniques. At the meetings of the IVC/MAC, the progress on the manufacture of the pressure vessel is also outlined by the PWR Project Director. The vessel has now undergone the final stress relief and post-hydro inspection and is due to be delivered to the Sizewell site before the end of 1990. (author)

  15. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    International Nuclear Information System (INIS)

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J.

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals and operating procedures are one of the most important factors in aviation and manufacturing industries. In the case of NPPs, the importance of procedures is even more salient than in other industries, because not only were over 50% of human errors due to procedures, but about 18% of accidents were also caused by failure to follow procedures. Thus, the provision of emergency operating procedures (EOPs) that are designed so that the possibility of human errors can be reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity measure (SC) is developed based on three sub-measures: the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance time) but also qualitative validations to clarify the physical meanings of the SC measure are performed.
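
    The report defines the SIC, SLC and SSC sub-measures through entropies of graph representations of a procedural step; the exact definitions, and the way they are combined into the SC score, are not reproduced in the abstract. The snippet below only illustrates the Shannon-entropy building block on an invented distribution of node classes.

        import math
        from collections import Counter

        def shannon_entropy(labels):
            # Shannon entropy (bits) of a distribution of classes - the basic
            # quantity behind entropy-based complexity measures.
            counts = Counter(labels)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        # Toy example: classes of nodes in a graph representing one procedural step
        # (actions, conditions, references) - purely illustrative.
        node_classes = ["action", "action", "condition", "action", "reference", "condition"]
        print(f"entropy = {shannon_entropy(node_classes):.3f} bits")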

  16. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals and operating procedures are one of the most important factors in aviation and manufacturing industries. In the case of NPPs, the importance of procedures is even more salient than in other industries, because not only were over 50% of human errors due to procedures, but about 18% of accidents were also caused by failure to follow procedures. Thus, the provision of emergency operating procedures (EOPs) that are designed so that the possibility of human errors can be reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity measure (SC) is developed based on three sub-measures: the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance time) but also qualitative validations to clarify the physical meanings of the SC measure are performed.

  17. Estimate of the Effective Dose Equivalent to the Cypriot Population due to Diagnostic Nuclear Medicine Procedures in the Public Sector

    Energy Technology Data Exchange (ETDEWEB)

    Christofides, S [Medical Physics Department, Nicosia General Hospital (Cyprus)

    1994-12-31

    The Effective Dose Equivalent (EDE) to the Cypriot population due to Diagnostic Nuclear Medicine procedures has been estimated from data published by the Government of Cyprus, in its Health and Hospital Statistics Series for the years 1990, 1991, and 1992. The average EDE per patient was estimated to be 3.09, 3.75 and 4.01 microsievert for 1990, 1991 and 1992 respectively, while the per caput EDE was estimated to be 11.75, 15.16 and 17.09 microsievert for 1990, 1991 and 1992 respectively, from the procedures in the public sector. (author). 11 refs, 4 tabs.

  18. Estimate of the Effective Dose Equivalent to the Cypriot Population due to Diagnostic Nuclear Medicine Procedures in the Public Sector

    International Nuclear Information System (INIS)

    Christofides, S.

    1994-01-01

    The Effective Dose Equivalent (EDE) to the Cypriot population due to Diagnostic Nuclear Medicine procedures has been estimated from data published by the Government of Cyprus, in its Health and Hospital Statistics Series for the years 1990, 1991, and 1992. The average EDE per patient was estimated to be 3.09, 3.75 and 4.01 microsievert for 1990, 1991 and 1992 respectively, while the per caput EDE was estimated to be 11.75, 15.16 and 17.09 microsievert for 1990, 1991 and 1992 respectively, from the procedures in the public sector. (author)

  19. Validity in assessment of prior learning

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    2015-01-01

    , the article discusses the need for specific criteria for assessment. The reliability and validity of the assessment procedures depend on whether the competences are well-defined, and whether the teachers are adequately trained for the assessment procedures. Keywords: assessment, prior learning, adult education, vocational training, lifelong learning, validity

  20. Deference and Due Process

    OpenAIRE

    Vermeule, Cornelius Adrian

    2015-01-01

    In the textbooks, procedural due process is a strictly judicial enterprise; although substantive entitlements are created by legislative and executive action, it is for courts to decide independently what process the Constitution requires. The notion that procedural due process might be committed primarily to the discretion of the agencies themselves is almost entirely absent from the academic literature. The facts on the ground are very different. Thanks to converging strands of caselaw ...

  1. Development of structural design procedure of plate-fin heat exchanger for HTGR

    Energy Technology Data Exchange (ETDEWEB)

    Mizokami, Yorikata, E-mail: yorikata_mizokami@mhi.co.jp [Mitsubishi Heavy Industries, Ltd., 1-1, Wadasaki-cho 1-Chome, Hyogo-ku, Kobe 652-8585 (Japan); Igari, Toshihide [Mitsubishi Heavy Industries, Ltd., 5-717-1, Fukahori-machi, Nagasaki 851-0392 (Japan); Kawashima, Fumiko [Kumamoto University, 39-1 Kurokami 2-Chome, Kumamoto 860-8555 (Japan); Sakakibara, Noriyuki [Mitsubishi Heavy Industries, Ltd., 5-717-1, Fukahori-machi, Nagasaki 851-0392 (Japan); Tanihira, Masanori [Mitsubishi Heavy Industries, Ltd., 16-5, Konan 2-Chome, Minato-ku, Tokyo 108-8215 (Japan); Yuhara, Tetsuo [The University of Tokyo, 7-3-1, Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Hiroe, Tetsuyuki [Kumamoto University, 39-1 Kurokami 2-Chome, Kumamoto 860-8555 (Japan)

    2013-02-15

    Highlights: ► We propose a high temperature structural design procedure for the plate-fin heat exchanger. ► Allowable stresses for brazed structures are newly discussed. ► The validity of the design procedure is confirmed by carrying out partial model tests. ► The proposed design procedure is applied to heat exchangers for HTGR. -- Abstract: The highly efficient plate-fin heat exchanger has recently attracted attention for application to HTGR. Since this heat exchanger is fabricated by brazing many plates and fins, a new procedure for the structural design of brazed structures in the HTGR temperature region up to 950 °C is required. Firstly, this paper experimentally examines the influences on material strength of both thermal aging during the brazing process and the helium gas environment, and clarifies the failure mode and failure limit of brazed side-bar structures. Secondly, allowable stresses for aged materials and brazed structures were newly determined on the basis of the experimental results. For the purpose of validating the structural design procedure, including homogenization FEM modeling, a pressure burst test and a thermal fatigue test of a partial model of the plate-fin heat exchanger were carried out. Finally, the results of a reference design of plate-fin heat exchangers for the recuperator and intermediate heat exchanger of an HTGR plant were evaluated against the proposed design criteria.

  2. Validity of anthropometric procedures to estimate body density and body fat percent in military men

    Directory of Open Access Journals (Sweden)

    Ciro Romélio Rodriguez-Añez

    1999-12-01

    Full Text Available The objective of this study was to verify the validity of the Katch and McArdle equation (1973), which uses the circumferences of the arm, forearm and abdomen to estimate body density, and of the procedure of Cohen (1986), which uses the circumferences of the neck and abdomen to estimate body fat percentage (%F), in military men. Data were collected from 50 military men with a mean age of 20.26 ± 2.04 years serving in Santa Maria, RS. The circumferences were measured according to the procedures of Katch and McArdle (1973) and Cohen (1986). The measured body density (Dm) obtained by underwater weighing was used as the criterion; its mean value was 1.0706 ± 0.0100 g/ml. The residual lung volume was estimated using the Goldman and Becklake equation (1959). The %F derived from Dm was obtained with the Siri equation (1961) and its mean value was 12.70 ± 4.71%. The validation criterion suggested by Lohman (1992) was followed. The analysis of the results indicated that the procedure developed by Cohen (1986) has concurrent validity to estimate %F in military men, or in other samples with similar characteristics, with a standard error of estimate of 3.45%.

  3. Hanford Environmental Restoration data validation process for chemical and radiochemical analyses

    International Nuclear Information System (INIS)

    Adams, M.R.; Bechtold, R.A.; Clark, D.E.; Angelos, K.M.; Winter, S.M.

    1993-10-01

    Detailed procedures for the validation of chemical and radiochemical data are used to assure consistent application of validation principles and to support a uniform database of quality environmental data. During application of these procedures, it was determined that laboratory data packages were frequently missing certain types of documentation, causing subsequent delays in meeting critical milestones in the completion of validation activities. A quality improvement team was assembled to address the problems caused by missing documentation and to streamline the entire process. The result was the development of a separate data package verification procedure and revisions to the data validation procedures. This has produced a system whereby deficient data packages are immediately identified and corrected prior to validation, and revised validation procedures that more closely match the common analytical reporting practices of laboratory service vendors

  4. Constitutional Due Process and Educational Administration.

    Science.gov (United States)

    Uerling, Donald F.

    1985-01-01

    Discusses substantive and procedural due process as required by the United States Constitution and interpreted by the Supreme Court, with particular reference to situations arising in educational environments. Covers interests protected by due process requirements, the procedures required, and some special considerations that may apply. (PGD)

  5. Modified Harrington Procedure for Acetabular Insufficiency Due to Metastatic Malignant Disease

    Directory of Open Access Journals (Sweden)

    WI Faisham

    2009-05-01

    Full Text Available Extensive peri-acetabular osteolysis caused by a malignant disease process is a major surgical challenge, as conventional hip arthroplasty is not adequate. We describe a modified use of the Harrington procedure for acetabular insufficiency secondary to metastatic disease in twelve patients. The procedure includes the application of multiple threaded pins to bridge the acetabular columns, an anti-protrusio cage and a cemented acetabular cup. Eleven patients were able to walk pain free and achieved a mean Musculoskeletal Tumour Society Functional Score of 80 (range, 68 to 86).

  6. Content validation applied to job simulation and written examinations

    International Nuclear Information System (INIS)

    Saari, L.M.; McCutchen, M.A.; White, A.S.; Huenefeld, J.C.

    1984-08-01

    The application of content validation strategies in work settings has become increasingly popular over the last few years, perhaps spurred by an acknowledgment in the courts of content validation as a method for validating employee selection procedures (e.g., Bridgeport Guardians v. Bridgeport Police Dept., 1977). Since criterion-related validation is often difficult to conduct, content validation methods should be investigated as an alternative for establishing job-related selection procedures. However, there is not yet consensus among scientists and professionals concerning how content validation should be conducted. This may be because there is a lack of clear-cut operations for conducting content validation for different types of selection procedures. The purpose of this paper is to discuss two content validation approaches being used for the development of a licensing examination that involves a job simulation exam and a written exam. These represent variations in methods for applying content validation. 12 references

  7. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification.

    Science.gov (United States)

    Baczyńska, Anna K; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed.

  8. Acute cholangitis due to afferent loop syndrome after a Whipple procedure: a case report.

    Science.gov (United States)

    Spiliotis, John; Karnabatidis, Demetrios; Vaxevanidou, Archodoula; Datsis, Anastasios C; Rogdakis, Athanasios; Zacharis, Georgios; Siamblis, Demetrios

    2009-08-25

    Patients with gastric resection, especially with Billroth II reconstruction (gastrojejunal anastomosis), are more likely to develop afferent loop syndrome, which is a rare complication. When the afferent limb is obstructed, biliary and pancreatic secretions accumulate and cause distension of this limb. In the case of complete obstruction (rare), there is a high risk of developing necrosis and perforation. This complication has been reported once in the literature. A 54-year-old Greek male had undergone a pancreato-duodenectomy (Whipple procedure) one year earlier due to a pancreatic adenocarcinoma. Approximately 10 months after the initial operation, the patient started having episodes of cholangitis (fever, jaundice) and abdominal pain. This condition progressively worsened and the suspicion of local recurrence or stenosis of the biliary-jejunal anastomosis was discussed. A few days before his admission the patient developed signs of septic cholangitis. Our case demonstrates a rare complication with serious clinical manifestations of the afferent loop syndrome. This advanced form of afferent loop syndrome led to the development of massive enterobiliary reflux, which manifested clinically as cholangitis and systemic sepsis due to the bacterial overgrowth usually present in the afferent loop. The diagnosis is difficult, and interventional radiology provides the details needed to support therapeutic decision making. A variety of factors can contribute to its development, including adhesions, kinking and angulation of the loop, stenosis of the gastro-jejunal anastomosis and internal herniation. In order to decompress the afferent loop dilatation due to adhesions, a lateral-lateral jejunal anastomosis was performed between the afferent loop and a small bowel loop.

  9. A procedure to identify and to assess risk parameters in a SCR (Steel Catenary Riser) due to the fatigue failure

    Energy Technology Data Exchange (ETDEWEB)

    Stefane, Wania [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Faculdade de Engenharia Mecanica; Morooka, Celso K. [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Dept. de Engenharia de Petroleo. Centro de Estudos de Petroleo; Pezzi Filho, Mario [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). E and P. ENGP/IPMI/ES; Matt, Cyntia G.C.; Franciss, Ricardo [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas (CENPES)

    2009-12-19

    The discovery of offshore fields in ultra deep water and the presence of reservoirs located at great depths below the seabed require innovative solutions for offshore oil production systems. Many riser configurations have emerged as economically viable technological solutions for these scenarios. Therefore the study and development of methodologies applied to riser design, and of procedures to calculate and dimension production risers taking into account the effects of metocean conditions such as waves, current and platform motion on fatigue failure, is fundamental. The random nature of these conditions, as well as of the mechanical characteristics of the riser components, calls for a probabilistic treatment to ensure the greatest reliability for risers and minimum risks associated with different aspects of the operation, such as the safety of the installation, economical concerns and the environment. The current work presents a procedure for the identification and assessment of the main risk parameters when considering fatigue failure. The static and dynamic behavior of a Steel Catenary Riser (SCR) under the effects of metocean conditions, and the uncertainties related to total cumulative damage (Miner-Palmgren's rule), are taken into account. The methodology adopted is probabilistic and the approach is analytical. The procedure is based on the First Order Reliability Method (FORM), which usually presents low computational effort and acceptable accuracy. The suggested procedure is applied to two practical cases, one using data available from the literature and the second with data collected from an actual Brazilian offshore field operation. For both cases, results of the probability of failure due to fatigue were obtained for different locations along the length of an SCR connected to a semi-submersible platform. From these results, the sensitivity of the probability of failure due to fatigue for an SCR could be verified, and the most effective parameter could also be
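
    As background for readers unfamiliar with FORM, the sketch below implements the standard Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space for an invented linear limit state. The limit-state function, its coefficients and the starting point are placeholders and are not related to the SCR fatigue formulation of the paper.

        import numpy as np
        from scipy.stats import norm

        def form_beta(g, u0, h=1e-6, iters=100, tol=1e-8):
            # Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space;
            # g(u) <= 0 denotes failure, and the reliability index is ||u*||.
            u = np.asarray(u0, dtype=float)
            for _ in range(iters):
                gu = g(u)
                grad = np.array([(g(u + h * e) - gu) / h for e in np.eye(len(u))])
                u_new = ((grad @ u - gu) / (grad @ grad)) * grad
                if np.linalg.norm(u_new - u) < tol:
                    u = u_new
                    break
                u = u_new
            return np.linalg.norm(u)

        # Illustrative limit state: capacity minus demand with placeholder coefficients
        g = lambda u: 3.0 - (0.8 * u[0] + 0.6 * u[1])
        beta = form_beta(g, u0=[0.0, 0.0])
        print(f"beta = {beta:.3f}, Pf = {norm.cdf(-beta):.2e}")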

  10. Multielemental analysis of 18 essential and toxic elements in amniotic fluid samples by ICP-MS: Full procedure validation and estimation of measurement uncertainty.

    Science.gov (United States)

    Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D

    2017-11-01

    Amniotic fluid is a substantial factor in the development of the embryo and fetus, because the water and solutes contained in it penetrate the fetal membranes hydrostatically and osmotically and are also swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; an analysis of amniotic fluid is therefore important, because the results would indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for the determination of trace and ultra-trace level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace and ultra-trace level analysis, detailed characteristics of the analytical procedure as well as the properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for the multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used in the experiment to eliminate spectral interferences. Detailed validation was conducted using three certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. The uncertainty of the results was also evaluated using a single-laboratory validation approach. The obtained expanded uncertainty (U) results for the CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. Standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty
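
    The single-laboratory validation approach mentioned above combines the individual uncertainty contributions into a combined and then an expanded uncertainty. The sketch below shows that arithmetic on an invented uncertainty budget; the contribution values, the analyte concentration and the unit are placeholders, not results from the paper.

        import math

        # Illustrative relative standard uncertainties for one analyte (placeholders)
        u_precision = 0.045
        u_trueness = 0.030
        u_calibration = 0.020

        u_combined = math.sqrt(u_precision**2 + u_trueness**2 + u_calibration**2)
        U_expanded = 2 * u_combined        # coverage factor k = 2 (~95 % coverage)

        concentration = 1.25               # e.g. an assumed result in µg/L
        print(f"result: {concentration:.2f} ± {U_expanded * concentration:.2f} µg/L (k=2)")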

  11. Due diligence

    International Nuclear Information System (INIS)

    Sanghera, G.S.

    1999-01-01

    The Occupational Health and Safety (OHS) Act requires that every employer shall ensure the health and safety of workers in the workplace. Issues regarding practices at workplaces and how they should reflect the standards of due diligence were discussed. Due diligence was described as the need for employers to identify hazards in the workplace and to take active steps to protect workers from potentially dangerous incidents. The paper discussed various aspects of due diligence including policy, training, procedures, measurement and enforcement. The consequences of contravening the OHS Act were also described.

  12. Validation of natural language processing to extract breast cancer pathology procedures and results

    Directory of Open Access Journals (Sweden)

    Arika E Wieneke

    2015-01-01

    Full Text Available Background: Pathology reports typically require manual review to abstract research data. We developed a natural language processing (NLP) system to automatically interpret free-text breast pathology reports with limited assistance from manual abstraction. Methods: We used an iterative approach of machine learning algorithms and constructed groups of related findings to identify breast-related procedures and results from free-text pathology reports. We evaluated the NLP system using an all-or-nothing approach to determine which reports could be processed entirely using NLP and which reports needed manual review beyond NLP. We divided 3234 reports into development (2910, 90%) and evaluation (324, 10%) sets, using manually reviewed pathology data as our gold standard. Results: NLP correctly coded 12.7% of the evaluation set, flagged 49.1% of reports for manual review, incorrectly coded 30.8%, and correctly omitted 7.4% from the evaluation set due to irrelevancy (i.e. not breast-related). Common procedures and results were identified correctly (e.g. invasive ductal, with 95.5% precision and 94.0% sensitivity), but entire reports were flagged for manual review because of rare findings and substantial variation in pathology report text. Conclusions: The NLP system we developed did not perform sufficiently well for abstracting entire breast pathology reports. The all-or-nothing approach resulted in too broad a scope of work and limited our flexibility to identify breast pathology procedures and results. Our NLP system was also limited by the lack of gold standard data on rare findings and the wide variation in pathology text. Focusing on individual, common elements and improving pathology text report standardization may improve performance.

  13. 21 CFR 60.44 - Hearing procedures.

    Science.gov (United States)

    2010-04-01

    ... RESTORATION Due Diligence Hearings § 60.44 Hearing procedures. The due diligence hearing shall be conducted in accordance with this part, supplemented by the nonconflicting procedures in part 16. During the due diligence... requesting a hearing under part 16. The standard of due diligence set forth in § 60.36 will apply in the due...

  14. Reliable and valid assessment of Lichtenstein hernia repair skills

    DEFF Research Database (Denmark)

    Carlsen, C G; Lindorff Larsen, Karen; Funch-Jensen, P

    2014-01-01

    PURPOSE: Lichtenstein hernia repair is a common surgical procedure and one of the first procedures performed by a surgical trainee. However, formal assessment tools developed for this procedure are few and sparsely validated. The aim of this study was to determine the reliability and validity of an assessment tool designed to measure surgical skills in Lichtenstein hernia repair. METHODS: Key issues were identified through a focus group interview. On this basis, an assessment tool with eight items was designed. Ten surgeons and surgical trainees were video recorded while performing Lichtenstein hernia repair. ... The results showed a significant difference between the three groups, which indicates construct validity. ... Lichtenstein hernia repair skills can be assessed blindly by a single rater in a reliable and valid fashion with the new procedure-specific assessment tool. We recommend this tool for future assessment...

  15. CVThresh: R Package for Level-Dependent Cross-Validation Thresholding

    Directory of Open Access Journals (Sweden)

    Donghoh Kim

    2006-04-01

    Full Text Available The core of the wavelet approach to nonparametric regression is the thresholding of wavelet coefficients. This paper reviews a cross-validation method for the selection of the thresholding value in wavelet shrinkage of Oh, Kim, and Lee (2006), and introduces the R package CVThresh implementing details of the calculations for the procedures. The procedure is implemented by coupling a conventional cross-validation with a fast imputation method, so that it overcomes the usual limitation on data length (a power of 2). It can easily be applied to the classical leave-one-out cross-validation and K-fold cross-validation. Since the procedure is computationally fast, a level-dependent cross-validation can be developed for wavelet shrinkage of data with varying sparseness across levels.
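
    The CVThresh package itself is written in R; as a language-neutral illustration of the idea (impute held-out points, denoise, compare at the held-out positions), the following Python sketch selects a single global soft threshold by K-fold cross-validation using PyWavelets. The wavelet, fold scheme, threshold grid and test signal are all assumptions, and a level-dependent version would repeat the search per decomposition level.

        import numpy as np
        import pywt

        def denoise(y, threshold, wavelet="db4", level=4):
            coeffs = pywt.wavedec(y, wavelet, level=level)
            coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(y)]

        def cv_score(y, threshold, k=4):
            # K-fold CV: held-out points are imputed by neighbour averaging before
            # denoising, then compared with their observed values.
            n, err = len(y), 0.0
            for fold in range(k):
                idx = np.arange(fold, n, k)[1:-1]      # keep away from the end points
                y_imp = y.copy()
                y_imp[idx] = 0.5 * (y[idx - 1] + y[idx + 1])
                rec = denoise(y_imp, threshold)
                err += np.mean((rec[idx] - y[idx]) ** 2)
            return err / k

        rng = np.random.default_rng(2)
        x = np.linspace(0, 1, 256)
        y = np.sin(4 * np.pi * x) + 0.3 * rng.normal(size=x.size)

        grid = np.linspace(0.05, 1.0, 20)
        best = grid[np.argmin([cv_score(y, t) for t in grid])]
        print(f"selected threshold: {best:.2f}")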

  16. CVThresh: R Package for Level-Dependent Cross-Validation Thresholding

    Directory of Open Access Journals (Sweden)

    Donghoh Kim

    2006-04-01

    Full Text Available The core of the wavelet approach to nonparametric regression is the thresholding of wavelet coefficients. This paper reviews a cross-validation method for the selection of the thresholding value in wavelet shrinkage of Oh, Kim, and Lee (2006), and introduces the R package CVThresh implementing details of the calculations for the procedures. The procedure is implemented by coupling a conventional cross-validation with a fast imputation method, so that it overcomes the usual limitation on data length (a power of 2). It can easily be applied to the classical leave-one-out cross-validation and K-fold cross-validation. Since the procedure is computationally fast, a level-dependent cross-validation can be developed for wavelet shrinkage of data with varying sparseness across levels.

  17. Radiation risk of diagnostical procedures

    International Nuclear Information System (INIS)

    Pohlit, W.

    1986-01-01

    The environmental radiation burden of man in Germany is about 1 mGy (milligray) per year. This is, of course, also valid for children. Due to diagnostic procedures this burden is increased to about 1.3 mGy. The question arises whether this can be neglected, or whether important consequences have to be drawn. To give a clear answer, the action of ionizing radiation in living cells and in organisms is explained in detail. Much of the radiation damage to the DNA can soon be repaired by the cell if the radiation dose was small. Some damage, however, will remain irreparable for the cell and consequently leads to cell death, to mutations or to cell transformation. The number of these lesions increases or decreases linearly with radiation dose. Therefore, it must be expected that the risk of tumour induction is increased above the normal background even by the smallest doses. This small but not negligible risk has to be compared with other risks of civilization or with other medical risks. But the benefit and the efficacy of diagnostic procedures also have to be considered. (orig./HSCH)

  18. The validation of an infrared simulation system

    CSIR Research Space (South Africa)

    De Waal, A

    2013-08-01

    Full Text Available theoretical validation framework. This paper briefly describes the procedure used to validate software models in an infrared system simulation, and provides application examples of this process. The discussion includes practical validation techniques...

  19. Estimating the operator's performance time of emergency procedural tasks based on a task complexity measure

    International Nuclear Information System (INIS)

    Jung, Won Dae; Park, Jink Yun

    2012-01-01

    It is important to understand the amount of time required to execute an emergency procedural task in a high-stress situation for managing human performance under emergencies in a nuclear power plant. However, due to the lack of actual data, estimates of the time needed to execute an emergency procedural task depend heavily upon expert judgment. This paper proposes an analytical method to estimate the operator's performance time (OPT) for a procedural task, based on a measure of the task complexity (TACOM). The proposed method for estimating an OPT is an equation that uses the TACOM score as a variable, so the OPT of a procedural task can be calculated if its relevant TACOM score is available. The validity of the proposed equation is demonstrated by comparing the estimated OPTs with the observed OPTs for emergency procedural tasks in a steam generator tube rupture scenario.
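
    The actual OPT equation is not given in the abstract. As a hedged illustration of the general idea, the sketch below fits a simple log-linear relation between invented TACOM scores and invented observed performance times and uses it to predict an OPT; the numbers and the functional form are assumptions for demonstration only.

        import numpy as np

        # Hypothetical calibration data: TACOM scores and observed task times [s]
        tacom = np.array([2.1, 2.8, 3.2, 3.9, 4.5, 5.1])
        time_s = np.array([35.0, 60.0, 80.0, 140.0, 210.0, 320.0])

        # Fit ln(OPT) = a * TACOM + b, a simple assumed complexity-time relation
        a, b = np.polyfit(tacom, np.log(time_s), deg=1)

        def estimate_opt(score):
            return float(np.exp(a * score + b))

        print(f"estimated OPT for TACOM = 4.0: {estimate_opt(4.0):.0f} s")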

  20. Validation of modelling the radiation exposure due to solar particle events at aircraft altitudes

    International Nuclear Information System (INIS)

    Beck, P.; Bartlett, D. T.; Bilski, P.; Dyer, C.; Flueckiger, E.; Fuller, N.; Lantos, P.; Reitz, G.; Ruehm, W.; Spurny, F.; Taylor, G.; Trompier, F.; Wissmann, F.

    2008-01-01

    Dose assessment procedures for cosmic radiation exposure of aircraft crew have been introduced in most European countries in accordance with the corresponding European directive and national regulations. However, the radiation exposure due to solar particle events is still a matter of scientific research. Here we describe the European research project CONRAD, WP6, Subgroup-B, about the current status of available solar storm measurements and existing models for dose estimation at flight altitudes during solar particle events leading to ground level enhancement (GLE). Three models for the numerical dose estimation during GLEs are discussed. Some of the models agree with limited experimental data reasonably well. Analysis of GLEs during geomagnetically disturbed conditions is still complex and time consuming. Currently available solar particle event models can disagree with each other by an order of magnitude. Further research and verification by on-board measurements is still needed. (authors)

  1. Human factoring administrative procedures

    International Nuclear Information System (INIS)

    Grider, D.A.; Sturdivant, M.H.

    1991-01-01

    In nonnuclear business, administrative procedures bring to mind such mundane topics as filing correspondence and scheduling vacation time. In the nuclear industry, on the other hand, administrative procedures play a vital role in assuring the safe operation of a facility. For some time now, industry focus has been on improving technical procedures. Significant efforts are under way to produce better technical procedures. Producing a technical procedure requires that a validated technical, regulatory, and administrative basis be developed and that the technical process be established for each procedure. Producing usable technical procedures requires that procedure presentation be engineered to the same human factors principles used in control room design. The vital safety role of administrative procedures requires that they be just as sound, just as rigorously formulated, and as well documented as technical procedures. Procedure programs at the Tennessee Valley Authority and at Boston Edison's Pilgrim Station demonstrate that human factors engineering techniques can be applied effectively to technical procedures. With a few modifications, those same techniques can be used to produce more effective administrative procedures. Efforts are under way at the US Department of Energy Nuclear Weapons Complex and at some utilities (Boston Edison, for instance) to apply human factors engineering to administrative procedures. The techniques being adapted include the following

  2. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    International Nuclear Information System (INIS)

    Weathers, J.B.; Luck, R.; Weathers, J.W.

    2009-01-01

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
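
    The paper's derivation is not reproduced here; the sketch below only illustrates the Monte Carlo step in general terms: sample comparison errors from assumed random and shared systematic contributions, estimate their covariance matrix, and size the 95% constant-probability contour from a chi-square quantile. The uncertainty magnitudes and the two-output setup are invented for the example.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(3)
        n_mc, n_out = 20_000, 2          # Monte Carlo samples, two quantities of interest

        # Assumed uncertainty model: independent random errors per output plus a
        # shared systematic (bias) error that correlates the outputs.
        random_sd = np.array([0.8, 1.2])
        systematic_sd = 0.9
        e = rng.normal(0.0, random_sd, size=(n_mc, n_out)) \
            + rng.normal(0.0, systematic_sd, size=(n_mc, 1))

        cov = np.cov(e, rowvar=False)    # covariance matrix of the comparison error
        print("covariance matrix:\n", cov)

        # 95% constant-probability contour of a bivariate normal: points d satisfying
        # d^T cov^{-1} d <= chi2_{0.95,2}; its semi-axes follow from the eigenvalues.
        radius2 = chi2.ppf(0.95, df=n_out)
        eigvals, _ = np.linalg.eigh(cov)
        print("95% contour semi-axes:", np.sqrt(eigvals * radius2))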

  3. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com

    2009-11-15

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.

  4. INITIATION AND CONDUCT OF ADMINISTRATIVE PROCEDURE

    Directory of Open Access Journals (Sweden)

    Milan Stipic

    2013-12-01

    Full Text Available The General Administrative Procedure Act contains legal norms that apply to all identical cases. In addition to the general procedure, there are special administrative procedures, customized to specific administrative areas. The initiation of a procedure is regulated. An administrative procedure can be initiated at the request of the proponent or ex officio. When the official determines that the conditions for the conduct of an administrative procedure are met, all the facts and circumstances relevant to the resolution of the administrative matter have to be identified before a decision is made. When the legal requirements for the initiation of a procedure are not met, the official shall make a decision rejecting the application of the party. The procedure is initiated ex officio when stipulated by law or when the protection of the public interest requires it. When initiating a procedure ex officio, the public authority shall take into consideration the petition or other information that indicates the need to protect the public interest. In such cases the applicant is not a party, and the official is obliged to notify the applicant if the procedure is not initiated ex officio. Based on the notification, the applicant has a right to complain, including in the situation when there is no response within the prescribed period of 30 days. A public authority may, but is not obliged to, initiate an administrative procedure by public announcement only in a situation where the parties are unknown, while it is obliged to initiate the procedure by public announcement when this method of initiation is prescribed by law. Initiation of a procedure by public announcement occurs in rare cases. Due to the principles of efficiency and cost-effectiveness, two or more administrative procedures can be merged into one procedure by a conclusion. The condition for this is that the rights or obligations of the parties are based on the same legal basis and on the same or

  5. Study of verification and validation of standard welding procedure specifications guidelines for API 5L X-70 grade line pipe welding

    Directory of Open Access Journals (Sweden)

    Qazi H. A. A.

    2017-12-01

    Full Text Available Verification and validation of welding procedure specifications for X-70 grade line pipe welding was performed as per clause 8.2, Annexure B and D of API 5L, 45th Edition, to check weld integrity under its future application conditions. Hot rolled coils were imported from China; de-coiling, strip edge milling, three-roller bending to form the pipe, inside and outside submerged arc welding of the pipe, online ultrasonic testing of the weld, HAZ and pipe body, cutting at a fixed random pipe length, visual inspection of the pipe, fluoroscopic inspection of the pipe, marking of welding procedure qualification test pieces at the weld portion of the pipe, tensile testing, guided bend testing and CVN impact testing were performed. A detailed study was conducted to explore possible explanations for variation in mechanical properties; the WPS was examined and qualified as per API 5L, 45th Edition.

  6. Gastric emptying of liquid meals: validation of the gamma camera technique

    Energy Technology Data Exchange (ETDEWEB)

    Lawaetz, Otto; Dige-Petersen, Harriet

    1989-05-01

    To assess the extent of errors and to provide correction factors for gamma camera gastric emptying studies of liquid meals labelled with radionuclides (99mTc or 113mIn), phantom studies were performed with different gastric emptying procedures, gamma cameras and data handling systems. To validate the overall accuracy of the method, 24 combined aspiration and gamma camera gastric emptying studies were carried out in three normal volunteers. Gastric meal volume was underestimated due to scattered radiation from the stomach. The underestimation was 7-20%, varying with the size of the gastric region of interest (ROI), the energy of the nuclide and the fraction of meal in the stomach. The overestimation, due to scattered radiation from the gut, was negligible (1-3%) for any of the procedures. The gamma camera technique eliminated much of the error due to variations of stomach geometry and produced accurate quantitative gastric emptying data comparable to those obtained by evacuation (P > 0.10), when the entire-field maximum 1-min count achieved within the first 20 min of a study was taken as representing the original volume of the meal ingested, and when corrections for area-related errors due to scattered radiation from the stomach were performed. (author).

  7. Validation of calibration procedures for freeform parts on CMMs

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo

    2003-01-01

    The paper describes the validation of a new method for establishment of traceability of freeform measurements on coordinate measuring machines currently being considered for development as a new ISO standard. The method deals with calibration by: i) repeated measurements of a given uncalibrated...

  8. Site characterization and validation - validation drift fracture data, stage 4

    International Nuclear Information System (INIS)

    Bursey, G.; Gale, J.; MacLeod, R.; Straahle, A.; Tiren, S.

    1991-08-01

    This report describes the mapping procedures and the data collected during fracture mapping in the validation drift. Fracture characteristics examined include orientation, trace length, termination mode, and fracture minerals. These data have been compared and analysed together with fracture data from the D-boreholes to determine the adequacy of the borehole mapping procedures and to assess the nature and degree of orientation bias in the borehole data. The analysis of the validation drift data also includes a series of corrections to account for orientation, truncation, and censoring biases. This analysis has identified at least 4 geologically significant fracture sets in the rock mass defined by the validation drift. An analysis of the fracture orientations in both the good rock and the H-zone has defined groups of 7 clusters and 4 clusters, respectively. Subsequent analysis of the fracture patterns in five consecutive sections along the validation drift further identified heterogeneity through the rock mass, with respect to fracture orientations. These results are in stark contrast to the results from the D-borehole analysis, where a strong orientation bias resulted in a consistent pattern of measured fracture orientations through the rock. In the validation drift, fractures in the good rock also display a greater mean variance in length than those in the H-zone. These results provide strong support for a distinction being made between fractures in the good rock and the H-zone, and possibly between different areas of the good rock itself, for discrete modelling purposes. (au) (20 refs.)

  9. Model checking as an aid to procedure design

    International Nuclear Information System (INIS)

    Zhang, Wenhu

    2001-01-01

    The OECD Halden Reactor Project has been actively working on computer assisted operating procedures for many years. The objective of the research has been to provide computerised assistance for procedure design, verification and validation, implementation and maintenance. For the verification purpose, the application of formal methods has been considered in several reports. The recent formal verification activity conducted at the Halden Project is based on applying model checking to the verification of procedures. This report presents verification approaches based on different model checking techniques and tools for the formalization and verification of operating procedures. Possible problems and relative merits of the different approaches are discussed. A case study of one of the approaches is presented to show the practical application of formal verification. Application of formal verification in the traditional procedure design process can reduce the human resources involved in reviews and simulations, and hence reduce the cost of verification and validation. A discussion of the integration of the formal verification with the traditional procedure design process is given at the end of this report. (Author)

  10. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    Science.gov (United States)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  11. Simulation Based Studies in Software Engineering: A Matter of Validity

    Directory of Open Access Journals (Sweden)

    Breno Bernard Nicolau de França

    2015-04-01

    Full Text Available Despite the possible lack of validity when compared with other science areas, Simulation-Based Studies (SBS) in Software Engineering (SE) have supported the achievement of some results in the field. However, as it happens with any other sort of experimental study, it is important to identify and deal with threats to validity, aiming at increasing their strength and reinforcing confidence in the results. OBJECTIVE: To identify potential threats to SBS validity in SE and suggest ways to mitigate them. METHOD: To apply qualitative analysis to a dataset resulting from the aggregation of data from a quasi-systematic literature review combined with ad-hoc surveyed information regarding other science areas. RESULTS: The analysis of data extracted from 15 technical papers allowed the identification and classification of 28 different threats to validity concerned with SBS in SE according to Cook and Campbell's categories. Besides, 12 verification and validation procedures applicable to SBS were also analyzed and organized according to their ability to detect these threats to validity. These results were used to make available an improved set of guidelines regarding the planning and reporting of SBS in SE. CONCLUSIONS: Simulation-based studies add different threats to validity when compared with traditional studies. These threats are not well observed, and therefore it is not easy to identify and mitigate all of them without explicit guidance, such as the guidelines depicted in this paper.

  12. Quality data validation: Comprehensive approach to environmental data validation

    International Nuclear Information System (INIS)

    Matejka, L.A. Jr.

    1993-01-01

    Environmental data validation consists of an assessment of three major areas: analytical method validation; field procedures and documentation review; evaluation of the level of achievement of data quality objectives based in part on PARCC parameters analysis and expected applications of data. A program utilizing matrix association of required levels of validation effort and analytical levels versus applications of this environmental data was developed in conjunction with DOE-ID guidance documents to implement actions under the Federal Facilities Agreement and Consent Order in effect at the Idaho National Engineering Laboratory. This was an effort to bring consistent quality to the INEL-wide Environmental Restoration Program and database in an efficient and cost-effective manner. This program, documenting all phases of the review process, is described here

  13. Developing, adopting and adapting operating procedures

    International Nuclear Information System (INIS)

    Rabouhams, J.

    1986-01-01

    This lecture specifies the measures taken by the EDF Nuclear and Fossil Generation Department - given that availability and safety largely depend on the quality of the procedures and their ease of use - to develop, adopt and adapt the operating procedures. The following points are treated: general organization of procedures for plant operation under normal and abnormal conditions; personnel and extent of responsibility involved in the development of procedures (research center, training center, specialized services, nuclear station, etc.); validation of the procedures by means of full-scope simulators; modification of the procedures taking into account operating experience in the material and human fields; development of simulation software to execute the procedures in abnormal situations; evolution of operating techniques and future skills. (orig.)

  14. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    Full Text Available The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure the product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis.   Type: Commentary
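
    As an illustration of the kind of calculations such a protocol prescribes for a quantitative HPLC method, the hedged sketch below computes three standard validation parameters (linearity, precision as %RSD, and accuracy as recovery) from invented calibration data; it is not the protocol proposed in the cited paper, and acceptance criteria must come from the laboratory's own protocol.

        import numpy as np

        # Illustrative calculations for three common validation parameters of a
        # quantitative HPLC method (linearity, precision, accuracy). The numbers
        # are invented for the example.

        conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])        # standard concentrations (ug/mL)
        area = np.array([10.2, 20.5, 50.9, 101.8, 203.5])   # peak areas

        # linearity: least-squares fit and coefficient of determination
        slope, intercept = np.polyfit(conc, area, 1)
        r2 = np.corrcoef(conc, area)[0, 1] ** 2

        # precision: relative standard deviation of replicate injections
        replicates = np.array([50.9, 51.2, 50.7, 51.0, 50.8, 51.1])
        rsd_percent = 100 * replicates.std(ddof=1) / replicates.mean()

        # accuracy: recovery of a sample of known concentration (5 ug/mL)
        measured = (51.0 - intercept) / slope
        recovery_percent = 100 * measured / 5.0

        print(f"slope={slope:.3f}, R^2={r2:.4f}, RSD={rsd_percent:.2f}%, recovery={recovery_percent:.1f}%")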

  15. A study of graphite-epoxy laminate failures due to high transverse shear strains using the multi-span-beam shear test procedure

    Science.gov (United States)

    Jegley, Dawn C.

    1989-01-01

    The multi-span-beam shear test procedure is used to study failure mechanisms in graphite-epoxy laminates due to high transverse shear strains induced by severe local bending deformations in test specimens. Results of a series of tests on specimens with a variety of stacking sequences, including some with adhesive interleaving, are presented. These results indicate that laminates with stacking sequences with several ±45 and 90 deg plies next to each other are more susceptible to failures due to high transverse shear strains than laminates with ±45 and 0 deg plies next to each other or with ±45 deg plies next to layers of adhesive interleaving. Results of these tests are compared with analytical results based on finite elements.

  16. Reliability and Validity of 10 Different Standard Setting Procedures.

    Science.gov (United States)

    Halpin, Glennelle; Halpin, Gerald

    Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…

  17. 77 FR 67862 - Air Traffic Procedures Advisory Committee

    Science.gov (United States)

    2012-11-14

    ...The FAA is issuing this notice to advise the public that the FAA's Air Traffic Procedures Advisory Committee (ATPAC) two year charter has been coordinated and signed by the FAA Administrator. The ATPAC charter is valid for two years and provides a venue to review air traffic control procedures and practices for standardization, revision, clarification, and upgrading of terminology and procedures.

  18. Experimental methods to validate measures of emotional state and readiness for duty in critical operations

    International Nuclear Information System (INIS)

    Weston, Louise Marie

    2007-01-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and testing and statistical analysis procedures to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended

  19. Simplified Transit Procedure in Railway Transport

    Directory of Open Access Journals (Sweden)

    Željko Kokorović

    2008-11-01

    Full Text Available The current transit procedure in railway transport that is carried out on the basis of the Customs Act [1] of the Republic of Croatia is applied only up to the border, i.e. the issued documents and guarantees are valid only up to the border, and by joining the Convention on Common transit procedure, i.e. integration of the Republic of Croatia in the European Union, the Republic of Croatia will also have to implement the regulations and rules of the Simplified transit procedure valid in each of the thirty member states. In international railway traffic, the transport of goods is regulated by the Convention concerning International Carriage by Rail - COTIF [2] and usage of the CIM waybill (Contract for International Carriage of Goods by Rail). If the goods are transported in the Simplified transit procedure, the formalities regarding the transport of goods performed by rail carriers using the international waybill CIM will be significantly simplified and accelerated. In principle there are no delays due to customs on the borders when crossing the EU borders and borders of the Convention member states, contributing greatly to the acceleration of the transport of goods, reduction of waiting costs and paperwork, as well as influence on the schedule reliability.

  20. Validation of the Netherlands pacemaker patient registry

    NARCIS (Netherlands)

    Dijk, WA; Kingma, T; Hooijschuur, CAM; Dassen, WRM; Hoorntje, JCA; van Gelder, LM

    1997-01-01

    This paper deals with the validation of the information stored in the Netherlands central pacemaker patient database. At this moment the registry database contains information on more than 70500 patients, 85000 pacemakers and 90000 leads. The validation procedures consisted of an internal

  1. [Treatment of periprosthetic and peri-implant fractures : modern plate osteosynthesis procedures].

    Science.gov (United States)

    Raschke, M J; Stange, R; Kösters, C

    2012-11-01

    Periprosthetic fractures are increasing not only due to the demographic development with high life expectancy, the increase in osteoporosis and increased prosthesis implantation but also due to the increased activity of the elderly population. The therapeutic algorithms are manifold but generally valid rules for severe fractures are not available. The most commonly occurring periprosthetic fractures are proximal and distal femoral fractures but in the clinical routine fractures of the tibial head, ankle, shoulder, elbow and on the borders to other implants (peri-implant fractures) and complex interprosthetic fractures are being seen increasingly often. It is to be expected that in the mid-term further options, such as cement augmentation of cannulated polyaxial locking screws, will extend the portfolio of implants for treatment of periprosthetic fractures. The aim of this review article is to present the new procedures for osteosynthesis of periprosthetic fractures.

  2. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  3. Aortic valve-sparing operation after correction of heart displacement due to pectus excavatum using Nuss procedure in a Marfan syndrome patient.

    Science.gov (United States)

    Fukunaga, Naoto; Yuzaki, Mitsuru; Hamakawa, Hiroshi; Nasu, Michihiro; Takahashi, Yutaka; Okada, Yukikatsu

    2012-01-01

    Cardiovascular surgery in the setting of chest wall deformities is a clinical challenge. Pectus excavatum, for example, can cause displacement of the heart into the left thoracic cavity, resulting in a poor operative field. This report highlights a case in which a successful aortic valve-sparing operation was performed via conventional median sternotomy after correction of the heart displacement due to pectus excavatum using the Nuss procedure in a patient with Marfan syndrome. This technique can be one surgical option in Marfan syndrome patients with pectus excavatum and thoracic aortic aneurysm under close follow up.

  4. Application and Evaluation of an Expert Judgment Elicitation Procedure for Correlations.

    Science.gov (United States)

    Zondervan-Zwijnenburg, Mariëlle; van de Schoot-Hubeek, Wenneke; Lek, Kimberley; Hoijtink, Herbert; van de Schoot, Rens

    2017-01-01

    The purpose of the current study was to apply and evaluate a procedure to elicit expert judgments about correlations, and to update this information with empirical data. The result is a face-to-face group elicitation procedure whose central element is a trial roulette question that elicits experts' judgments expressed as distributions. During the elicitation procedure, a concordance probability question was used to provide feedback to the experts on their judgments. We evaluated the elicitation procedure in terms of validity and reliability by means of an application with a small sample of experts. Validity means that the elicited distributions accurately represent the experts' judgments. Reliability concerns the consistency of the elicited judgments over time. Four behavioral scientists provided their judgments with respect to the correlation between cognitive potential and academic performance for two separate populations enrolled at a specific school in the Netherlands that provides special education to youth with severe behavioral problems: youth with autism spectrum disorder (ASD), and youth with diagnoses other than ASD. Measures of face-validity, feasibility, convergent validity, coherence, and intra-rater reliability showed promising results. Furthermore, the current study illustrates the use of the elicitation procedure and elicited distributions in a social science application. The elicited distributions were used as a prior for the correlation, and updated with data for both populations collected at the school of interest. The current study shows that the newly developed elicitation procedure combining the trial roulette method with the elicitation of correlations is a promising tool, and that the results of the procedure are useful as prior information in a Bayesian analysis.
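
    A minimal sketch of the final updating step is given below, under the assumption that the elicited trial-roulette judgments can be represented as a discrete prior over candidate correlation values and that the likelihood of an observed sample correlation is approximated through the Fisher z transformation. All numbers are hypothetical and are not taken from the study.

        import numpy as np

        # Updating an elicited prior on a correlation with data. The trial-roulette
        # judgments are represented as a histogram (discrete prior) over candidate
        # correlation values; the likelihood of the observed sample correlation uses
        # the Fisher z approximation z ~ Normal(atanh(rho), 1/sqrt(n-3)).
        # All numbers below are hypothetical.

        rho_grid = np.linspace(-0.95, 0.95, 39)
        prior = np.exp(-0.5 * ((rho_grid - 0.5) / 0.15) ** 2)   # stand-in for elicited chips
        prior /= prior.sum()

        r_obs, n = 0.35, 40                                     # observed correlation, sample size
        z_obs = np.arctanh(r_obs)
        se = 1.0 / np.sqrt(n - 3)
        likelihood = np.exp(-0.5 * ((z_obs - np.arctanh(rho_grid)) / se) ** 2)

        posterior = prior * likelihood
        posterior /= posterior.sum()
        print("posterior mean correlation:", np.sum(rho_grid * posterior))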

  5. Microbial ecology laboratory procedures manual NASA/MSFC

    Science.gov (United States)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  6. Simple procedure for phase-space measurement and entanglement validation

    Science.gov (United States)

    Rundle, R. P.; Mills, P. W.; Tilma, Todd; Samson, J. H.; Everitt, M. J.

    2017-08-01

    It has recently been shown that it is possible to represent the complete quantum state of any system as a phase-space quasiprobability distribution (Wigner function) [Phys. Rev. Lett. 117, 180401 (2016), 10.1103/PhysRevLett.117.180401]. Such functions take the form of expectation values of an observable that has a direct analogy to displaced parity operators. In this work we give a procedure for the measurement of the Wigner function that should be applicable to any quantum system. We have applied our procedure to IBM's Quantum Experience five-qubit quantum processor to demonstrate that we can measure and generate the Wigner functions of two different Bell states as well as the five-qubit Greenberger-Horne-Zeilinger state. Because Wigner functions for spin systems are not unique, we define, compare, and contrast two distinct examples. We show how the use of these Wigner functions leads to an optimal method for quantum state analysis especially in the situation where specific characteristic features are of particular interest (such as for spin Schrödinger cat states). Furthermore we show that this analysis leads to straightforward, and potentially very efficient, entanglement test and state characterization methods.
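
    For a continuous-variable system the displaced-parity form referred to above reads W(alpha) = (2/pi) <D(alpha) P D(alpha)^dagger>. The sketch below evaluates this expectation value in a truncated Fock basis; it is the standard continuous-variable analogue, not the spin-system construction developed in the paper.

        import numpy as np
        from scipy.linalg import expm

        # Wigner function from displaced parity, W(alpha) = (2/pi) <D(alpha) P D(alpha)^dagger>,
        # evaluated in a truncated Fock basis (continuous-variable analogue only).

        N = 30                                            # Fock-space truncation
        a = np.diag(np.sqrt(np.arange(1, N)), 1)          # annihilation operator
        parity = np.diag((-1.0) ** np.arange(N))          # parity operator

        def wigner_point(rho, alpha):
            D = expm(alpha * a.conj().T - np.conj(alpha) * a)   # displacement operator
            return (2.0 / np.pi) * np.real(np.trace(rho @ D @ parity @ D.conj().T))

        # example: the vacuum state |0><0| gives a Gaussian peaked at the origin
        rho_vac = np.zeros((N, N), dtype=complex)
        rho_vac[0, 0] = 1.0
        print(wigner_point(rho_vac, 0.0))   # ~ 2/pi
        print(wigner_point(rho_vac, 1.0))   # ~ (2/pi) * exp(-2), smaller and positive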

  7. A procedure for safety assessment of components with cracks - Handbook. 3rd revised edition

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, P.; Bergman, M.; Brickstad, B.; Dahlberg, L.; Nilsson, F.; Sattari-Far, I. [SAQ Kontroll AB, Stockholm (Sweden)

    1999-12-01

    In this handbook a procedure is described which can be used both for assessment of detected cracks or crack-like defects and for defect tolerance analysis. The procedure can be used to calculate possible crack growth due to fatigue or stress corrosion and to calculate the reserve margin for failure due to fracture and plastic collapse. For ductile materials, the procedure gives the reserve margin for initiation of stable crack growth. Thus, an extra reserve margin, unknown in size, exists for failure in components made of ductile materials. The procedure was developed for operative use with the following objectives in mind: a) The procedure should be able to handle both linear and non-linear problems without any a priori division. b) The procedure shall ensure uniqueness of the safety assessment. c) The procedure should be well defined and easy to use. d) The conservatism of the procedure should be well validated. e) The handbook, that documents the procedure, should be so complete that for most assessments, access to any other fracture mechanics literature should not be necessary. The method utilized in the procedure is based on the R6 method developed at Nuclear Electric plc. The basic assumption is that fracture initiated by a crack can be described by the variables Kr and Lr. Kr is the ratio between the stress intensity factor and the fracture toughness of the material. Lr is the ratio between the applied load and the plastic limit load of the structure. The pair of calculated values of these variables is plotted in a diagram. If the point is situated within the noncritical region, fracture is assumed not to occur. If the point is situated outside the region, crack growth and fracture may occur. The method can in principle be used for all metallic materials. It is, however, more extensively verified for steel alloys only. The method is not intended for use in temperature regions where creep deformation is of importance. To fulfil the above
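
    For a single load case the assessment described above reduces to computing the point (Lr, Kr) and comparing it with a failure assessment curve. The sketch below uses the widely quoted R6/BS 7910 'Option 1' curve purely as an illustration; the handbook's own curve, Lr cut-off and safety evaluation should be used in a real assessment.

        import math

        # Failure assessment diagram (FAD) check in the spirit of the R6 procedure:
        # Kr = KI / Kmat, Lr = applied load / plastic limit load; the point (Lr, Kr)
        # is compared with a failure assessment curve. The curve below is the widely
        # quoted 'Option 1' approximation, used here only as an illustration.

        def option1_curve(Lr):
            return (1.0 - 0.14 * Lr**2) * (0.3 + 0.7 * math.exp(-0.65 * Lr**6))

        def assess(KI, Kmat, load, limit_load, Lr_max=1.0):
            Kr = KI / Kmat
            Lr = load / limit_load
            acceptable = Lr <= Lr_max and Kr <= option1_curve(Lr)
            return Kr, Lr, acceptable

        # hypothetical numbers for illustration only
        Kr, Lr, ok = assess(KI=40.0, Kmat=120.0, load=0.6e6, limit_load=1.0e6)
        print(f"Kr={Kr:.2f}, Lr={Lr:.2f}, inside acceptable region: {ok}")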

  8. A procedure for safety assessment of components with cracks - Handbook. 3rd revised edition

    International Nuclear Information System (INIS)

    Andersson, P.; Bergman, M.; Brickstad, B.; Dahlberg, L.; Nilsson, F.; Sattari-Far, I.

    1999-12-01

    In this handbook a procedure is described which can be used both for assessment of detected cracks or crack-like defects and for defect tolerance analysis. The procedure can be used to calculate possible crack growth due to fatigue or stress corrosion and to calculate the reserve margin for failure due to fracture and plastic collapse. For ductile materials, the procedure gives the reserve margin for initiation of stable crack growth. Thus, an extra reserve margin, unknown in size, exists for failure in components made of ductile materials. The procedure was developed for operative use with the following objectives in mind: a) The procedure should be able to handle both linear and non-linear problems without any a priori division. b) The procedure shall ensure uniqueness of the safety assessment. c) The procedure should be well defined and easy to use. d) The conservatism of the procedure should be well validated. e) The handbook, that documents the procedure, should be so complete that for most assessments, access to any other fracture mechanics literature should not be necessary. The method utilized in the procedure is based on the R6 method developed at Nuclear Electric plc. The basic assumption is that fracture initiated by a crack can be described by the variables Kr and Lr. Kr is the ratio between the stress intensity factor and the fracture toughness of the material. Lr is the ratio between the applied load and the plastic limit load of the structure. The pair of calculated values of these variables is plotted in a diagram. If the point is situated within the noncritical region, fracture is assumed not to occur. If the point is situated outside the region, crack growth and fracture may occur. The method can in principle be used for all metallic materials. It is, however, more extensively verified for steel alloys only. The method is not intended for use in temperature regions where creep deformation is of importance. To fulfil the above given objectives

  9. Predictive validity of the Slovene Matura

    Directory of Open Access Journals (Sweden)

    Valentin Bucik

    2001-09-01

    Full Text Available Passing the Matura is the last step of secondary school graduation, but it is also the entrance ticket to university. Besides, the summary score of the Matura exam enters the selection process for particular university studies in the case of 'numerus clausus'. In discussing either aim of the Matura, important dilemmas arise, namely whether the Matura examination is a sufficiently exact and rightful procedure, firstly, to use its results for setting starting study conditions and, secondly, to select the best candidates for university studies in a valid, reliable and sensible way. There are some questions concerning the predictive validity of the Matura that should be answered, e.g. (i) does the Matura as an enrollment procedure add to the quality of the study; (ii) is it a better selection tool than the entrance examinations formerly used in different faculties in the case of 'numerus clausus'; and (iii) is it reasonable to expect high predictive validity of Matura results for success at the university at all. Recent results show that in the last few years the dropout rate is lower than before, the pass rate between the first and the second year is higher and the average duration of study per student is shorter. It is clear, however, that it is not possible to simply predict study success from the Matura results. There are too many factors influencing success in university studies. In most examined study programs the correlation between Matura results and study success is positive but moderate; therefore it cannot be said categorically that only candidates accepted according to the Matura results are (or will be) the best students. Yet it has been shown that the Matura is a standardized procedure, comparable across different candidates entering university, and that – when compared with entrance examinations – it is more objective, reliable, and hence more valid and fair a procedure. In addition, comparable procedures of university recruiting and selection can be

  10. Measuring variability of procedure progression in proceduralized scenarios

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2012-01-01

    Highlights: ► The VPP measure was developed to quantify how differently operators follow the procedures. ► Sources that cause variability of ways to follow a given procedure were identified. ► The VPP values for the scenarios are positively related to the scenario performance time. ► The VPP measure is meaningful for explaining characteristics of several PSFs. -- Abstract: Various performance shaping factors (PSFs) have been presented to explain the contributors to unsafe acts in a human failure event or to predict a human error probability for new human performance. However, because most of these parameters of an HRA depend on the subjective knowledge and experience of HRA analyzers, the results of an HRA do not provide sufficiently unbiased standards to explain human performance variations or to compare collected data with data from different analyzers. To secure the validity of the HRA results, we propose a quantitative measure, which represents the variability of procedure progression (VPP) in proceduralized scenarios. A VPP measure shows how differently the operators follow the steps of the procedures. This paper introduces the sources of the VPP measure and its relevance to PSFs. The assessment method of the VPP measure is also proposed, and application examples are shown with a comparison of performance times. Although more empirical studies should be conducted to reveal the relationship between the VPP measure and other PSFs, it is believed that the VPP measure provides evidence to quantitatively evaluate human performance variations and to cross-culturally compare the collected data.

  11. A content validated questionnaire for assessment of self reported venous blood sampling practices.

    Science.gov (United States)

    Bölenius, Karin; Brulin, Christine; Grankvist, Kjell; Lindkvist, Marie; Söderberg, Johan

    2012-01-19

    Venous blood sampling is a common procedure in health care. It is strictly regulated by national and international guidelines. Deviations from guidelines due to human mistakes can cause patient harm. Validated questionnaires for health care personnel can be used to assess preventable "near misses"--i.e. potential errors and nonconformities during venous blood sampling practices that could transform into adverse events. However, no validated questionnaire that assesses nonconformities in venous blood sampling has previously been presented. The aim was to test a recently developed questionnaire on self reported venous blood sampling practices for validity and reliability. We developed a questionnaire to assess deviations from best practices during venous blood sampling. The questionnaire contained questions about patient identification, test request management, test tube labeling, test tube handling, information search procedures and frequencies of error reporting. For content validity, the questionnaire was confirmed by experts on questionnaires and venous blood sampling. For reliability, test-retest statistics were used on the questionnaire answered twice. The final venous blood sampling questionnaire included 19 questions out of which 9 had in total 34 underlying items. It was found to have content validity. The test-retest analysis demonstrated that the items were generally stable. In total, 82% of the items fulfilled the reliability acceptance criteria. The questionnaire could be used for assessment of "near miss" practices that could jeopardize patient safety and offers several benefits compared with assessing rare adverse events only. The higher frequencies of "near miss" practices allow for quantitative analysis of the effect of corrective interventions and for benchmarking of preanalytical quality not only at the laboratory/hospital level but also at the health care unit/hospital ward level.
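
    A typical test-retest computation for a single categorical questionnaire item is sketched below: percent agreement and Cohen's kappa between the two administrations. The answers are hypothetical and the acceptance criteria used by the authors are not reproduced.

        from collections import Counter

        # Test-retest agreement for one categorical questionnaire item:
        # percent agreement and Cohen's kappa between two administrations.

        def kappa(first, second):
            n = len(first)
            categories = set(first) | set(second)
            observed = sum(a == b for a, b in zip(first, second)) / n
            p1, p2 = Counter(first), Counter(second)
            expected = sum((p1[c] / n) * (p2[c] / n) for c in categories)
            return observed, (observed - expected) / (1.0 - expected)

        # hypothetical answers ("always"/"often"/"seldom") from 10 respondents, twice
        t1 = ["always", "often", "often", "seldom", "always", "often", "always", "seldom", "often", "often"]
        t2 = ["always", "often", "seldom", "seldom", "always", "often", "often", "seldom", "often", "often"]
        agreement, k = kappa(t1, t2)
        print(f"agreement={agreement:.2f}, kappa={k:.2f}")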

  12. On Line Validation Exercise (OLIVE: A Web Based Service for the Validation of Medium Resolution Land Products. Application to FAPAR Products

    Directory of Open Access Journals (Sweden)

    Marie Weiss

    2014-05-01

    Full Text Available The OLIVE (On Line Interactive Validation Exercise) platform is dedicated to the validation of global biophysical products such as LAI (Leaf Area Index) and FAPAR (Fraction of Absorbed Photosynthetically Active Radiation). It was developed under the framework of the CEOS (Committee on Earth Observation Satellites) Land Product Validation (LPV) sub-group. OLIVE has three main objectives: (i) to provide consistent and centralized information on the definition of the biophysical variables, as well as a description of the main available products and their performances; (ii) to provide transparency and traceability by an online validation procedure compliant with the CEOS LPV and QA4EO (Quality Assurance for Earth Observation) recommendations; and (iii) finally, to provide a tool to benchmark new products, update product validation results and host new ground measurement sites for accuracy assessment. The functionalities and algorithms of OLIVE are described to provide full transparency of its procedures to the community. The validation process and typical results are illustrated for three FAPAR products: GEOV1 (VEGETATION sensor), MGVIo (MERIS sensor) and MODIS collection 5 FPAR. OLIVE is available on the European Space Agency CAL/VAL portal, including full documentation, validation exercise results, and product extracts.

  13. Procedural virtual reality simulation in minimally invasive surgery.

    Science.gov (United States)

    Våpenstad, Cecilie; Buzink, Sonja N

    2013-02-01

    Simulation of procedural tasks has the potential to bridge the gap between basic skills training outside the operating room (OR) and performance of complex surgical tasks in the OR. This paper provides an overview of procedural virtual reality (VR) simulation currently available on the market and presented in the scientific literature for laparoscopy (LS), flexible gastrointestinal endoscopy (FGE), and endovascular surgery (EVS). An online survey was sent to companies and research groups selling or developing procedural VR simulators, and a systematic search was done for scientific publications presenting or applying VR simulators to train or assess procedural skills in the PUBMED and SCOPUS databases. The results of five simulator companies were included in the survey. In the literature review, 116 articles were analyzed (45 on LS, 43 on FGE, 28 on EVS), presenting a total of 23 simulator systems. The companies stated that they altogether offer 78 procedural tasks (33 for LS, 12 for FGE, 33 for EVS), of which 17 also were found in the literature review. Although study type and outcomes used vary between the three different fields, approximately 90 % of the studies presented in the retrieved publications for LS found convincing evidence to confirm the validity or added value of procedural VR simulation. This was the case in approximately 75 % for FGE and EVS. Procedural training using VR simulators has been found to improve clinical performance. There is nevertheless a large number of simulated procedural tasks that have not been validated. Future research should focus on the optimal use of procedural simulators in the most effective training setups and further investigate the benefits of procedural VR simulation to improve clinical outcome.

  14. The development of a statistical procedure to correct the effects of restriction of range on validity coefficients

    Directory of Open Access Journals (Sweden)

    J. M. Scheepers

    1996-06-01

    Full Text Available In the validation of tests used for selection purposes, the obtained validity coefficients are invariably underestimates of the true validities, due to explicit and implicit selection in respect of the relevant variables. Both explicit and implicit selection lead to restriction of range of the relevant variables, and this in turn reduces the obtained validities. A formal proof for this is given. A number of researchers have developed formulae for correcting sample validities in order to get better estimates of the true validities (Pearson, 1903; Thorndike, 1949; Gulliksen, 1950; Rydberg, 1962 and Lord & Novick, 1968). It is, however, virtually impossible to obtain a complete view of the problem of restriction of range in this way. In the present paper a different approach has been followed: population correlations have been computed for various degrees of truncation of the explicit selection variable. This has been done for population correlations ranging from 0.10 to 0.99. A graphical display, indicating the shrinkage of the population correlations for various truncation ratios, has been prepared.
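
    The best known of the correction formulae referred to above is Thorndike's Case II correction for direct (explicit) selection, sketched below under the usual linearity and homoscedasticity assumptions; the numbers are illustrative only.

        import math

        # Thorndike's Case II correction for direct restriction of range:
        # given the correlation r observed in the restricted group and the ratio
        # u = S/s of the unrestricted to the restricted standard deviation of the
        # explicit selection variable, the estimated unrestricted correlation is
        #   R = r*u / sqrt(1 - r**2 + (r*u)**2).

        def correct_for_range_restriction(r, sd_unrestricted, sd_restricted):
            u = sd_unrestricted / sd_restricted
            return r * u / math.sqrt(1.0 - r**2 + (r * u) ** 2)

        # example: r = 0.30 observed in a selected group whose predictor SD is
        # 60% of the applicant-pool SD
        print(correct_for_range_restriction(0.30, sd_unrestricted=1.0, sd_restricted=0.6))  # ~0.46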

  15. Experimentally Validated Combustion and Piston Fatigue Life Evaluation Procedures for the Bi-Fuel Engines, Using an Integral-Type Fatigue Criterion

    Directory of Open Access Journals (Sweden)

    M. Shariyat

    Full Text Available Abstract A relatively complete procedure for high cycle fatigue life assessment of engine components is outlined in the present paper. The piston is examined as a typical component of the engine. In this regard, combustion process and transient heat transfer simulations, determination of the instantaneous variations of the pressure and temperature in the combustion chamber, kinematic and dynamic analyses of the moving parts of the engine, thermoelastic stress analyses, and fatigue life analyses are accomplished. Results of the simulation are compared with test data for verification. The heat transfer results are validated against experimental results measured with Templugs. The nonlinear multipoint contact constraints are modeled accurately. Results of the more accurate available fatigue criteria are compared with those of a fatigue criterion recently proposed by the first author. These results are also evaluated by comparing them with experimental durability tests. The presented procedure may be used, e.g., to decide whether it is suitable to convert a gasoline-based engine to a bi-fuel one. Results of the various thermomechanical fatigue analyses performed reveal that the piston life decreases considerably when natural gas is used instead of gasoline.

  16. Diagnostical Procedure for Logistical Management in Turistical Entities

    Directory of Open Access Journals (Sweden)

    Libia Arlen Fergusson-Álvarez

    2016-06-01

    Full Text Available This research aims to design a diagnostic procedure for logistics management in tourism entities, whether hotels or not. This procedure was validated in the Commercial Branch Caracol Santiago de Cuba and, finally, different actions for the detected problems were proposed with the objective of improving the logistics management of the organization. To develop this research various tools and techniques served as support, such as: surveys, SPSS software (Statistical Package for Social Sciences, version 15.0) and Decision software (version 1.0), exponential smoothing, Cronbach's Alpha, Kendall's coefficient of concordance W, financial and logistical indicators, the ABC or Pareto method, and the matrices for the classification of stocks and suppliers, among others. This research made possible the design of a diagnostic procedure for logistics management in tourism entities. It was validated in the Commercial Branch Caracol Santiago, which allowed the proposal of improvement actions for increasing customer satisfaction.

  17. Reliable and valid assessment of Lichtenstein hernia repair skills.

    Science.gov (United States)

    Carlsen, C G; Lindorff-Larsen, K; Funch-Jensen, P; Lund, L; Charles, P; Konge, L

    2014-08-01

    Lichtenstein hernia repair is a common surgical procedure and one of the first procedures performed by a surgical trainee. However, formal assessment tools developed for this procedure are few and sparsely validated. The aim of this study was to determine the reliability and validity of an assessment tool designed to measure surgical skills in Lichtenstein hernia repair. Key issues were identified through a focus group interview. On this basis, an assessment tool with eight items was designed. Ten surgeons and surgical trainees (four experts, three intermediates, and three novices) were video recorded while performing Lichtenstein hernia repair. The videos were blindly and individually assessed by three raters (surgical consultants) using the assessment tool. Based on these assessments, validity and reliability were explored. The internal consistency of the items was high (Cronbach's alpha = 0.97). The inter-rater reliability was very good with an intra-class correlation coefficient (ICC) = 0.93. Generalizability analysis showed a coefficient above 0.8 even with one rater. The coefficient improved to 0.92 if three raters were used. One-way analysis of variance found a significant difference between the three groups, which indicates construct validity. Lichtenstein hernia repair skills can thus be assessed in a reliable and valid fashion with the new procedure-specific assessment tool. We recommend this tool for future assessment of trainees performing Lichtenstein hernia repair to ensure that the objectives of competency-based surgical training are met.
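
    The internal-consistency figure quoted above follows from the standard Cronbach's alpha formula; the sketch below applies it to a hypothetical observations-by-items score matrix (not the study's data).

        import numpy as np

        # Cronbach's alpha for an n-observations x k-items score matrix:
        #   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
        # The score matrix below is hypothetical.

        def cronbach_alpha(scores):
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        ratings = [[4, 5, 4, 4, 5, 4, 5, 4],
                   [2, 2, 3, 2, 2, 3, 2, 2],
                   [5, 5, 5, 4, 5, 5, 5, 5],
                   [3, 3, 3, 3, 4, 3, 3, 3],
                   [1, 2, 1, 1, 2, 1, 1, 2]]
        print(f"alpha = {cronbach_alpha(ratings):.2f}")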

  18. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Science.gov (United States)

    2010-07-01

    .... 1607.6 Section 1607.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should...

  19. 40 CFR 1039.501 - How do I run a valid emission test?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false How do I run a valid emission test? 1039.501 Section 1039.501 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Procedures § 1039.501 How do I run a valid emission test? (a) Use the equipment and procedures for...

  20. Improvements in Logic Diagram of Computerized Procedure System of APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Sungkweon; Seong, Nokyu [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Computerized Procedure System (CPS) has been improved since it was installed in the Shin-Kori 3 and 4 Nuclear Power Plants. It is one of the operating support systems of the digital Main Control Room (MCR) and provides many functions to operators in executing the procedures. CPS can effectively remove human errors by supporting the procedure flow and logic diagram. This paper describes the logic diagram of the CPS of the reference power plant and shows the improved logic diagram of the CPS of Shin-Kori units 5 and 6. This paper describes the current logic diagram of the CPS and suggests an improved design for the logic diagram. The improved logic diagram shall be validated through human factors engineering verification and validation. The improved design will help operators execute the computerized procedure quickly and avoid human error.

  1. Two Challenges of Correct Validation in Pattern Recognition

    Directory of Open Access Journals (Sweden)

    Thomas eNowotny

    2014-09-01

    Full Text Available Supervised pattern recognition is the process of mapping patterns to class labels that define their meaning. The core methods for pattern recognition have been developed by machine learning experts but due to their broad success an increasing number of non-experts are now employing and refining them. In this perspective I will discuss the challenge of correct validation of supervised pattern recognition systems, in particular when employed by non-experts. To illustrate the problem I will give three examples of common errors that I have encountered in the last year. Much of this challenge can be addressed by strict procedure in validation but there are remaining problems of correctly interpreting comparative work on exemplary data sets, which I will elucidate on the example of the well-used MNIST data set of handwritten digits.
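
    One frequent validation error of the kind alluded to above is performing data-dependent preprocessing, such as feature selection, on the complete data set before cross-validation. Keeping every such step inside the cross-validation loop avoids the information leak, as in the scikit-learn sketch below (assuming scikit-learn is available); on pure noise the correctly validated accuracy stays near chance.

        import numpy as np
        from sklearn.pipeline import Pipeline
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Correct validation: the feature selection step is part of the pipeline, so
        # it is re-fitted on each training fold only. Selecting features on the full
        # data set before cross-validation would leak information from the test folds
        # and inflate the estimated accuracy.

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 500))          # pure noise features
        y = rng.integers(0, 2, size=100)         # random labels: true accuracy ~ 0.5

        model = Pipeline([
            ("select", SelectKBest(f_classif, k=10)),
            ("clf", SVC(kernel="linear")),
        ])
        scores = cross_val_score(model, X, y, cv=5)
        print("cross-validated accuracy:", scores.mean())   # close to chance, as it should be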

  2. Processes and Procedures for Application of CFD to Nuclear Reactor Safety Analysis

    International Nuclear Information System (INIS)

    Richard W. Johnson; Richard R. Schultz; Patrick J. Roache; Ismail B. Celik; William D. Pointer; Yassin A. Hassan

    2006-01-01

    of the flow and energy transport as applied to nuclear reactor safety. However, it is expected that these practices and procedures will require updating from time to time as research and development affect them or replace them with better procedures. The practices and procedures are categorized into five groups. These are: (1) Code Verification; (2) Code and Calculation Documentation; (3) Reduction of Numerical Error; (4) Quantification of Numerical Uncertainty (Calculation Verification); and (5) Calculation Validation. These five categories have been identified from procedures currently required of CFD simulations such as those required for publication of a paper in the ASME Journal of Fluids Engineering and from the literature such as Roache [1998]. Code verification refers to the demonstration that the equations of fluid and energy transport have been correctly coded in the CFD code. Code and calculation documentation simply means that the equations and their discretizations, etc., and boundary and initial conditions used to pose the fluid flow problem are fully described in available documentation. Reduction of numerical error refers to practices and procedures to lower numerical errors to negligible or very low levels as is reasonably possible (such as avoiding use of first-order discretizations). The quantification of numerical uncertainty is also known as calculation verification. This means that estimates are made of numerical error to allow the characterization of the numerical results with a certain confidence level. Numerical error in this case does not include error due to models such as turbulence models. Calculation validation is the process of comparing simulation results to experimental data to demonstrate level of agreement. Validation does include the effects of modeling errors as well as numerical and experimental errors. A key issue in the validation process of numerical results is the existence of appropriate experimental data to use for
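
    For the fourth category, quantification of numerical uncertainty, a commonly used calculation-verification recipe in the spirit of Roache's Grid Convergence Index (GCI) estimates the observed order of accuracy from three systematically refined grids and converts the fine-medium difference into a banded uncertainty. The sketch below is illustrative; the monitored quantity and grid values are hypothetical.

        import math

        # Grid Convergence Index (GCI) style calculation verification: three solutions
        # f1 (fine), f2 (medium), f3 (coarse), obtained with a constant grid refinement
        # ratio r, give an observed order of accuracy p and a banded numerical
        # uncertainty for the fine-grid result.

        def gci_fine(f1, f2, f3, r, safety_factor=1.25):
            p = math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)   # observed order
            rel_err = abs((f2 - f1) / f1)                            # relative fine-medium difference
            return p, safety_factor * rel_err / (r**p - 1.0)

        # hypothetical monitored quantity (e.g. a peak temperature) on three grids
        p, gci = gci_fine(f1=512.1, f2=513.8, f3=520.6, r=2.0)
        print(f"observed order p = {p:.2f}, GCI_fine = {100*gci:.2f}%")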

  3. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US NRC and EPRI toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation), (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software), and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, to form three classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems

  4. Objective Truth Institution in Criminal Procedure

    Directory of Open Access Journals (Sweden)

    Voltornist O. A.

    2012-11-01

    Full Text Available The article deals with the category of objective truth in criminal procedure and its importance for the correct determination of the aims of criminal court procedure. The author also analyzes the draft bill offered by the RF Committee of Inquiry “On amending the RF Criminal Procedure Code due to the implementation of the objective truth institution in criminal procedure”.

  5. Validation and measurement uncertainty estimation in food microbiology: differences between quantitative and qualitative methods

    Directory of Open Access Journals (Sweden)

    Vesna Režić Dereani

    2010-09-01

    Full Text Available The aim of this research is to describe quality control procedures, procedures for validation and measurement uncertainty (MU) determination as important elements of quality assurance in a food microbiology laboratory, for both qualitative and quantitative types of analysis. Accreditation is conducted according to the standard ISO 17025:2007, General requirements for the competence of testing and calibration laboratories, which guarantees compliance with standard operating procedures and the technical competence of the staff involved in the tests; it has recently been widely introduced in food microbiology laboratories in Croatia. In addition to the introduction of a quality manual and many general documents, some of the most demanding procedures in routine microbiology laboratories are the measurement uncertainty (MU) procedures and the design of validation experiments. These procedures are not yet standardized, even at the international level, and they require practical microbiological knowledge together with statistical competence. Differences between validation experiment designs for quantitative and qualitative food microbiology analysis are discussed in this research, and practical solutions are described briefly. MU for quantitative determinations is a more demanding issue than qualitative MU calculation. MU calculations are based on external proficiency testing data and internal validation data. In this paper, practical schematic descriptions of both procedures are shown.

  6. 32 CFR 1288.5 - Procedures.

    Science.gov (United States)

    2010-07-01

    ... validation sticker will be determined locally. (4) Decals or other media used to identify vehicles of... OF PRIVATELY OWNED MOTOR VEHICLES § 1288.5 Procedures. (a) Issuance of DLA POV decal and 3-year... vehicle. An additional decal may be placed on the rear bumper of the vehicle. For vehicles not equipped...

  7. 40 CFR 1045.501 - How do I run a valid emission test?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false How do I run a valid emission test... Procedures § 1045.501 How do I run a valid emission test? (a) Applicability. This subpart is addressed to you... maximum test speed. (g) Special and alternate procedures. If you are unable to run the duty cycle...

  8. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    Science.gov (United States)

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  9. Validity of reciprocity rule on mouse skin thermal damage due to CO2 laser irradiation

    Science.gov (United States)

    Parvin, P.; Dehghanpour, H. R.; Moghadam, M. S.; Daneshafrooz, V.

    2013-07-01

    CO2 laser (10.6 μm) is a well-known infrared coherent light source used as a tool in surgery. At this wavelength there is a high absorption coefficient (860 cm-1) because of the vibrational mode resonance of H2O molecules. Therefore, the majority of the irradiation energy is absorbed in the tissue and the temperature of the tissue rises as a function of power density and laser exposure duration. In this work, the tissue damage caused by a CO2 laser (1-10 W, ~40-400 W cm-2, 0.1-6 s) was measured using 30 mouse skin samples. Skin damage assessment was based on measurements of the depth of cut, the mean diameter of the crater and the carbonized layer. The results show that tissue damage, as assessed by the above parameters, increased with laser fluence and saturated at 1000 J cm-2. Moreover, the damage due to high power density at short duration was not equivalent to that at low power density and longer irradiation time, even though the energy delivered was identical. These results indicate the lack of validity of the reciprocity (Bunsen-Roscoe) rule for thermal damage.
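
    The reciprocity (Bunsen-Roscoe) rule tested above states that the biological effect should depend only on the delivered fluence, i.e. the product of irradiance and exposure time. The short calculation below, with hypothetical exposure pairs chosen inside the reported parameter range, shows two exposures of equal fluence; the study found that their thermal damage nevertheless differed.

        # Reciprocity (Bunsen-Roscoe) rule: the effect should depend only on the
        # delivered fluence H [J/cm2] = E [W/cm2] * t [s]. The two (hypothetical)
        # exposures below lie within the reported parameter range and deliver the
        # same fluence, yet the study found that the resulting damage differs.

        exposures = [(400.0, 2.5),   # W/cm2, s -> high power, short exposure
                     (200.0, 5.0)]   # W/cm2, s -> lower power, longer exposure
        for irradiance, duration in exposures:
            fluence = irradiance * duration
            print(f"{irradiance:5.0f} W/cm2 x {duration:3.1f} s = {fluence:.0f} J/cm2")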

  10. Chemometric and biological validation of a capillary electrophoresis metabolomic experiment of Schistosoma mansoni infection in mice.

    Science.gov (United States)

    Garcia-Perez, Isabel; Angulo, Santiago; Utzinger, Jürg; Holmes, Elaine; Legido-Quigley, Cristina; Barbas, Coral

    2010-07-01

    Metabonomic and metabolomic studies are increasingly utilized for biomarker identification in different fields, including biology of infection. The confluence of improved analytical platforms and the availability of powerful multivariate analysis software have rendered the multiparameter profiles generated by these omics platforms a user-friendly alternative to the established analysis methods where the quality and practice of a procedure is well defined. However, unlike traditional assays, validation methods for these new multivariate profiling tools have yet to be established. We propose a validation for models obtained by CE fingerprinting of urine from mice infected with the blood fluke Schistosoma mansoni. We have analysed urine samples from two sets of mice infected in an inter-laboratory experiment where different infection methods and animal husbandry procedures were employed in order to establish the core biological response to a S. mansoni infection. CE data were analysed using principal component analysis. Validation of the scores consisted of permutation scrambling (100 repetitions) and a manual validation method, using a third of the samples (not included in the model) as a test or prediction set. The validation yielded 100% specificity and 100% sensitivity, demonstrating the robustness of these models with respect to deciphering metabolic perturbations in the mouse due to a S. mansoni infection. A total of 20 metabolites across the two experiments were identified that significantly discriminated between S. mansoni-infected and noninfected control samples. Only one of these metabolites, allantoin, was identified as manifesting different behaviour in the two experiments. This study shows the reproducibility of CE-based metabolic profiling methods for disease characterization and screening and highlights the importance of much needed validation strategies in the emerging field of metabolomics.
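
    The validation strategy described above (a held-out prediction set plus repeated permutation scrambling of the class labels) can be sketched generically. The code below uses PCA scores with a simple classifier on simulated data and assumes scikit-learn is available; it is not the authors' multivariate model or their CE data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import Pipeline

        # Generic sketch: hold out a third of the samples as a prediction set and
        # compare the real model against models trained on label-permuted data.
        # Simulated data stand in for the CE metabolic profiles.

        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 200))
        y = np.repeat([0, 1], 30)
        X[y == 1, :5] += 2.0                       # class-related signal in a few variables

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, stratify=y, random_state=1)
        model = Pipeline([("pca", PCA(n_components=5)), ("clf", LogisticRegression())])
        real_score = model.fit(X_tr, y_tr).score(X_te, y_te)

        perm_scores = []
        for _ in range(100):                        # 100 label permutations, as in the study design
            y_perm = rng.permutation(y_tr)
            perm_scores.append(model.fit(X_tr, y_perm).score(X_te, y_te))

        print("prediction-set accuracy:", real_score)
        print("permutation mean accuracy:", np.mean(perm_scores))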

  11. A content validated questionnaire for assessment of self reported venous blood sampling practices

    Directory of Open Access Journals (Sweden)

    Bölenius Karin

    2012-01-01

    Full Text Available Abstract Background Venous blood sampling is a common procedure in health care. It is strictly regulated by national and international guidelines. Deviations from guidelines due to human mistakes can cause patient harm. Validated questionnaires for health care personnel can be used to assess preventable "near misses"--i.e. potential errors and nonconformities during venous blood sampling practices that could transform into adverse events. However, no validated questionnaire that assesses nonconformities in venous blood sampling has previously been presented. The aim was to test a recently developed questionnaire on self reported venous blood sampling practices for validity and reliability. Findings We developed a questionnaire to assess deviations from best practices during venous blood sampling. The questionnaire contained questions about patient identification, test request management, test tube labeling, test tube handling, information search procedures and frequencies of error reporting. For content validity, the questionnaire was confirmed by experts on questionnaires and venous blood sampling. For reliability, test-retest statistics were used on the questionnaire answered twice. The final venous blood sampling questionnaire included 19 questions out of which 9 had in total 34 underlying items. It was found to have content validity. The test-retest analysis demonstrated that the items were generally stable. In total, 82% of the items fulfilled the reliability acceptance criteria. Conclusions The questionnaire could be used for assessment of "near miss" practices that could jeopardize patient safety and offers several benefits compared with assessing rare adverse events only. The higher frequencies of "near miss" practices allow for quantitative analysis of the effect of corrective interventions and for benchmarking of preanalytical quality not only at the laboratory/hospital level but also at the health care unit/hospital ward level.

  12. Selection procedures in sports: Improving predictions of athletes’ future performance

    NARCIS (Netherlands)

    den Hartigh, Jan Rudolf; Niessen, Anna; Frencken, Wouter; Meijer, Rob R.

    The selection of athletes has been a central topic in sports sciences for decades. Yet, little consideration has been given to the theoretical underpinnings and predictive validity of the procedures. In this paper, we evaluate current selection procedures in sports given what we know from the

  13. Validation of EAF-2005 data

    International Nuclear Information System (INIS)

    Kopecky, J.

    2005-01-01

    The validation procedures applied to the EAF-2003 starter file, which led to the production of the EAF-2005 library, are described. The results, in terms of reactions with assigned quality scores in EAF-2005, are given. Further, the extensive validation against recent integral data is discussed, together with the status of the final report 'Validation of EASY-2005 using integral measurements'. Finally, the novel 'cross section trend analysis' is presented with some examples of its use. This action will lead to the release of the improved library EAF-2005.1 at the end of 2005, which shall be used as the starter file for EAF-2007. (author)

  14. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, Luis; Vassileva, Emilia, E-mail: e.vasileva-veleva@iaea.org

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and

  15. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    International Nuclear Information System (INIS)

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed

  16. A comprehensive model for the prediction of vibrations due to underground railway traffic: formulation and validation

    International Nuclear Information System (INIS)

    Costa, Pedro Alvares; Cardoso Silva, Antonio; Calçada, Rui; Lopes, Patricia; Fernandez, Jesus

    2016-01-01

    In this communication, a numerical approach for the prediction of vibrations induced in buildings by railway traffic in tunnels is presented. The numerical model is based on the concept of dynamic substructuring, being composed of three autonomous models that simulate the following main parts of the problem: i) generation of vibrations (train–track interaction); ii) propagation of vibrations (track–tunnel–ground system); iii) reception of vibrations (building coupled to the ground). The proposed methodology allows dealing with the three-dimensional characteristics of the problem with a reasonable computational effort [1, 2]. After a brief description of the model, its experimental validation is performed. For that, a case study of vibrations inside a building close to a shallow railway tunnel in Madrid is simulated, and the experimental data [3] are compared with the predicted results [4]. Finally, the communication closes with some insights into the potentialities and challenges of this numerical modelling approach for the prediction of the behavior of ancient structures subjected to vibrations induced by human sources (railway and road traffic, pile driving, etc.).

  17. Development and empirical validation of symmetric component measures of multidimensional constructs: customer and competitor orientation.

    Science.gov (United States)

    Sørensen, Hans Eibe; Slater, Stanley F

    2008-08-01

    Atheoretical measure purification may lead to construct deficient measures. The purpose of this paper is to provide a theoretically driven procedure for the development and empirical validation of symmetric component measures of multidimensional constructs. Particular emphasis is placed on establishing a formalized three-step procedure for achieving a posteriori content validity. Then the procedure is applied to development and empirical validation of two symmetrical component measures of market orientation, customer orientation and competitor orientation. Analysis suggests that average variance extracted is particularly critical to reliability in the respecification of multi-indicator measures. In relation to this, the results also identify possible deficiencies in using Cronbach alpha for establishing reliable and valid measures.
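    To make the reliability diagnostics mentioned above concrete, the sketch below shows one common way of computing Cronbach's alpha from item-level responses and the average variance extracted (AVE) from standardized factor loadings; the 5-item scale, the scores and the loadings are illustrative placeholders, not the authors' data.
    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: (n_respondents, k_items) matrix of scores."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    def average_variance_extracted(loadings: np.ndarray) -> float:
        """loadings: standardized factor loadings of one construct's indicators."""
        return float(np.mean(loadings ** 2))

    # Illustrative 5-item customer-orientation scale answered by 8 respondents.
    scores = np.array([
        [5, 4, 5, 4, 5],
        [3, 3, 4, 3, 3],
        [4, 4, 4, 5, 4],
        [2, 3, 2, 2, 3],
        [5, 5, 4, 5, 5],
        [3, 2, 3, 3, 2],
        [4, 5, 5, 4, 4],
        [2, 2, 3, 2, 2],
    ], dtype=float)

    print("alpha =", round(cronbach_alpha(scores), 3))
    print("AVE   =", round(average_variance_extracted(np.array([0.80, 0.75, 0.70, 0.82, 0.78])), 3))
    ```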

  18. 20 CFR 404.725 - Evidence of a valid ceremonial marriage.

    Science.gov (United States)

    2010-04-01

    Title 20 (Employees' Benefits), Federal Old-Age, Survivors and Disability Insurance (1950- ), Evidence of Age, Marriage, and Death, § 404.725 Evidence of a valid ceremonial marriage. (a) General. A valid ceremonial marriage is one that follows procedures set by law in...

  19. Haeckel or Hennig? The Gordian Knot of Characters, Development, and Procedures in Phylogeny.

    Science.gov (United States)

    Dupuis, Claude

    1984-01-01

    Discusses the conditions for validating customary phylogenetic procedures. Concludes that the requisites of homogeneity and completeness for proved short lineages seem satisfied by the Hennigian but not the Haeckelian procedure. The epistemological antinomy of the two procedures is emphasized for the first time. (Author/RH)

  20. A Turkish Version of the Critical-Care Pain Observation Tool: Reliability and Validity Assessment.

    Science.gov (United States)

    Aktaş, Yeşim Yaman; Karabulut, Neziha

    2017-08-01

    The study aim was to evaluate the validity and reliability of the Critical-Care Pain Observation Tool in critically ill patients. A repeated measures design was used for the study. A convenience sample of 66 patients who had undergone open-heart surgery in the cardiovascular surgery intensive care unit in Ordu, Turkey, was recruited for the study. The patients were evaluated by using the Critical-Care Pain Observation Tool at rest, during a nociceptive procedure (suctioning), and 20 minutes after the procedure while they were conscious and intubated after surgery. The Turkish version of the Critical-Care Pain Observation Tool has shown statistically acceptable levels of validity and reliability. Inter-rater reliability was supported by moderate-to-high-weighted κ coefficients (weighted κ coefficient = 0.55 to 1.00). For concurrent validity, significant associations were found between the scores on the Critical-Care Pain Observation Tool and the Behavioral Pain Scale scores. Discriminant validity was also supported by higher scores during suctioning (a nociceptive procedure) versus non-nociceptive procedures. The internal consistency of the Critical-Care Pain Observation Tool was 0.72 during a nociceptive procedure and 0.71 during a non-nociceptive procedure. The validity and reliability of the Turkish version of the Critical-Care Pain Observation Tool was determined to be acceptable for pain assessment in critical care, especially for patients who cannot communicate verbally. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.
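    For readers who wish to reproduce the inter-rater statistic reported above, the fragment below computes a weighted Cohen's kappa with scikit-learn; the two raters' CPOT scores are invented for illustration only.
    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical CPOT total scores (0-8) assigned by two raters to the same 10 patients.
    rater_a = np.array([2, 5, 4, 0, 6, 3, 1, 7, 2, 4])
    rater_b = np.array([2, 4, 4, 1, 6, 3, 1, 6, 3, 4])

    # Linearly weighted kappa penalises large disagreements more than small ones,
    # which suits ordinal pain scores.
    kappa_w = cohen_kappa_score(rater_a, rater_b, weights="linear")
    print(f"weighted kappa = {kappa_w:.2f}")
    ```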

  1. Application of industry-standard guidelines for the validation of avionics software

    Science.gov (United States)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  2. Procedural key steps in laparoscopic colorectal surgery, consensus through Delphi methodology

    NARCIS (Netherlands)

    Dijkstra, Frederieke A.; Bosker, Robbert J. I.; Veeger, Nicolaas J. G. M.; van Det, Marc J.; Pierie, Jean Pierre E. N.

    While several procedural training curricula in laparoscopic colorectal surgery have been validated and published, none have focused on dividing surgical procedures into well-identified segments, which can be trained and assessed separately. This enables the surgeon and resident to focus on a

  3. The Management Advisory Committee of the Inspection Validation Centre - fifth report

    International Nuclear Information System (INIS)

    1988-07-01

    The Management Advisory Committee of the Inspection Validation Centre (IVC/MAC) was set up by the Chairman of the UKAEA early in 1983 with terms of reference to review the policy, scope, procedure and operation of the Inspection Validation Centre, to supervise its operation and to advise and report to the UKAEA appropriately. The Inspection Validation Centre (IVC) has been established at the UKAEA Northern Research Laboratories, Risley for the purpose of validating the procedures, equipment and personnel proposed by the CEGB for use in the ultrasonic inspection at various stages of the fabrication, erection and operation of the CEGB's PWR reactor pressure vessel and such other components as are identified by the CEGB. This report, for 1987/8, states that the IVC has continued to make progress in the provision of the validation services as specified. (author)

  4. Development and validation of a new questionnaire assessing quality of life in adults with hypopituitarism: Adult Hypopituitarism Questionnaire (AHQ).

    Science.gov (United States)

    Ishii, Hitoshi; Shimatsu, Akira; Okimura, Yasuhiko; Tanaka, Toshiaki; Hizuka, Naomi; Kaji, Hidesuke; Hanew, Kunihiko; Oki, Yutaka; Yamashiro, Sayuri; Takano, Koji; Chihara, Kazuo

    2012-01-01

    To develop and validate the Adult Hypopituitarism Questionnaire (AHQ) as a disease-specific, self-administered questionnaire for evaluation of quality of life (QOL) in adult patients with hypopituitarism. We developed and validated this new questionnaire, using a standardized procedure which included item development, pilot-testing and psychometric validation. Of the patients who participated in psychometric validation, those whose clinical conditions were judged to be stable were asked to answer the survey questionnaire twice, in order to assess test-retest reliability. Content validity of the initial questionnaire was evaluated via two pilot tests. After these tests, we made minor revisions and finalized the initial version of the questionnaire. The questionnaire was constructed with two domains, one psycho-social and the other physical. For psychometric assessment, analyses were performed on the responses of 192 adult patients with various types of hypopituitarism. The intraclass correlations of the respective domains were 0.91 and 0.95, and the Cronbach's alpha coefficients were 0.96 and 0.95, indicating adequate test-retest reliability and internal consistency for each domain. For known-group validity, patients with hypopituitarism due to hypothalamic disorder showed significantly lower scores in 11 out of 13 sub-domains compared to those who had hypopituitarism due to pituitary disorder. Regarding construct validity, the domain structure was found to be almost the same as that initially hypothesized. Exploratory factor analysis (n = 228) demonstrated that each domain consisted of six and seven sub-domains. The AHQ showed good reliability and validity for evaluating QOL in adult patients with hypopituitarism.

  5. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-01

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. As a result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks in a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity), which can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs, has been developed. In this report, the technical background as well as the practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since these validation studies show that the TACOM measure properly quantifies the complexity of emergency tasks, the TACOM measure is expected to play an important role in improving the performance of human operators.

  6. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-15

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. As a result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks in a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity), which can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs, has been developed. In this report, the technical background as well as the practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since these validation studies show that the TACOM measure properly quantifies the complexity of emergency tasks, the TACOM measure is expected to play an important role in improving the performance of human operators.

  7. What is validation

    International Nuclear Information System (INIS)

    Clark, H.K.

    1985-01-01

    Criteria for establishing the validity of a computational method to be used in assessing nuclear criticality safety, as set forth in ''American Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors,'' ANSI/ANS-8.1-1983, are examined and discussed. Application of the criteria is illustrated by describing the procedures followed in deriving subcritical limits that have been incorporated in the Standard.

  8. Quality control and assurance for validation of DOS/I measurements

    Science.gov (United States)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  9. Technical skills assessment toolbox: a review using the unitary framework of validity.

    Science.gov (United States)

    Ghaderi, Iman; Manji, Farouq; Park, Yoon Soo; Juul, Dorthea; Ott, Michael; Harris, Ilene; Farrell, Timothy M

    2015-02-01

    The purpose of this study was to create a technical skills assessment toolbox for 35 basic and advanced skills/procedures that comprise the American College of Surgeons (ACS)/Association of Program Directors in Surgery (APDS) surgical skills curriculum and to provide a critical appraisal of the included tools, using the contemporary framework of validity. Competency-based training has become the predominant model in surgical education and assessment of performance is an essential component. Assessment methods must produce valid results to accurately determine the level of competency. A search was performed, using PubMed and Google Scholar, to identify tools that have been developed for assessment of the targeted technical skills. A total of 23 assessment tools for the 35 ACS/APDS skills modules were identified. Some tools, such as the Objective Structured Assessment of Technical Skill (OSATS) and the Operative Performance Rating System (OPRS), have been tested for more than 1 procedure. Therefore, 30 modules had at least 1 assessment tool, with some common surgical procedures being addressed by several tools. Five modules had none. Only 3 studies used Messick's framework to design their validity studies. The remaining studies used an outdated framework on the basis of "types of validity." When analyzed using the contemporary framework, few of these studies demonstrated validity for content, internal structure, and relationship to other variables. This study provides an assessment toolbox for common surgical skills/procedures. Our review shows that few authors have used the contemporary unitary concept of validity for development of their assessment tools. As we progress toward competency-based training, future studies should provide evidence for various sources of validity using the contemporary framework.

  10. Emergency procedures of nuclear power plants-Evolution

    International Nuclear Information System (INIS)

    Atalla, D.L.

    1988-01-01

    During the TMI event, the operators had difficulty in accurately diagnosing the accident, which delayed recovery of the plant and allowed conditions to deteriorate. Further analysis concluded that the plant emergency procedures were incomplete and did not cover the possibility of multiple and simultaneous failures. This paper covers a new approach for developing emergency procedures that creates a new general strategy by providing valid instructions for all kinds of possible incidents in a nuclear power plant. (author)

  11. [Ecological validity and multitasking environments in the evaluation of the executive functions].

    Science.gov (United States)

    Bombín-González, Igor; Cifuentes-Rodríguez, Alicia; Climent-Martínez, Gema; Luna-Lario, Pilar; Cardas-Ibáñez, Jaione; Tirapu-Ustárroz, Javier; Díaz-Orueta, Unai

    2014-07-16

    The evaluation of executive functions is a major issue in neuropsychological assessment, due to the role these functions play at the cognitive, behavioural and emotional level and their implication in daily life functioning. In order to perform a reliable assessment, the strategy traditionally followed for the evaluation of executive functions has been their atomization into different cognitive subprocesses, which is useful in a clinical or a research context. However, in clinical practice it is frequently artificial to break down a global and complex cognitive process, such as executive functions, into a variety of related components; thus, tests designed according to these theoretical processes have low value in clinical procedures (diagnosis, rehabilitation design) due to their poor correspondence with the subject's or patient's clinical reality. The aims of the present work are to revise the concept of ecological validity applied to the evaluation of executive functions, and to perform a critical review of executive function assessment by means of multitask paradigms as a way to increase the ecological validity and the predictive value for the subject's functional performance. After a historical review of the (low) ecological validity of single-task tests, and of the case made for a multitask paradigm in the evaluation of executive functions, the existing multitask tests are presented in detail (with their respective advantages and disadvantages). Finally, concrete recommendations on how to develop multitask tests in the future are presented, attending to concrete parameters related to the context, tasks, objectives, rules and scoring.

  12. Efficiency of performing pulmonary procedures in a shared endoscopy unit: procedure time, turnaround time, delays, and procedure waiting time.

    Science.gov (United States)

    Verma, Akash; Lee, Mui Yok; Wang, Chunhong; Hussein, Nurmalah B M; Selvi, Kalai; Tee, Augustine

    2014-04-01

    The purpose of this study was to assess the efficiency of performing pulmonary procedures in the endoscopy unit in a large teaching hospital. A prospective study from May 20 to July 19, 2013, was designed. The main outcome measures were procedure delays and their reasons, duration of procedural steps starting from the patient's arrival to the endoscopy unit, turnaround time, total case durations, and procedure wait time. A total of 65 procedures were observed. The most common procedure was BAL (61%), followed by TBLB (31%). Overall, procedures for 35 (53.8%) of 65 patients were delayed by ≥ 30 minutes, 21/35 (60%) because of "spillover" of the gastrointestinal and surgical cases into the time block of the pulmonary procedure. Time elapsed between the end of the pulmonary procedure and the start of the next procedure was ≥ 30 minutes in 8/51 (16%) of cases. In 18/51 (35%) patients there was no next case in the room after completion of the pulmonary procedure. The average idle time of the room between the end of the pulmonary procedure and the start of the next case (or the end of the shift at 5:00 PM if there was no next case) was 58 ± 53 minutes. In 17/51 (33%) patients the room's idle time was >60 minutes. A total of 52.3% of patients had a wait time >2 days and 11% had it ≥ 6 days, the reason in 15/21 (71%) of these cases being unavailability of a slot. Most pulmonary procedures were delayed due to spillover of the gastrointestinal and surgical cases into the block time allocated to pulmonary procedures. The most common reason for difficulty encountered in scheduling the pulmonary procedure was slot unavailability. This caused increased procedure waiting time. Strategies to reduce procedure delays and turnaround times, along with improved scheduling methods, may have a favorable impact on the volume of procedures performed in the unit, thereby optimizing the existing resources.

  13. The need for culture sensitive diagnostic procedures

    NARCIS (Netherlands)

    Zandi, Tekleh; Havenaar, Johan M.; Limburg-Okken, Annechien G.; van Es, Hans; Sidali, Salah; Kadri, Nadia; van den Brink, Wim; Kahn, Rene S.

    Objective: We examine the procedural validity of a standardized instrument for the diagnosis of psychotic disorders in Morocco. Method: Twenty-nine patients from Casablanca, Morocco, with a psychotic or mood disorder were examined using the Comprehensive Assessment of Symptoms and History (CASH) an

  14. Validation of a Video-based Game-Understanding Test Procedure in Badminton.

    Science.gov (United States)

    Blomqvist, Minna T.; Luhtanen, Pekka; Laakso, Lauri; Keskinen, Esko

    2000-01-01

    Reports the development and validation of video-based game-understanding tests in badminton for elementary and secondary students. The tests included different sequences that simulated actual game situations. Players had to solve tactical problems by selecting appropriate solutions and arguments for their decisions. Results suggest that the test…

  15. Development and validation of a virtual reality simulator: human factors input to interventional radiology training.

    Science.gov (United States)

    Johnson, Sheena Joanne; Guediri, Sara M; Kilkenny, Caroline; Clough, Peter J

    2011-12-01

    This study developed and validated a virtual reality (VR) simulator for use by interventional radiologists. Research in the area of skill acquisition identifies practice as essential to becoming a task expert. Studies on simulation show skills learned in VR can be successfully transferred to a real-world task. Recently, with improvements in technology, VR simulators have been developed to allow complex medical procedures to be practiced without risking the patient. Three studies are reported. In Study 1, 35 consultant interventional radiologists took part in a cognitive task analysis to empirically establish the key competencies of the Seldinger procedure. In Study 2, 62 participants performed one simulated procedure, and their performance was compared by expertise. In Study 3, the transferability of simulator training to a real-world procedure was assessed with 14 trainees. Study 1 produced 23 key competencies that were implemented as performance measures in the simulator. Study 2 showed the simulator had both face and construct validity, although some issues were identified. Study 3 showed the group that had undergone simulator training received significantly higher mean performance ratings on a subsequent patient procedure. The findings of this study support the centrality of validation in the successful design of simulators and show the utility of simulators as a training device. The studies show the key elements of a validation program for a simulator. In addition to task analysis and face and construct validities, the authors highlight the importance of transfer of training in validation studies.

  16. Validation of Procedures for Monitoring Crewmember Immune Function

    Science.gov (United States)

    Crucian, Brian; Stowe, Raymond; Mehta, Satish; Uchakin, Peter; Quiriarte, Heather; Pierson, Duane; Sams, Clarence

    2009-01-01

    There is ample evidence to suggest that space flight leads to immune system dysregulation, however the nature of the phenomenon as it equilibrates over longer flights has not been determined. This dysregulation may be a result of microgravity, confinement, physiological stress, radiation, environment or other mission-associated factors. The clinical risk (if any) for exploration-class space flight is unknown, but may include increased incidence of infection, allergy, hypersensitivity, hematological malignancy or altered wound healing. The objective of this Supplemental Medical Objective (SMO) is to determine the status of the immune system, physiological stress and latent viral reactivation (a clinical outcome that can be measured) during both short and long-duration spaceflight. In addition, this study will develop and validate an immune monitoring strategy consistent with operational flight requirements and constraints. Pre-mission, in-flight and post-flight blood and saliva samples will be obtained from participating crewmembers. Assays included peripheral immunophenotype, T cell function, cytokine profiles (RNA, intracellular, secreted), viral-specific immunity, latent viral reactivation (EBV, CMV, VZV), and stress hormone measurements. This study is currently ongoing. To date, 10 short duration and 5 long-duration crewmembers have completed the study. Technically, the study is progressing well. In-flight blood samples are being collected, and returned for analysis, including functional assays that require live cells. For all in-flight samples to date, sample viability has been acceptable. Preliminary data (n = 4/7; long/short duration, respectively) indicate that distribution of most peripheral leukocyte subsets is largely unaltered during flight. Exceptions include elevated T cells, reduced B/NK cells, increased memory T cells and increased central memory CD8+ T cells. General T cell function, early blastogenesis response to mitogenic stimulation, is markedly

  17. Validity evidence and reliability of a simulated patient feedback instrument.

    Science.gov (United States)

    Schlegel, Claudia; Woermann, Ulrich; Rethans, Jan-Joost; van der Vleuten, Cees

    2012-01-27

    In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients.
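    The generalizability coefficient cited above can be estimated from a crossed design in which every judge rates every feedback episode. The sketch below implements the standard one-facet (object × rater) variance-component estimate and the relative G coefficient for a chosen number of raters; the rating matrix is simulated and the analysis is a generic illustration, not the authors' actual G-study.
    ```python
    import numpy as np

    def g_coefficient(scores: np.ndarray, n_raters_decision: int) -> float:
        """Relative G coefficient for a fully crossed (object x rater) design.

        scores: (n_objects, n_raters) matrix of ratings.
        n_raters_decision: number of raters assumed in the decision study.
        """
        n_p, n_r = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1)
        col_means = scores.mean(axis=0)

        ms_p = n_r * np.sum((row_means - grand) ** 2) / (n_p - 1)
        resid = scores - row_means[:, None] - col_means[None, :] + grand
        ms_pr = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))

        var_p = max((ms_p - ms_pr) / n_r, 0.0)   # object (true-score) variance
        var_pr = ms_pr                           # interaction/residual variance
        # The rater main effect would additionally enter the absolute (phi) coefficient;
        # it is not needed for the relative coefficient computed here.
        return var_p / (var_p + var_pr / n_raters_decision)

    # Simulated ratings: 6 feedback episodes scored by 10 judges on a 1-5 scale.
    rng = np.random.default_rng(1)
    true_quality = rng.normal(3.0, 0.8, size=(6, 1))
    ratings = np.clip(np.round(true_quality + rng.normal(0.0, 0.5, size=(6, 10))), 1, 5)

    print("relative G with 2 judges:", round(g_coefficient(ratings, n_raters_decision=2), 2))
    ```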

  18. Verification and software validation for nuclear instrumentation

    International Nuclear Information System (INIS)

    Gaytan G, E.; Salgado G, J. R.; De Andrade O, E.; Ramirez G, A.

    2014-10-01

    This work presents a methodology for the verification and validation of software to be applied to nuclear instruments with associated software. The methodology was developed under the auspices of the IAEA, through the regional projects RLA4022 (ARCAL XCIX) and RLA1011 (RLA CXXIII), led by Mexico. In the first project, three plans and three procedures were elaborated taking IEEE standards into consideration, and in the second project these documents were updated considering ISO and IEC standards. The developed methodology has been distributed to the Latin American countries participating in the ARCAL projects, and two related courses have been given with the participation of several countries and of Mexican institutions such as the Instituto Nacional de Investigaciones Nucleares (ININ), the Comision Federal de Electricidad (CFE) and the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). At the ININ, owing to the need to work under software quality assurance for systems of the CFE nuclear power plant, a Software Quality Assurance Plan and five procedures were developed in 2004, qualifying the ININ for software development for the CFE nuclear power plant. These first documents were developed taking IEEE standards and NRC regulatory guides as references, and were the first step in the development of the methodology. (Author)

  19. Interlaboratory study for the validation of an ecotoxicological procedure to monitor the quality of septic sludge received at a wastewater treatment plant.

    Science.gov (United States)

    Robidoux, P Y; Choucri, A; Bastien, C; Sunahara, G I; López-Gastey, J

    2001-01-01

    Septic tank sludge is regularly hauled to the Montreal Urban Community (MUC) wastewater treatment plant. It is then discharged and mixed with the wastewater inflow before entering the primary chemical treatment process. An ecotoxicological procedure integrating chemical and toxicological analyses has been recently developed and applied to screen for the illicit discharge of toxic substances in septic sludge. The toxicity tests used were the Microtox, the bacterial-respiration, and the lettuce (Lactuca sativa) root elongation tests. In order to validate the applicability of the proposed procedure, a two-year interlaboratory study was carried out. In general, the results obtained by two independent laboratories (MUC and the Centre d'expertise en analyse environnementale du Quebec) were comparable and reproducible. Some differences were found using the Microtox test. Septic sludge spiked with organic (e.g., phenol and formaldehyde) and inorganic (e.g., nickel and cyanide) substances was detected with good reliability and high efficiency. The relative efficiency in detecting spiked substances was > 70%, which confirms the results of previous studies. In addition, the respiration test was the most efficient toxicological tool for detecting spiked substances, whereas the Microtox was the least efficient in septic sludge.

  20. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  1. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Directory of Open Access Journals (Sweden)

    Tanaka Ken-ichi

    2017-01-01

    Reliable information on the radioactivity inventory, obtained from radiological characterization, is important for planning decommissioning and is also crucial for carrying out decommissioning effectively and safely. This information is referred to when planning the decommissioning strategy and when applying to the regulator, and it can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for the evaluation of neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified against measurements. The evaluation of neutron-activated materials can be divided into two processes: one is the calculation of the neutron-flux distribution, and the other is the activation calculation of the materials. The neutron-flux distribution is calculated with neutron transport codes and an appropriate cross-section library so as to simulate the neutron transport phenomena well. Using the neutron-flux distribution, we then calculate the distribution of the radioactivity concentration, and we also estimate the time-dependent distribution of the radioactivity classification and the radioactive-waste classification. The information obtained from the evaluation is used in the other preparatory tasks to make the decommissioning plan and the decommissioning activities safe and rational.

  2. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Science.gov (United States)

    Tanaka, Ken-ichi; Ueno, Jun

    2017-09-01

    Reliable information on the radioactivity inventory, obtained from radiological characterization, is important for planning decommissioning and is also crucial for carrying out decommissioning effectively and safely. This information is referred to when planning the decommissioning strategy and when applying to the regulator, and it can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for the evaluation of neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified against measurements. The evaluation of neutron-activated materials can be divided into two processes: one is the calculation of the neutron-flux distribution, and the other is the activation calculation of the materials. The neutron-flux distribution is calculated with neutron transport codes and an appropriate cross-section library so as to simulate the neutron transport phenomena well. Using the neutron-flux distribution, we then calculate the distribution of the radioactivity concentration, and we also estimate the time-dependent distribution of the radioactivity classification and the radioactive-waste classification. The information obtained from the evaluation is used in the other preparatory tasks to make the decommissioning plan and the decommissioning activities safe and rational.
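    As a very rough illustration of the activation step described above, the activity of a single (n,γ) product at a point can be written as A = N·σ·φ·(1 − e^(−λ·t_irr))·e^(−λ·t_cool). Real characterizations rely on neutron transport and activation codes with full cross-section libraries, so the sketch below, with an assumed flux, cobalt content and cross section, is only meant to show the shape of the calculation.
    ```python
    import numpy as np

    BARN = 1.0e-24            # cm^2
    AVOGADRO = 6.022e23       # 1/mol

    def specific_activity(flux, sigma_barn, half_life_s, t_irr_s, t_cool_s,
                          abundance, molar_mass, mass_fraction):
        """Activity per gram of material [Bq/g] from a single (n,gamma) activation chain.

        A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_cool)
        """
        lam = np.log(2.0) / half_life_s
        n_target = mass_fraction * abundance * AVOGADRO / molar_mass   # target nuclei per gram
        return (n_target * sigma_barn * BARN * flux
                * (1.0 - np.exp(-lam * t_irr_s)) * np.exp(-lam * t_cool_s))

    # Illustrative numbers (assumed, not plant data): Co-59(n,g)Co-60 in steel containing
    # 0.05 wt% cobalt, thermal flux 1e13 n/cm^2/s, 30 years irradiation, 5 years cooling.
    # Co-60 half-life ~5.27 y, thermal cross section ~37 b.
    year = 3.156e7
    a = specific_activity(flux=1e13, sigma_barn=37.0, half_life_s=5.27 * year,
                          t_irr_s=30 * year, t_cool_s=5 * year,
                          abundance=1.0, molar_mass=58.93, mass_fraction=5e-4)
    print(f"Co-60 specific activity ~ {a:.3e} Bq/g")
    ```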

  3. A procedure for the significance testing of unmodeled errors in GNSS observations

    Science.gov (United States)

    Li, Bofeng; Zhang, Zhetao; Shen, Yunzhong; Yang, Ling

    2018-01-01

    It is a crucial task to establish a precise mathematical model for global navigation satellite system (GNSS) observations in precise positioning. Due to the spatiotemporal complexity of, and limited knowledge on, systematic errors in GNSS observations, some residual systematic errors would inevitably remain even after correction with empirical models and parameterization. These residual systematic errors are referred to as unmodeled errors. However, most of the existing studies mainly focus on handling the systematic errors that can be properly modeled and simply ignore the unmodeled errors that may actually exist. To further improve the accuracy and reliability of GNSS applications, such unmodeled errors must be handled, especially when they are significant. Therefore, the very first question is how to statistically validate the significance of unmodeled errors. In this research, we propose a procedure to examine the significance of these unmodeled errors by the combined use of hypothesis tests. With this testing procedure, three components of unmodeled errors, i.e., the nonstationary signal, the stationary signal and white noise, are identified. The procedure is tested by using simulated data and real BeiDou datasets with varying error sources. The results show that the unmodeled errors can be discriminated by our procedure with approximately 90% confidence. The efficiency of the proposed procedure is further confirmed by applying the time-domain Allan variance analysis and the frequency-domain fast Fourier transform. In summary, spatiotemporally correlated unmodeled errors are commonly present in GNSS observations and are mainly governed by residual atmospheric biases and multipath. Their patterns may also be affected by the receiver.
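    The paper's combined hypothesis tests are not reproduced here, but the flavour of testing whether GNSS residuals contain a statistically significant unmodeled signal rather than pure white noise can be illustrated with a simple portmanteau (Ljung–Box) whiteness test on a residual series; the simulated residuals and the chosen lag below are assumptions for demonstration only.
    ```python
    import numpy as np
    from scipy.stats import chi2

    def ljung_box(residuals: np.ndarray, max_lag: int = 20):
        """Portmanteau test for whiteness of a residual series.

        Returns the Q statistic and its p-value under the chi-square(max_lag) null.
        """
        x = residuals - residuals.mean()
        n = len(x)
        acf = np.array([np.sum(x[k:] * x[:-k]) for k in range(1, max_lag + 1)]) / np.sum(x * x)
        q = n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, max_lag + 1)))
        return q, chi2.sf(q, df=max_lag)

    # Simulated carrier-phase residuals [m]: white noise plus a slowly varying
    # multipath-like component standing in for an unmodeled systematic error.
    rng = np.random.default_rng(2)
    t = np.arange(3000)
    resid = 0.003 * rng.standard_normal(t.size) + 0.002 * np.sin(2 * np.pi * t / 600)

    q, p = ljung_box(resid, max_lag=30)
    print(f"Q = {q:.1f}, p = {p:.3g}  ->  unmodeled signal significant at 5%: {p < 0.05}")
    ```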

  4. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  5. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  6. Validating the applicability of the GUM procedure

    Science.gov (United States)

    Cox, Maurice G.; Harris, Peter M.

    2014-08-01

    This paper is directed at practitioners seeking a degree of assurance in the quality of the results of an uncertainty evaluation when using the procedure in the Guide to the Expression of Uncertainty in Measurement (GUM) (JCGM 100 : 2008). Such assurance is required in adhering to general standards such as International Standard ISO/IEC 17025 or other sector-specific standards. We investigate the extent to which such assurance can be given. For many practical cases, a measurement result incorporating an evaluated uncertainty that is correct to one significant decimal digit would be acceptable. Any quantification of the numerical precision of an uncertainty statement is naturally relative to the adequacy of the measurement model and the knowledge used of the quantities in that model. For general univariate and multivariate measurement models, we emphasize the use of a Monte Carlo method, as recommended in GUM Supplements 1 and 2. One use of this method is as a benchmark in terms of which measurement results provided by the GUM can be assessed in any particular instance. We mainly consider measurement models that are linear in the input quantities, or have been linearized and the linearization process is deemed to be adequate. When the probability distributions for those quantities are independent, we indicate the use of other approaches such as convolution methods based on the fast Fourier transform and, particularly, Chebyshev polynomials as benchmarks.
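    The comparison advocated above, between the linearized GUM evaluation and a Monte Carlo benchmark in the style of GUM Supplement 1, can be sketched for a toy measurement model; the model R = V/I and the input values below are assumptions chosen for illustration only.
    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    M = 1_000_000

    # Simple illustrative model: resistance R = V / I with independent inputs.
    V_mean, u_V = 5.000, 0.010        # volts, standard uncertainty
    I_mean, u_I = 0.1000, 0.0005      # amperes, standard uncertainty

    # GUM law of propagation of uncertainty (first-order linearization).
    R_gum = V_mean / I_mean
    u_gum = R_gum * np.sqrt((u_V / V_mean) ** 2 + (u_I / I_mean) ** 2)

    # GUM Supplement 1 style Monte Carlo: propagate the full input distributions.
    V = rng.normal(V_mean, u_V, M)
    I = rng.normal(I_mean, u_I, M)
    R = V / I
    u_mc = R.std(ddof=1)
    lo, hi = np.percentile(R, [2.5, 97.5])   # 95 % coverage interval

    print(f"GUM:  R = {R_gum:.3f} ohm, u = {u_gum:.3f} ohm")
    print(f"MC :  R = {R.mean():.3f} ohm, u = {u_mc:.3f} ohm, 95% interval [{lo:.3f}, {hi:.3f}]")
    ```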

  7. The Management Advisory Committee of the Inspection Validation Centre. 2nd report

    International Nuclear Information System (INIS)

    1985-06-01

    The document is the second report of the Management Advisory Committee of the Inspection Validation Centre (I.V.C.). The IVC is concerned with the ultrasonic inspection of the CEGB's proposed PWR reactor pressure vessel, and other components. The report deals with the technical progress since May 1984, and includes: interim validation, retrospective validation, examination of procedures, test assembly manufacture, interim validation of manual forging inspections, and validation facilities. (U.K.)

  8. A Fourier-based textural feature extraction procedure

    Science.gov (United States)

    Stromberg, W. D.; Farr, T. G.

    1986-01-01

    A procedure is presented to discriminate and characterize regions of uniform image texture. The procedure utilizes textural features consisting of pixel-by-pixel estimates of the relative emphases of annular regions of the Fourier transform. The utility and derivation of the features are described through presentation of a theoretical justification of the concept followed by a heuristic extension to a real environment. Two examples are provided that validate the technique on synthetic images and demonstrate its applicability to the discrimination of geologic texture in a radar image of a tropical vegetated area.
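    A minimal version of the feature described above, the relative spectral power in annular regions of the Fourier transform, is sketched below for whole images; the paper computes such estimates pixel by pixel (i.e., within a moving window), so this global variant, the ring count and the synthetic test images are simplifying assumptions.
    ```python
    import numpy as np

    def annular_fourier_features(image: np.ndarray, n_rings: int = 6) -> np.ndarray:
        """Relative spectral power in concentric annuli of the 2-D Fourier transform.

        Returns n_rings values summing to 1; low rings capture coarse texture,
        high rings capture fine texture.
        """
        power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
        h, w = image.shape
        yy, xx = np.indices((h, w))
        r = np.hypot(yy - h / 2, xx - w / 2)
        ring = np.minimum((r / r.max() * n_rings).astype(int), n_rings - 1)
        features = np.bincount(ring.ravel(), weights=power.ravel(), minlength=n_rings)
        return features / features.sum()

    # Illustrative use on synthetic images: fine-grained noise vs. a smooth gradient.
    rng = np.random.default_rng(4)
    noise = rng.standard_normal((64, 64))
    gradient = np.outer(np.linspace(0, 1, 64), np.ones(64))
    print("noise   :", np.round(annular_fourier_features(noise), 3))
    print("gradient:", np.round(annular_fourier_features(gradient), 3))
    ```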

  9. Study of the microstructure of neutron irradiated beryllium for the validation of the ANFIBE code

    International Nuclear Information System (INIS)

    Rabaglino, E.; Ferrero, C.; Reimann, J.; Ronchi, C.; Schulenberg, T.

    2002-01-01

    The behaviour of beryllium under fast neutron irradiation is a key issue of the helium cooled pebble bed tritium breeding blanket, due to the production of large quantities of helium and of a non-negligible amount of tritium. To optimise the design, a reliable prediction of swelling due to helium bubbles and of tritium inventory during normal and off-normal operation of a fusion power reactor is needed. The ANFIBE code (ANalysis of Fusion Irradiated BEryllium) is being developed to meet this need. The code has to be applied in a range of irradiation conditions where no experimental data are available, therefore a detailed gas kinetics model, and a specific and particularly careful validation strategy are needed. The validation procedure of the first version of the code was based on macroscopic data of swelling and tritium release. This approach is, however, incomplete, since a verification of the microscopic behaviour of the gas in the metal is necessary to obtain a reliable description of swelling. This paper discusses a general strategy for a thorough validation of the gas kinetics models in ANFIBE. The microstructure characterisation of weakly irradiated beryllium pebbles, with different visual examination techniques, is then presented as an example of the application of this strategy. In particular, the advantage of developing 3D techniques, such as X-ray microtomography, is demonstrated

  10. The validation of evacuation simulation models through the analysis of behavioural uncertainty

    International Nuclear Information System (INIS)

    Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino

    2014-01-01

    Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study for the comparison between repeated experimental data and simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire, which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study
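    One simple way to confront the scatter of repeated experiments with the scatter of repeated stochastic simulation runs, in the spirit of the behavioural-uncertainty comparison described above, is a two-sample test on the resulting evacuation-time distributions. The normal distributions and sample sizes below are assumptions, and the authors' own procedure works on FDS+Evac output rather than synthetic numbers.
    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(5)

    # Hypothetical total evacuation times [s] from 40 repeated experiments and
    # 40 repeated stochastic simulation runs of the same scenario.
    t_experiment = rng.normal(95, 8, 40)
    t_simulation = rng.normal(98, 10, 40)

    res = ks_2samp(t_experiment, t_simulation)
    print(f"KS statistic = {res.statistic:.2f}, p = {res.pvalue:.2f}")
    # A small statistic / large p-value means the behavioural scatter of the model
    # is statistically compatible with the scatter observed across repeated trials.
    ```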

  11. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays the procedure on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error would lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri Nets when both are applied to a computerized emergency operating procedure. A program for converting a Computerized Procedure (CP) to an STPN has also been developed. The formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS increase.
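    The STPN formalism itself is not reproduced here, but the basic Petri-net mechanics such a model builds on (places holding tokens, transitions enabled only when all their input places are marked) can be sketched in a few lines; the place and transition names below are invented to suggest a procedure step gated by a plant condition.
    ```python
    # Minimal sketch (not the authors' STPN formalism): a Petri net as places holding
    # tokens and transitions with input/output places; a transition may fire only when
    # every input place holds a token, mimicking a procedure step whose instruction
    # and plant-state preconditions are both satisfied.
    from dataclasses import dataclass, field

    @dataclass
    class PetriNet:
        marking: dict                                     # place -> token count
        transitions: dict = field(default_factory=dict)   # name -> (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) > 0 for p in inputs)

        def fire(self, name):
            if not self.enabled(name):
                raise ValueError(f"transition {name!r} is not enabled")
            inputs, outputs = self.transitions[name]
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    net = PetriNet(
        marking={"step1_active": 1, "pump_running": 1},
        transitions={
            "complete_step1": (["step1_active", "pump_running"],
                               ["step2_active", "pump_running"]),
        },
    )
    net.fire("complete_step1")
    print(net.marking)   # {'step1_active': 0, 'pump_running': 1, 'step2_active': 1}
    ```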

  12. Linear Unlearning for Cross-Validation

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Larsen, Jan

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. In this paper we suggest linear unlearning of examples as an approach to approximative cross-validation. Further, we discuss...... time series prediction benchmark demonstrate the potential of the linear unlearning technique...
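    As a point of reference for what linear unlearning approximates, the sketch below computes leave-one-out cross-validation for a ridge model both by repeated retraining and by the closed-form shortcut e_i / (1 − h_ii) available for linear smoothers, which removes an example's influence analytically instead of by retraining; the data, the ridge penalty and the absence of an intercept are assumptions chosen so that the two estimates coincide.
    ```python
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(6)
    X = rng.normal(size=(50, 4))
    y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.3 * rng.standard_normal(50)
    alpha = 1.0

    # Exact leave-one-out by brute force: one retraining per example (this is the
    # replicated-training cost that linear unlearning is designed to avoid).
    loo_mse = -cross_val_score(Ridge(alpha=alpha, fit_intercept=False), X, y,
                               cv=LeaveOneOut(), scoring="neg_mean_squared_error").mean()

    # Closed-form leave-one-out for a linear smoother: residual_i / (1 - h_ii).
    H = X @ np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T)
    resid = y - H @ y
    loo_closed = np.mean((resid / (1.0 - np.diag(H))) ** 2)

    print(f"brute-force LOO MSE: {loo_mse:.4f}")
    print(f"closed-form LOO MSE: {loo_closed:.4f}")
    ```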

  13. The predictive and discriminant validity of the zone of proximal development.

    Science.gov (United States)

    Meijer, J; Elshout, J J

    2001-03-01

    Dynamic measurement procedures are supposed to uncover the zone of proximal development and to increase predictive validity in comparison to conventional, static measurement procedures. Two alternative explanations for the discrepancies between static and dynamic measurements were investigated. The first focuses on Vygotsky's learning potential theory; the second considers the role of anxiety tendency during test taking. If test-anxious tendencies are mitigated by dynamic testing procedures, in particular the availability of assistance, the concept of the zone of proximal development may be superfluous in explaining the differences between the outcomes of static and dynamic measurement. Participants were students from secondary education in the Netherlands. They were tested repeatedly in grade three as well as in grade four. Participants were between 14 and 17 years old; their average age was 15.4 years with a standard deviation of 0.52. Two types of mathematics tests were used in a longitudinal experiment. The first type of test consisted of open-ended items, which participants had to solve completely on their own. With the second type of test, assistance was available to participants during the test. The latter so-called learning test was conceived of as a dynamic testing procedure. Furthermore, a test anxiety questionnaire was administered repeatedly. Structural equation modelling was used to analyse the data. Apart from emotionality and worry, lack of self-confidence appears to be an important constituent of test anxiety. The learning test appears to contribute to the predictive validity of conventional tests and thus part of Vygotsky's claims was substantiated. Moreover, the mere inclusion of a test anxiety factor into an explanatory model for the gathered data is not sufficient. Apart from test anxiety and mathematical ability it is necessary to assume a factor which may be construed as mathematics learning potential. The results indicate that the observed

  14. Image-guided procedures in brain biopsy.

    Science.gov (United States)

    Fujita, K; Yanaka, K; Meguro, K; Narushima, K; Iguchi, M; Nakai, Y; Nose, T

    1999-07-01

    Image-guided procedures, such as computed tomography (CT)-guided stereotactic and ultrasound-guided methods, can assist neurosurgeons in localizing the relevant pathology. The characteristics of image-guided procedures are important for their appropriate use, especially in brain biopsy. This study reviewed the results of various image-guided brain biopsies to ascertain the advantages and disadvantages. Brain biopsies assisted by CT-guided stereotactic, ultrasound-guided, Neuronavigator-guided, and the combination of ultrasound and Neuronavigator-guided procedures were carried out in seven, eight, one, and three patients, respectively. Four patients underwent open biopsy without a guiding system. Twenty of 23 patients had a satisfactory diagnosis after the initial biopsy. Three patients failed to have a definitive diagnosis after the initial procedure, one due to insufficient volume sampling after the CT-guided procedure, and two due to localization failure by ultrasound because the lesions were nonechogenic. All patients who underwent biopsy using the combination of ultrasound and Neuronavigator-guided methods had a satisfactory result. The CT-guided procedure provided an efficient method of approaching any intracranial target and was appropriate for the diagnosis of hypodense lesions, but tissue sampling was sometimes not sufficient to achieve a satisfactory diagnosis. The ultrasound-guided procedure was suitable for the investigation of hyperdense lesions, but it was difficult to localize nonechogenic lesions. The combination of ultrasound and Neuronavigator methods improved the diagnostic accuracy even in nonechogenic lesions such as malignant lymphoma. Therefore, it is essential to choose the most appropriate guiding method for brain biopsy according to the radiological nature of the lesions.

  15. Development of automated operating procedure system using fuzzy colored petri nets for nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Seong, Poong Hyun

    2004-01-01

    In this work, the AuTomated Operating Procedure System (ATOPS) is developed. ATOPS is an automation system for emergency operation of a nuclear power plant (NPP); it can monitor signals, diagnose statuses, and generate control actions according to the corresponding operating procedures without any human operator's help. The main functions of ATOPS are an anomaly detection function and a procedure execution function, but only the procedure execution function is implemented here, as this work is only the first step. In the procedure execution function, operating procedures of NPPs are analyzed and modeled using Fuzzy Colored Petri Nets (FCPN) and executed depending on the decision making of the inference engine. In this work, an ATOPS prototype is developed to demonstrate its feasibility, and it is also validated using the FISA-2/WS simulator. The validation is performed for the cases of a loss of coolant accident (LOCA) and a steam generator tube rupture (SGTR). The simulation results show that ATOPS works correctly in the emergency situations

  16. Motive Criminal Procedure Evidence

    Directory of Open Access Journals (Sweden)

    В. В. Вапнярчук

    2015-03-01

    Full Text Available The article substantiates the need for a particular level of mental regulation of the behaviour of proving, namely motivation. The latter refers to the internal, conscious motivation of the subject of criminal procedural proof, conditioned by the specific needs, interests and goals that prompt a person to act with determination. Detailed attention is given to the first two determinants, namely the nature of needs and interests. In particular, the varieties of needs highlighted in the literature (physiological, existential, social, prestige-related, cognitive, aesthetic and spiritual) are analyzed, as well as the manifestation of some of them in criminal procedural proof.

  17. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  18. Analytic Validation of Immunohistochemistry Assays: New Benchmark Data From a Survey of 1085 Laboratories.

    Science.gov (United States)

    Stuart, Lauren N; Volmar, Keith E; Nowak, Jan A; Fatheree, Lisa A; Souers, Rhona J; Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    - A cooperative agreement between the College of American Pathologists (CAP) and the United States Centers for Disease Control and Prevention was undertaken to measure laboratories' awareness and implementation of an evidence-based laboratory practice guideline (LPG) on immunohistochemical (IHC) validation practices published in 2014. - To establish new benchmark data on IHC laboratory practices. - A 2015 survey on IHC assay validation practices was sent to laboratories subscribed to specific CAP proficiency testing programs and to additional nonsubscribing laboratories that perform IHC testing. Specific questions were designed to capture laboratory practices not addressed in a 2010 survey. - The analysis was based on responses from 1085 laboratories that perform IHC staining. Ninety-six percent (809 of 844) always documented validation of IHC assays. Sixty percent (648 of 1078) had separate procedures for predictive and nonpredictive markers, 42.7% (220 of 515) had procedures for laboratory-developed tests, 50% (349 of 697) had procedures for testing cytologic specimens, and 46.2% (363 of 785) had procedures for testing decalcified specimens. Minimum case numbers were specified by 85.9% (720 of 838) of laboratories for nonpredictive markers and 76% (584 of 768) for predictive markers. Median concordance requirements were 95% for both types. For initial validation, 75.4% (538 of 714) of laboratories adopted the 20-case minimum for nonpredictive markers and 45.9% (266 of 579) adopted the 40-case minimum for predictive markers as outlined in the 2014 LPG. The most common method for validation was correlation with morphology and expected results. Laboratories also reported which assay changes necessitated revalidation and their minimum case requirements. - Benchmark data on current IHC validation practices and procedures may help laboratories understand the issues and influence further refinement of LPG recommendations.

  19. Cross validation in LULOO

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Nørgård, Peter Magnus; Hansen, Lars Kai

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. Linear unlearning of examples has recently been suggested as an approach to approximative cross-validation. Here we briefly review...... the linear unlearning scheme, dubbed LULOO, and we illustrate it on a systemidentification example. Further, we address the possibility of extracting confidence information (error bars) from the LULOO ensemble....

  20. The necessity to rationalize library procedures and activities

    OpenAIRE

    Matjaž Žaucer

    2007-01-01

    The necessity of rationalizing work and procedures in libraries, due to the altered requirements and needs of users and to the increasingly demanding environment that is to be expected, is presented in the article. The causes of the changes in librarianship and of the changes in the environment are stated. The article analyzes the searching procedure in the Slovenian union bibliographic and catalogue database using the UDC. Upon a practical experiment it was established that...

  1. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure is limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data as well as the consideration of them in a validation procedure is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches, varying distribution functions of parameters and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or the correlation coefficients a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an
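
    A schematic illustration of the Monte Carlo sampling idea described above (not the authors' code, and with a linearised stand-in for the full k_eff transport calculation) might look as follows; all sensitivities and distributions are invented for the sketch.

    # Schematic illustration: estimating correlations between benchmark experiments
    # that share parts of their setup by Monte Carlo sampling of shared and
    # experiment-specific parameters. The linear "response" below is a stand-in
    # for a full k_eff calculation.
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 5000

    # Shared parameter (e.g. a common fuel enrichment uncertainty) and
    # experiment-specific parameters (e.g. individual geometry tolerances).
    shared = rng.normal(0.0, 1.0, size=n_samples)
    specific = rng.normal(0.0, 1.0, size=(n_samples, 3))

    # Assumed sensitivities of three experiments in a series to those parameters.
    sens_shared = np.array([0.8, 0.6, 0.7])
    sens_specific = np.array([0.3, 0.5, 0.4])

    # Sampled k_eff perturbations for each experiment (linearised response).
    dk = shared[:, None] * sens_shared + specific * sens_specific

    corr = np.corrcoef(dk, rowvar=False)
    print("estimated experiment-to-experiment correlation matrix:\n", corr.round(3))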

  2. Generation of integral experiment covariance data and their impact on criticality safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-01

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure is limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data as well as the consideration of them in a validation procedure is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches, varying distribution functions of parameters and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or the correlation coefficients a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application

  3. Simple multicomponent batch distillation procedure with a variable reflux policy

    Directory of Open Access Journals (Sweden)

    A. N. García

    2014-06-01

    Full Text Available This paper describes a shortcut procedure for batch distillation simulation with a variable reflux policy. The procedure starts from the shortcut method developed by Sundaram and Evans in 1993 and uses an iterative cycle to calculate the reflux ratio at each moment. The functional relationship between the concentrations at the bottom and at the top of the column is evaluated using the Fenske equation and is complemented with the equations proposed by Underwood and Gilliland. The results of this procedure are consistent with those obtained using a fast method widely validated in the relevant literature.
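
    As a hedged numerical illustration of two of the building blocks named above, the sketch below evaluates the Fenske minimum-stage estimate and the Gilliland correlation (in Eduljee's explicit form) for a hypothetical light-key/heavy-key split. It is not the authors' full variable-reflux batch procedure, which additionally requires the Underwood minimum-reflux calculation at each time step.

    # Minimal numeric sketch with hypothetical compositions, not the authors'
    # full variable-reflux batch procedure. Fenske gives the minimum number of
    # ideal stages; the Gilliland correlation (Eduljee form) relates the actual
    # reflux ratio to the actual stage requirement.
    import math

    def fenske_min_stages(xd_lk, xd_hk, xb_lk, xb_hk, alpha_avg):
        """Minimum number of ideal stages for a light-key/heavy-key split."""
        return math.log((xd_lk / xd_hk) * (xb_hk / xb_lk)) / math.log(alpha_avg)

    def gilliland_stages(n_min, reflux, reflux_min):
        """Actual stages from the Gilliland correlation in Eduljee's form."""
        x = (reflux - reflux_min) / (reflux + 1.0)
        y = 0.75 * (1.0 - x ** 0.5668)          # Y = (N - Nmin) / (N + 1)
        return (n_min + y) / (1.0 - y)

    # Hypothetical snapshot of a batch still at one instant:
    n_min = fenske_min_stages(xd_lk=0.95, xd_hk=0.05, xb_lk=0.10, xb_hk=0.90, alpha_avg=2.5)
    n_actual = gilliland_stages(n_min, reflux=3.0, reflux_min=1.2)
    print(f"Fenske minimum stages: {n_min:.2f}")
    print(f"Gilliland estimate of actual stages: {n_actual:.2f}")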

  4. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  5. Calibration and validation of a model describing complete autotrophic nitrogen removal in a granular SBR system

    DEFF Research Database (Denmark)

    Vangsgaard, Anna Katrine; Mutlu, Ayten Gizem; Gernaey, Krist

    2013-01-01

    BACKGROUND: A validated model describing the nitritation-anammox process in a granular sequencing batch reactor (SBR) system is an important tool for: a) design of future experiments and b) prediction of process performance during optimization, while applying process control, or during system scale......-up. RESULTS: A model was calibrated using a step-wise procedure customized for the specific needs of the system. The important steps in the procedure were initialization, steady-state and dynamic calibration, and validation. A fast and effective initialization approach was developed to approximate pseudo...... screening of the parameter space proposed by Sin et al. (2008) - to find the best fit of the model to dynamic data. Finally, the calibrated model was validated with an independent data set. CONCLUSION: The presented calibration procedure is the first customized procedure for this type of system...
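
    The step-wise calibration mentioned above relies on a Monte Carlo based screening of the parameter space; the following toy sketch illustrates that generic idea with an invented first-order model and invented bounds. It is not the procedure of Sin et al. (2008) nor the actual nitritation-anammox model.

    # Generic sketch of Monte Carlo screening of a parameter space: sample
    # candidate parameter sets within assumed bounds, simulate, and rank the
    # candidates by their fit to measured data. All bounds and data are invented.
    import numpy as np

    rng = np.random.default_rng(7)

    def toy_model(params, t):
        """Placeholder first-order removal model standing in for the full SBR model."""
        k, c0 = params
        return c0 * np.exp(-k * t)

    t_obs = np.linspace(0.0, 6.0, 13)                       # h
    c_obs = 50.0 * np.exp(-0.45 * t_obs) + rng.normal(0, 1.0, t_obs.size)

    bounds = np.array([[0.1, 1.0],      # k   (1/h)
                       [30.0, 70.0]])   # c0  (mg/L)

    n_candidates = 2000
    samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_candidates, 2))

    sse = np.array([np.sum((toy_model(p, t_obs) - c_obs) ** 2) for p in samples])
    best = samples[np.argsort(sse)[:5]]
    print("five best candidate parameter sets (k, c0):\n", best.round(3))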

  6. Human factor analysis related to new symptom based procedures used by control room crews during treatment of emergency states

    International Nuclear Information System (INIS)

    Holy, J.

    1999-01-01

    New symptom-based emergency procedures have been developed for the Dukovany Nuclear Power Plant in the Czech Republic. As one part of the process of verification and validation of the procedures, a specific effort was devoted to a detailed analysis of the procedures from the human factors and human reliability point of view. The course and results of the analysis are discussed in this article. Although the analyzed procedures have been developed for one specific plant of the WWER-440/213 type, most of the presented results may be valid for many other procedures recently developed for semi-automatic control of technological units operated under a measurable level of risk. (author)

  7. Procedures for Determining the Performance of Stand-Alone Photovoltaic Systems; TOPICAL

    International Nuclear Information System (INIS)

    DeBlasio, R.; Durand, S.; Hansen, R.; Hutchinson, P.; Kroposki, B.; McNutt, P.; Rosenthal, A.; Thomas, M.

    1999-01-01

    This document provides the procedures for determining the performance of stand-alone PV systems. The procedures in this document provide a common approach for evaluating whether a given PV system is suitable to perform the function it was designed and manufactured to accomplish, and whether it will provide adequate power to run the load. These procedures cover small stand-alone PV systems and complete outdoor system testing. Test results are valid only for the system that is tested

  8. Measuring Vocational Preferences: Ranking versus Categorical Rating Procedures.

    Science.gov (United States)

    Carifio, James

    1978-01-01

    Describes a study to compare the relative validities of ranking versus categorical rating procedures for obtaining student vocational preference data in exploratory program assignment situations. Students indicated their vocational program preferences from career clusters, and the frequency of wrong assignments made by each method was analyzed. (MF)

  9. Man-in-the-loop validation plan for the Millstone Unit 3 SPDS

    International Nuclear Information System (INIS)

    Blanch, P.M.; Wilkinson, C.D.

    1985-01-01

    This paper describes the man-in-the-loop validation plan for the Millstone Point Unit 3 (MP3) Safety Parameter Display System (SPDS). MP3 is a pressurized water reactor scheduled to load fuel in November 1985. The SPDS is being implemented as part of plant construction. This paper provides an overview of the validation process. Detailed validation procedures, scenarios, and evaluation forms will be incorporated into the validation plan to produce the detailed validation program. The program document will provide all of the detailed instructions necessary to perform the man-in-the-loop validation

  10. Formal Verification of Computerized Procedure with Colored Petri Nets

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Shin, Yeong Cheol

    2008-01-01

    The Computerized Procedure System (CPS) supports nuclear power plant operators in performing operating procedures, which are instructions guiding the monitoring, decision making and control of nuclear power plants. A Computerized Procedure (CP) has to be loaded into the CPS. Due to its execution characteristics, a computerized procedure acts like software in the CPS. For example, procedure flows are determined by operator evaluations and by the pre-defined computerized procedure logic. So the verification of computerized procedure logic and execution flow is needed before computerized procedures are installed in the system. Formal verification methods are proposed and the modeling of operating procedures with Coloured Petri Nets (CP-nets) is presented

  11. 9 CFR 124.42 - Hearing procedure.

    Science.gov (United States)

    2010-01-01

    ... Diligence Hearing § 124.42 Hearing procedure. (a) The presiding officer shall be appointed by the... hearing. (g) The due diligence hearing will be conducted in accordance with rules of practice adopted for... opportunity to participate as a party in the hearing. The standard of due diligence set forth in § 124.33 will...

  12. Regulatory perspectives on human factors validation

    International Nuclear Information System (INIS)

    Harrison, F.; Staples, L.

    2001-01-01

    Validation is an important avenue for controlling the genesis of human error, and thus managing loss, in a human-machine system. Since there are many ways in which error may intrude upon system operation, it is necessary to consider the performance-shaping factors that could introduce error and compromise system effectiveness. Validation works to this end by examining, through objective testing and measurement, the newly developed system, procedure or staffing level, in order to identify and eliminate those factors which may negatively influence human performance. It is essential that validation be done in a high-fidelity setting, in an objective and systematic manner, using appropriate measures, if meaningful results are to be obtained. In addition, inclusion of validation work in any design process can be seen as contributing to a good safety culture, since such activity allows licensees to eliminate elements which may negatively impact on human behaviour. (author)

  13. Methods and procedures for the verification and validation of artificial neural networks

    CERN Document Server

    Taylor, Brian J

    2006-01-01

    Neural networks are members of a class of software that have the potential to enable intelligent computational systems capable of simulating characteristics of biological thinking and learning. This volume introduces some of the methods and techniques used for the verification and validation of neural networks and adaptive systems.

  14. Compressive strength test for cemented waste forms: validation process

    International Nuclear Information System (INIS)

    Haucz, Maria Judite A.; Candido, Francisco Donizete; Seles, Sandro Rogerio

    2007-01-01

    In the Cementation Laboratory (LABCIM) of the Nuclear Technology Development Centre (CNEN/CDTN-MG), hazardous/radioactive wastes are incorporated in cement to transform them into monolithic products, preventing or minimizing the release of contaminants to the environment. The compressive strength test is important to evaluate the quality of the cemented product; in this test the compression load necessary to rupture the cemented waste form is determined. In LABCIM a specific procedure was developed to determine the compressive strength of cement waste forms based on the Brazilian Standard NBR 7215. The accreditation of this procedure is essential to assure reproducible and accurate results in the evaluation of these products. To achieve this goal the Laboratory personnel implemented technical and administrative improvements in accordance with the NBR ISO/IEC 17025 standard 'General requirements for the competence of testing and calibration laboratories'. As the developed procedure is not a standard one, the ISO/IEC 17025 standard requires its validation, and there are several methodologies to do that. This paper describes the current status of the accreditation project, especially the validation process of the referred procedure and its results. (author)

  15. A procedure for noise uncoupling in laser interferometry

    CERN Document Server

    Barone, F; Rosa, R D; Eleuteri, A; Milano, L; Qipiani, K

    2002-01-01

    A numerical procedure for noise recognition and uncoupling is described. The procedure is applied to a Michelson interferometer and is effective in seismic and acoustic noise uncoupling from the output signal of the interferometer. Due to the low data flow coming from the instrumentation this uncoupling can be performed in real time and it is useful as a data quality procedure for interferometer data output.

  16. REVISION PERMISSIABILITY IN CIVIL PROCEDURE IN REPUBLIC OF MACEDONIA

    Directory of Open Access Journals (Sweden)

    Marina Gligorova

    2015-04-01

    Full Text Available The revision, as an extraordinary legal remedy, is one more legal instrument available to litigants in their effort to achieve protection of their rights or to defend against ungrounded claims of the other party. Litigants may file a revision of the litigation process due to substantive violations of the provisions of civil procedure or incorrect application of substantive law. Filing a revision because of a substantive violation of the provisions of civil procedure is limited. The purpose of this research paper is to investigate the most common reasons for filing revisions of litigation processes in the period from June 2011 to June 2012. The research covers which kinds of reasons are most often repeated, as well as the volume, i.e. the number of revisions submitted to the Supreme Court of the Republic of Macedonia. The general hypothesis is that most of the accepted revisions are due to substantial violations of the provisions of civil procedure. Two-thirds of the revisions filed before the Supreme Court of the Republic of Macedonia were rejected as unfounded, and only one third of the revisions submitted from June 2011 to June 2012 were grounded. Of the accepted revisions, 59% were due to incorrect application of substantive law and 41% to substantial violations of the provisions of civil procedure.

  17. Algorithm for Video Summarization of Bronchoscopy Procedures

    Directory of Open Access Journals (Sweden)

    Leszczuk Mikołaj I

    2011-12-01

    Full Text Available Abstract Background The duration of bronchoscopy examinations varies considerably depending on the diagnostic and therapeutic procedures used. It can last more than 20 minutes if a complex diagnostic work-up is included. With wide access to videobronchoscopy, the whole procedure can be recorded as a video sequence. Common practice relies on an active attitude of the bronchoscopist, who initiates the recording process and usually chooses to archive only selected views and sequences. However, it may be important to record the full bronchoscopy procedure as documentation when liability issues are at stake. Furthermore, an automatic recording of the whole procedure enables the bronchoscopist to focus solely on the performed procedures. Video recordings registered during bronchoscopies include a considerable number of frames of poor quality due to blurry or unfocused images. It seems that such frames are unavoidable due to the relatively tight endobronchial space, rapid movements of the respiratory tract due to breathing or coughing, and secretions which occur commonly in the bronchi, especially in patients suffering from pulmonary disorders. Methods The use of recorded bronchoscopy video sequences for diagnostic, reference and educational purposes could be considerably extended with efficient, flexible summarization algorithms. Thus, the authors developed a prototype system to create shortcuts (called summaries or abstracts) of bronchoscopy video recordings. Such a system, based on models described in previously published papers, employs image analysis methods to exclude frames or sequences of limited diagnostic or educational value. Results The algorithm for the selection or exclusion of specific frames or shots from video sequences recorded during bronchoscopy procedures is based on several criteria, including automatic detection of "non-informative" frames, frames showing the branching of the airways and frames including pathological lesions. Conclusions
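
    One ingredient of such a summarization pipeline, flagging blurred "non-informative" frames, is often approximated with the variance-of-Laplacian sharpness measure. The sketch below illustrates that idea with OpenCV; it is not the authors' algorithm, and the threshold and file name are placeholders.

    # Sketch of one ingredient of a summarization pipeline: flagging blurred,
    # "non-informative" frames with the variance-of-Laplacian sharpness measure.
    # The threshold below is an arbitrary illustration and would need tuning on
    # real bronchoscopy footage.
    import cv2

    def informative_frames(video_path, blur_threshold=100.0):
        """Yield (frame_index, frame) for frames that pass a simple sharpness test."""
        capture = cv2.VideoCapture(video_path)
        index = 0
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
            if sharpness >= blur_threshold:
                yield index, frame
            index += 1
        capture.release()

    # Example usage (hypothetical file name):
    # kept = [i for i, _ in informative_frames("bronchoscopy_exam.avi")]
    # print(f"{len(kept)} frames kept for the summary")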

  18. Development of an automated operating procedure system using fuzzy colored petri nets for nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Seong, Poong Hyun

    2002-01-01

    In this work, the AuTomated Operating Procedure System (ATOPS) is developed. ATOPS is an automation system for the operation of a nuclear power plant (NPP) which can monitor signals, diagnose statuses, and generate control actions according to the corresponding operating procedures, without any human operator's help. The main functions of ATOPS are an anomaly detection function and a procedure execution function, but only the procedure execution function is implemented because this work is just the first step. In the procedure execution function, operating procedures of NPPs are analyzed and modeled using Fuzzy Colored Petri Nets (FCPN), and executed depending on the decision making of the inference engine. In this work, an ATOPS prototype is developed in order to demonstrate its feasibility and it is also validated using the FISA-2/WS simulator. The validation is performed for the cases of a loss of coolant accident (LOCA) and a steam generator tube rupture (SGTR). The simulation results show that ATOPS works correctly in the emergency situations

  19. Validation of a Full-Immersion Simulation Platform for Percutaneous Nephrolithotomy Using Three-Dimensional Printing Technology.

    Science.gov (United States)

    Ghazi, Ahmed; Campbell, Timothy; Melnyk, Rachel; Feng, Changyong; Andrusco, Alex; Stone, Jonathan; Erturk, Erdal

    2017-12-01

    The restriction of resident hours, with an increasing focus on patient safety and a reduced caseload, has impacted surgical training. A complex and complication-prone procedure such as percutaneous nephrolithotomy (PCNL), with a steep learning curve, may create an unsafe environment for hands-on resident training. In this study, we validate a high-fidelity, inanimate PCNL model within a full-immersion simulation environment. Anatomically correct models of the human pelvicaliceal system, kidney, and relevant adjacent structures were created using polyvinyl alcohol hydrogels and three-dimensional-printed injection molds. All steps of a PCNL were simulated, including percutaneous renal access, nephroscopy, and lithotripsy. Five experts (>100 caseload) and 10 novices performed the procedure within the full-immersion simulation. Face and content validity were calculated using model ratings for similarity to the real procedure and usefulness as a training tool. Differences in performance among groups with various levels of experience, assessed with clinically relevant procedural metrics, were used to calculate construct validity. The model was determined to have excellent face and content validity, with average scores of 4.5/5.0 and 4.6/5.0, respectively. There were significant differences between novice and expert operative metrics, including mean fluoroscopy time, the number of percutaneous access attempts, and the number of times the needle was repositioned. Experts achieved better stone clearance with fewer procedural complications. We demonstrated the face, content, and construct validity of an inanimate, full task trainer for PCNL. Construct validity between experts and novices was demonstrated using the incorporated procedural metrics, which permitted the accurate assessment of performance. While hands-on training under supervision remains an integral part of any residency, this full-immersion simulation provides a comprehensive tool for surgical skills development and evaluation before hands-on exposure.

  20. Validation of the ATLAS hadronic calibration with the LAr End-Cap beam tests data

    International Nuclear Information System (INIS)

    Barillari, Teresa

    2009-01-01

    The high granularity of the ATLAS calorimeter and the large number of expected particles per event require a clustering algorithm that is able to suppress noise and pile-up efficiently. Therefore the cluster reconstruction is the essential first step in the hadronic calibration. The identification of electromagnetic components within a hadronic cluster using cluster shape variables is the next step in the hadronic calibration procedure. Finally the energy density of individual cells is used to assign the proper weight to correct for the invisible energy deposits of hadrons due to the non-compensating nature of the ATLAS calorimeter and to correct for energy losses in material not instrumented with read-out. The weighting scheme employs the energy density in individual cells. Therefore the validation of the Monte Carlo simulation, which is used to define the weighting parameters and energy correction algorithms, is an essential step in the hadronic calibration procedure. Pion data, obtained in a beam test corresponding to the pseudorapidity region 2.5 < |η| < 4.0 in ATLAS and in the energy range 40 GeV ≤ E ≤ 200 GeV, have been compared with Monte Carlo simulations, using the full ATLAS hadronic calibration procedure.

  1. Validation of dispersion model of RTARC-DSS based on ''KIT'' field experiments

    International Nuclear Information System (INIS)

    Duran, J.

    2000-01-01

    The aim of this study is to present the performance of the Gaussian dispersion model RTARC-DSS (Real Time Accident Release Consequences - Decision Support System) on the 'Kit' field experiments. The Model Validation Kit is a collection of three experimental data sets, from the Kincaid, Copenhagen and Lillestrom campaigns, plus the supplementary Indianapolis campaign, accompanied by software for model evaluation. The validation of the model has been performed on the basis of the maximum arc-wise concentrations, using the Bootstrap resampling procedure to estimate the variation of the model residuals. Validation was performed for short-range distances (about 1 - 10 km; maximum for the Kincaid data set - 50 km from the source). The model evaluation procedure and the amount of relative over- or under-prediction of the model are discussed. (author)
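
    As a generic illustration of the bootstrap idea used in such evaluations (not the RTARC-DSS validation itself), the sketch below computes the fractional bias between observed and predicted arc-wise maximum concentrations and attaches a bootstrap confidence interval to it; all concentration values are invented.

    # Generic sketch: fractional bias (FB) between observed and predicted
    # arc-wise maximum concentrations, with a bootstrap confidence interval.
    import numpy as np

    rng = np.random.default_rng(3)
    observed = np.array([12.0, 8.5, 15.2, 6.7, 9.9, 11.3, 7.8, 10.4])   # e.g. micrograms/m3
    predicted = np.array([10.1, 9.0, 13.8, 7.5, 8.2, 12.6, 6.9, 11.7])

    def fractional_bias(obs, pred):
        return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

    fb = fractional_bias(observed, predicted)

    # Bootstrap: resample paired (observed, predicted) values with replacement.
    n_boot = 5000
    idx = rng.integers(0, observed.size, size=(n_boot, observed.size))
    fb_samples = np.array([fractional_bias(observed[i], predicted[i]) for i in idx])
    low, high = np.percentile(fb_samples, [2.5, 97.5])

    print(f"FB = {fb:.3f}, 95% bootstrap interval [{low:.3f}, {high:.3f}]")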

  2. Comparative clinical study of the effect of LLLT in the immediate and late treatments of hypoesthesia due to surgical procedures

    Science.gov (United States)

    Ladalardo, Thereza C.; Brugnera, Aldo, Jr.; Pinheiro, Antonio L. B.; Castanho Garrini, Ana E.; Bologna, Elisangela D.; Takamoto, Marcia; Siqueira, Jose T.; Dias, Pedro; Campos, Roberto A. d. C.

    2002-06-01

    We evaluated the effect of LLLT in 68 patients who presented with hypoesthesia due to odontological surgical procedures: dental implant surgeries (N=51); extraction of impacted lower third molars (N=10); endodontics in lower first molars (N=7). Lesions treated within 30 days after the nerve injury had occurred were part of the immediate group, and lesions with more than 30 days from the occurrence of the injury were part of the late group. Treatments were carried out with an infrared diode laser of 40 mW-830 nm, continuous wave emission, spot size 3 mm2, and a total dosage of 18 joules per session in a contact mode of application, 20 sessions altogether. The efficacy of laser therapy in peripheral nerve regeneration is also related to the degree of the peripheral nerve lesion, and not only to the lesion duration. LLLT resulted in neurosensory functional improvement in both immediate and late treatments of hypoesthesia.

  3. Test validation of nuclear and fossil fuel control operators

    International Nuclear Information System (INIS)

    Moffie, D.J.

    1976-01-01

    To establish job relatedness, one must go through a procedure of concurrent and predictive validation. For concurrent validity a group of employees is tested and the test scores are related to performance concurrently or during the same time period. For predictive validity, individuals are tested but the results of these tests are not used at the time of employment. The tests are sealed and scored at a later date, and then related to job performance. Job performance data include ratings by supervisors, actual job performance indices, turnover, absenteeism, progress in training, etc. The testing guidelines also stipulate that content and construct validity can be used

  4. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
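
    As a purely hypothetical illustration of the approach described above, a procedure step might be expressed as XML elements whose attributes tell the CBPS what behaviour to generate; the element and attribute names below are invented for this sketch and are not the Idaho National Laboratory schema.

    # Hypothetical sketch: a procedure step as XML basic elements, parsed and
    # rendered by a minimal stand-in for a CBPS. All names are invented.
    import xml.etree.ElementTree as ET

    STEP_XML = """
    <step id="4.2" type="decision">
      <instruction>Verify reactor coolant pump seal injection flow.</instruction>
      <reference indicator="RCP-FI-121" expected="&gt;= 6 gpm"/>
      <decision prompt="Is seal injection flow within limits?">
        <option answer="yes" next="4.3"/>
        <option answer="no" next="contingency-C1"/>
      </decision>
    </step>
    """

    def render_step(xml_text):
        step = ET.fromstring(xml_text)
        print(f"Step {step.get('id')} ({step.get('type')}):")
        print(" ", step.findtext("instruction"))
        ref = step.find("reference")
        if ref is not None:
            print(f"  check {ref.get('indicator')} (expected {ref.get('expected')})")
        decision = step.find("decision")
        if decision is not None:
            print(" ", decision.get("prompt"))
            for option in decision.findall("option"):
                print(f"    {option.get('answer')} -> go to {option.get('next')}")

    render_step(STEP_XML)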

  5. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  6. Validity and Reliability of Psychosocial Factors Related to Breast Cancer Screening.

    Science.gov (United States)

    Zapka, Jane G.; And Others

    1991-01-01

    The construct validity of hypothesized survey items and data reduction procedures for selected psychosocial constructs frequently used in breast cancer screening research were investigated in telephone interviews with randomly selected samples of 1,184 and 903 women and a sample of 169 Hispanic clinic clients. Validity of the constructs is…

  7. New Standards for the Validation of EMC Test Sites particularly above 1 GHz

    Directory of Open Access Journals (Sweden)

    S. Battermann

    2005-01-01

    Full Text Available Standards for the validation of alternative test sites with a conducting groundplane have existed for the frequency range 30-1000 MHz since the end of the eighties. Recently the procedure for fully anechoic rooms (FAR) has been included in CISPR 16, after more than 10 years of intensive discussion in the standards committees (CENELEC, 2002; CISPR, 2004). But there are no standards available for the validation of alternative test sites above 1 GHz. The responsible working group (WG1) in CISPR/A has drawn up the 7th common draft (CD). A CDV will be published in spring 2005. The German standards committee VDE AK 767.4.1 participates in the drafting of the standard. All measurement procedures proposed in the last CDs have been investigated by measurements and theoretical analysis. This contribution describes the basic ideas and problems of the test site validation procedure. Furthermore, measurement results and numerical calculations will be presented, especially for the use of omni-directional antennas.

  8. AFT Chief Promises Due-Process Reform

    Science.gov (United States)

    Sawchuk, Stephen

    2010-01-01

    The president of the American Federation of Teachers (AFT), Randi Weingarten, is putting the sensitive issue of due process on the education reform table, with a pledge to work with districts to streamline the often-cumbersome procedures for dismissing teachers who fail to improve their performance after receiving help and support. She has also…

  9. HIFU procedures at moderate intensities-effect of large blood vessels

    International Nuclear Information System (INIS)

    Hariharan, P; Myers, M R; Banerjee, R K

    2007-01-01

    A three-dimensional computational model is presented for studying the efficacy of high-intensity focused ultrasound (HIFU) procedures targeted near large blood vessels. The analysis applies to procedures performed at intensities below the threshold for cavitation, boiling and highly nonlinear propagation, but high enough to increase tissue temperature a few degrees per second. The model is based upon the linearized KZK equation and the bioheat equation in tissue. In the blood vessel the momentum and energy equations are satisfied. The model is first validated in a tissue phantom, to verify the absence of bubble formation and nonlinear effects. Temperature rise and lesion-volume calculations are then shown for different beam locations and orientations relative to a large vessel. Both single and multiple ablations are considered. Results show that when the vessel is located within about a beam width (few mm) of the ultrasound beam, significant reduction in lesion volume is observed due to blood flow. However, for gaps larger than a beam width, blood flow has no major effect on the lesion formation. Under the clinically representative conditions considered, the lesion volume is reduced about 40% (relative to the no-flow case) when the beam is parallel to the blood vessel, compared to about 20% for a perpendicular orientation. Procedures involving multiple ablation sites are affected less by blood flow than single ablations. The model also suggests that optimally focused transducers can generate lesions that are significantly larger (>2 times) than the ones produced by highly focused beams
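
    For reference, HIFU thermal models of this kind are commonly built on the Pennes bioheat equation with an acoustic heating source, written here in its standard textbook form (the authors' exact formulation may differ):

    \rho c \, \frac{\partial T}{\partial t} = k \nabla^2 T - w_b c_b (T - T_a) + Q_{ac}, \qquad Q_{ac} = 2 \alpha I,

    where \rho, c and k are the tissue density, specific heat and thermal conductivity, w_b and c_b are the blood perfusion rate and blood specific heat, T_a is the arterial blood temperature, \alpha is the acoustic absorption coefficient and I is the local acoustic intensity delivered by the focused beam.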

  10. HIFU procedures at moderate intensities-effect of large blood vessels

    Energy Technology Data Exchange (ETDEWEB)

    Hariharan, P [Mechanical, Industrial, and Nuclear Engineering Department, University of Cincinnati, Cincinnati, OH (United States); Myers, M R [Division of Solid and Fluid Mechanics, Center for Devices and Radiological Health, US Food and Drug Administration, 10903 New Hampshire Avenue, Building 62, Silver Spring, MD 20993-0002 (United States); Banerjee, R K [Mechanical, Industrial, and Nuclear Engineering Department, University of Cincinnati, Cincinnati, OH (United States)

    2007-07-21

    A three-dimensional computational model is presented for studying the efficacy of high-intensity focused ultrasound (HIFU) procedures targeted near large blood vessels. The analysis applies to procedures performed at intensities below the threshold for cavitation, boiling and highly nonlinear propagation, but high enough to increase tissue temperature a few degrees per second. The model is based upon the linearized KZK equation and the bioheat equation in tissue. In the blood vessel the momentum and energy equations are satisfied. The model is first validated in a tissue phantom, to verify the absence of bubble formation and nonlinear effects. Temperature rise and lesion-volume calculations are then shown for different beam locations and orientations relative to a large vessel. Both single and multiple ablations are considered. Results show that when the vessel is located within about a beam width (few mm) of the ultrasound beam, significant reduction in lesion volume is observed due to blood flow. However, for gaps larger than a beam width, blood flow has no major effect on the lesion formation. Under the clinically representative conditions considered, the lesion volume is reduced about 40% (relative to the no-flow case) when the beam is parallel to the blood vessel, compared to about 20% for a perpendicular orientation. Procedures involving multiple ablation sites are affected less by blood flow than single ablations. The model also suggests that optimally focused transducers can generate lesions that are significantly larger (>2 times) than the ones produced by highly focused beams.

  11. Selenium speciation in phosphate mine soils and evaluation of a sequential extraction procedure using XAFS

    International Nuclear Information System (INIS)

    Favorito, Jessica E.; Luxton, Todd P.; Eick, Matthew J.; Grossl, Paul R.

    2017-01-01

    Selenium is a trace element found in western US soils, where ingestion of Se-accumulating plants has resulted in livestock fatalities. Therefore, a reliable understanding of Se speciation and bioavailability is critical for effective mitigation. Sequential extraction procedures (SEP) are often employed to examine Se phases and speciation in contaminated soils but may be limited by experimental conditions. We examined the validity of a SEP using X-ray absorption spectroscopy (XAS) for both whole and a sequence of extracted soils. The sequence included removal of soluble, PO4-extractable, carbonate, amorphous Fe-oxide, crystalline Fe-oxide, organic, and residual Se forms. For whole soils, XANES analyses indicated Se(0) and Se(-II) predominated, with lower amounts of Se(IV) present, related to carbonates and Fe-oxides. Oxidized Se species were more elevated and residual/elemental Se was lower than previous SEP results from ICP-AES suggested. For soils from the SEP sequence, XANES results indicated only partial recovery of carbonate, Fe-oxide and organic Se. This suggests Se was incompletely removed during designated extractions, possibly due to lack of mineral solubilization or reagent specificity. Selenium fractions associated with Fe-oxides were reduced in amount or removed after using hydroxylamine HCl for most soils examined. XANES results indicate partial dissolution of solid phases may occur during extraction processes. This study demonstrates why precautions should be taken to improve the validity of SEPs. Mineralogical and chemical characterizations should be completed prior to SEP implementation to identify extractable phases or mineral components that may influence extraction effectiveness. Sequential extraction procedures can be appropriately tailored for reliable quantification of speciation in contaminated soils. - Highlights: • XANES spectra indicated whole soils consisted of mostly elemental and organic Se and lower amounts of sorbed oxidized Se.

  12. Improved Design of Crew Operation in Computerized Procedure System of APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Seong, No Kyu; Jung, Yeon Sub; Sung, Chan Ho [KHNP, Daejeon (Korea, Republic of)

    2016-05-15

    In an analog-based conventional main control room (MCR), the operators perform paper-based procedures relying only on communication between the operators and a procedure controller such as the Shift Supervisor (SS). In a digital-based MCR, however, the operators can view the procedures simultaneously on their own consoles when the procedure controller opens the computerized procedure (CP). The synchronization and asynchronization functions between the procedure controller and the other operators therefore have to be considered to support crew operation. This paper suggests an improved design of crew operation in the computerized procedure system (CPS) of APR1400. These improvements can help operators perform crew procedures more efficiently, and they reduce the burden of communication and the risk of misunderstanding computerized procedures. They can be applied to the CPS after human factors engineering verification and validation.

  13. A baseline-free procedure for transformation models under interval censorship.

    Science.gov (United States)

    Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin

    2005-12-01

    An important property of the Cox regression model is that the estimation of the regression parameters using the partial likelihood procedure does not depend on its baseline survival function. We call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under the interval censoring framework. The baseline-free procedure results in a simplified and stable computation algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, where the estimation procedures available so far involve estimation of the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean zero martingale is provided.

  14. Experimental Validation of an Efficient Fan-Beam Calibration Procedure for k-Nearest Neighbor Position Estimation in Monolithic Scintillator Detectors

    Science.gov (United States)

    Borghi, Giacomo; Tabacchini, Valerio; Seifert, Stefan; Schaart, Dennis R.

    2015-02-01

    Monolithic scintillator detectors can achieve excellent spatial resolution and coincidence resolving time. However, their practical use for positron emission tomography (PET) and other applications in the medical imaging field is still limited due to drawbacks of the different methods used to estimate the position of interaction. Common statistical methods for example require the collection of an extensive dataset of reference events with a narrow pencil beam aimed at a fine grid of reference positions. Such procedures are time consuming and not straightforwardly implemented in systems composed of many detectors. Here, we experimentally demonstrate for the first time a new calibration procedure for k-nearest neighbor (k-NN) position estimation that utilizes reference data acquired with a fan beam. The procedure is tested on two detectors consisting of 16 mm × 16 mm × 10 mm and 16 mm × 16 mm × 20 mm monolithic, Ca-codoped LSO:Ce crystals and digital photon counter (DPC) arrays. For both detectors, the spatial resolution and the bias obtained with the new method are found to be practically the same as those obtained with the previously used method based on pencil-beam irradiation, while the calibration time is reduced by a factor of 20. Specifically, a FWHM of 1.1 mm and a FWTM of 2.7 mm were obtained using the fan-beam method with the 10 mm crystal, whereas a FWHM of 1.5 mm and a FWTM of 6 mm were achieved with the 20 mm crystal. Using a fan beam made with a 4.5 MBq 22Na point-source and a tungsten slit collimator with 0.5 mm aperture, the total measurement time needed to acquire the reference dataset was 3 hours for the thinner crystal and 2 hours for the thicker one.
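
    A minimal sketch of the k-NN estimation step itself (with synthetic light patterns standing in for measured DPC pixel amplitudes, not the fan-beam calibration procedure) could look as follows, assuming scikit-learn is available.

    # Sketch of k-NN position estimation from detector light patterns, using
    # synthetic reference data in place of measured calibration events.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(1)
    n_pixels = 8 * 8       # 8 x 8 photosensor array
    n_reference = 4000     # reference events with known (x, y)

    def light_pattern(xy, noise=0.05):
        """Toy solid-angle-like light distribution over the pixel grid."""
        gx, gy = np.meshgrid(np.linspace(-8, 8, 8), np.linspace(-8, 8, 8))
        r2 = (gx - xy[0]) ** 2 + (gy - xy[1]) ** 2 + 4.0
        pattern = 1.0 / r2
        pattern /= pattern.sum()
        return pattern.ravel() + rng.normal(0.0, noise * pattern.max(), n_pixels)

    ref_xy = rng.uniform(-8, 8, size=(n_reference, 2))        # known reference positions (mm)
    ref_patterns = np.array([light_pattern(p) for p in ref_xy])

    knn = KNeighborsRegressor(n_neighbors=20, weights="distance")
    knn.fit(ref_patterns, ref_xy)

    test_xy = np.array([2.0, -3.5])
    estimate = knn.predict(np.array([light_pattern(test_xy)]))
    print("true position:", test_xy, "estimated:", estimate[0].round(2))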

  15. Radiochemical procedure for the determination of plutonium isotopes in powdered milk

    International Nuclear Information System (INIS)

    Taddei, M.H.T.; Silva, N.C.

    2006-01-01

    A radiochemical procedure for the determination of alpha-emitting isotopes of plutonium in powdered milk is proposed. The procedure involves sample dissolution (by HNO3 and HClO4), separation by ionic-exchange resin, electrodeposition and alpha-spectroscopy. In order to determine the chemical recovery, 242Pu was employed as a tracer. A reference material (Marine Sediment IAEA 135) was analyzed to validate such procedure, and to show its reliability. Afterwards, some powdered milk, produced for international trade, was analyzed and chemical recovery was found to be around 95%. (author)

  16. Validation of FORTRAN emulators for the G2 Varian control programs

    International Nuclear Information System (INIS)

    Delorme, G.

    1996-01-01

    The extensive use of the Gentilly full scope simulator for training and verification of plant procedures forced the development of a reliable desktop simulator for software maintenance purposes. For that we needed emulators for the control programs which run on the DCC Varian computers in the full scope simulator. This paper presents the validation results for the Reactor Regulating System (RRS) program. This emulator was programmed in a modular fashion, providing ease of maintenance and of transportation to another environment. The results obtained with specific tests or with integrated testing involving complex control rule interactions compared favorably with the ones obtained using the actual plant control programs running on the full scope simulator, which constitutes an irrefutable validation procedure. This RRS package, along with the other emulators being validated in this manner, could be used in safety codes with confidence. (author)

  17. Vision based flight procedure stereo display system

    Science.gov (United States)

    Shen, Xiaoyun; Wan, Di; Ma, Lan; He, Yuncheng

    2008-03-01

    A virtual reality flight procedure vision system is introduced in this paper. The digital flight map database is established based on the Geographic Information System (GIS) and high-definition satellite remote sensing photos. The flight approach area database is established through a computer 3D modeling system and GIS. The area texture is generated from the remote sensing photos and aerial photographs at various levels of detail. According to the flight approach procedure, the flight navigation information is linked to the database. The flight approach area view can be dynamically displayed according to the designed flight procedure. The flight approach area images are rendered in two channels, one for left-eye images and the other for right-eye images. Through the polarized stereoscopic projection system, the pilots and aircrew can get a vivid 3D view of the flight destination approach area. Using this system in the pilots' preflight preparation, the aircrew can obtain more vivid information about the flight destination approach area. This system can improve the aviator's self-confidence before carrying out the flight mission and thereby improve flight safety. The system is also useful for validating visual flight procedure designs and assists in flight procedure design.

  18. Asymptotically optimum multialternative sequential procedures for discernment of processes minimizing average length of observations

    Science.gov (United States)

    Fishman, M. M.

    1985-01-01

    The problem of multialternative sequential discernment of processes is formulated in terms of conditionally optimum procedures minimizing the average length of observations, without any probabilistic assumptions about any one occurring process, rather than in terms of Bayes procedures minimizing the average risk. The problem is to find the procedure that will transform inequalities into equalities. The problem is formulated for various models of signal observation and data processing: (1) discernment of signals from background interference by a multichannel system; (2) discernment of pulse sequences with unknown time delay; (3) discernment of harmonic signals with unknown frequency. An asymptotically optimum sequential procedure is constructed which compares the statistics of the likelihood ratio with the mean-weighted likelihood ratio and estimates the upper bound for conditional average lengths of observations. This procedure is shown to remain valid as the upper bound for the probability of erroneous partial solutions decreases approaching zero and the number of hypotheses increases approaching infinity. It also remains valid under certain special constraints on the probability such as a threshold. A comparison with a fixed-length procedure reveals that this sequential procedure decreases the length of observations to one quarter, on the average, when the probability of erroneous partial solutions is low.
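
    For intuition, the sketch below implements a generic MSPRT-style multialternative sequential rule: log-likelihood scores are accumulated observation by observation and the test stops as soon as one hypothesis leads all others by a fixed threshold. This is only an illustration of the sequential idea, not the paper's conditionally optimum procedure; the hypotheses, threshold and data are invented.

    ```python
    # Generic multialternative sequential test sketch (illustrative assumptions only).
    import numpy as np

    def sequential_decision(observations, log_lik, n_hyp=3, threshold=5.0):
        """observations : iterable of data points
           log_lik(x, h): log-likelihood of observation x under hypothesis h
           Returns (chosen hypothesis, observations used), or (None, n) if no decision."""
        score = np.zeros(n_hyp)
        n = 0
        for x in observations:
            n += 1
            score += np.array([log_lik(x, h) for h in range(n_hyp)])
            leader = int(score.argmax())
            margin = score[leader] - np.max(np.delete(score, leader))
            if margin >= threshold:          # stop early: one hypothesis clearly dominates
                return leader, n
        return None, n

    # Toy example: three Gaussian hypotheses with different means; data come from mean 1.0
    rng = np.random.default_rng(4)
    means = [0.0, 1.0, 2.0]
    log_lik = lambda x, h: -0.5 * (x - means[h]) ** 2
    print(sequential_decision(rng.normal(1.0, 1.0, size=200), log_lik))
    ```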

  19. A Comparative Analysis of the Procedure Employed in Item ...

    African Journals Online (AJOL)

    Zimbabwe Journal of Educational Research ... and psychological scales designed to measure constructs in education and social sciences were purposively selected for the study based on accessibility and availability of validation information. The instruments used for the study were scaling procedures used in 27 published ...

  20. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  1. Data validation summary report for the 100-HR-3 Round 8, Phases 1 and 2 groundwater sampling task

    International Nuclear Information System (INIS)

    1996-01-01

    This report presents a summary of data validation results on groundwater samples collected for the 100-HR-3 Round 8 Groundwater Sampling task. The analyses performed for this project consisted of: metals, general chemistry, and radiochemistry. The laboratories conducting the analyses were Quanterra Environmental Services (QES) and Lockheed Analytical Services. As required by the contract and the WHC statement of work (WHC 1994), data validation was conducted using the Westinghouse data validation procedures for chemical and radiochemical analyses (WHC 1993a and 1993b). Sample results were validated to levels A and D as described in the data validation procedures. At the completion of validation and verification of each data package, a data validation summary was prepared and transmitted with the original documentation to Environmental Restoration Contract (ERC) for inclusion in the project QA record

  2. Occupational exposures from selected interventional radiological procedures

    International Nuclear Information System (INIS)

    Janeczek, J.; Beal, A.; James, D.

    2001-01-01

    The number of radiology and cardiology interventional procedures has significantly increased in recent years due to better diagnostic equipment, resulting in an increase in the radiation dose to staff and patients. The assessment of staff doses was performed for cardiac catheterization and for three other non-cardiac procedures. The scattered radiation distribution resulting from the cardiac catheterization procedure was measured prior to the staff dose measurements. Staff dose measurements included the left shoulder, eye, thyroid and hand doses of the cardiologist. In non-cardiac procedures, doses to the hands of the radiologist were measured for nephrostomy, fistulogram and percutaneous transluminal angioplasty procedures. Doses to the radiologist or cardiologist were found to be relatively high if correct protection was not observed. (author)

  3. On the dimensionality of organizational justice: a construct validation of a measure.

    Science.gov (United States)

    Colquitt, J A

    2001-06-01

    This study explores the dimensionality of organizational justice and provides evidence of construct validity for a new justice measure. Items for this measure were generated by strictly following the seminal works in the justice literature. The measure was then validated in 2 separate studies. Study 1 occurred in a university setting, and Study 2 occurred in a field setting using employees in an automobile parts manufacturing company. Confirmatory factor analyses supported a 4-factor structure to the measure, with distributive, procedural, interpersonal, and informational justice as distinct dimensions. This solution fit the data significantly better than a 2- or 3-factor solution using larger interactional or procedural dimensions. Structural equation modeling also demonstrated predictive validity for the justice dimensions on important outcomes, including leader evaluation, rule compliance, commitment, and helping behavior.

  4. 21 CFR 314.530 - Withdrawal procedures.

    Science.gov (United States)

    2010-04-01

    ... Serious or Life-Threatening Illnesses § 314.530 Withdrawal procedures. (a) For new drugs approved under... benefit; (2) The applicant fails to perform the required postmarketing study with due diligence; (3) Use...

  5. 5 CFR 1215.8 - Procedures for salary offset.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Procedures for salary offset. 1215.8... MANAGEMENT Salary Offset § 1215.8 Procedures for salary offset. (a) Deductions to liquidate an employee's... payment due to a separated employee including but not limited to final salary payment or leave in...

  6. Motor assessment instruments and psychometric procedures: A systematic review

    Directory of Open Access Journals (Sweden)

    Pâmella de Medeiros

    2017-03-01

    Our objective was to identify the psychometric elements for an epistemological reflection through a systematic review of the cross-cultural validation procedures of the TGMD-2, MABC-2 and KTK batteries. Searches were carried out by two evaluators independently, without year or language restrictions, in six databases: Web of Science, Science Direct, Lilacs, Scopus, Pubmed and the Scientific Electronic Library Online (SciELO). The key words used were "MABC", "TGMD" and "KTK", each combined with the word "validity". A total of 734 articles were retrieved, of which only 11 studies remained after applying the exclusion criteria. It was found that authors differ with respect to the psychometric factors taken into account in cross-cultural validation, so that there was no unanimity on the validation criteria across studies in this field.

  7. Cross-Cultural Validation of the Patient Perception of Integrated Care Survey.

    Science.gov (United States)

    Tietschert, Maike V; Angeli, Federica; van Raak, Arno J A; Ruwaard, Dirk; Singer, Sara J

    2017-07-20

    To test the cross-cultural validity of the U.S. Patient Perception of Integrated Care (PPIC) Survey in a Dutch sample using a standardized procedure. Primary data collected from patients of five primary care centers in the south of the Netherlands, through survey research from 2014 to 2015. Cross-sectional data collected from patients who saw multiple health care providers during 6 months preceding data collection. The PPIC survey includes 59 questions that measure patient perceived care integration across providers, settings, and time. Data analysis followed a standardized procedure guiding data preparation, psychometric analysis, and included invariance testing with the U.S. dataset. Latent scale structures of the Dutch and U.S. survey were highly comparable. Factor "Integration with specialist" had lower reliability scores and noninvariance. For the remaining factors, internal consistency and invariance estimates were strong. The standardized cross-cultural validation procedure produced strong support for comparable psychometric characteristics of the Dutch and U.S. surveys. Future research should examine the usability of the proposed procedure for contexts with greater cultural differences. © Health Research and Educational Trust.

  8. Validation of a novel laparoscopic adjustable gastric band simulator.

    Science.gov (United States)

    Sankaranarayanan, Ganesh; Adair, James D; Halic, Tansel; Gromski, Mark A; Lu, Zhonghua; Ahn, Woojin; Jones, Daniel B; De, Suvranu

    2011-04-01

    Morbid obesity accounts for more than 90,000 deaths per year in the United States. Laparoscopic adjustable gastric banding (LAGB) is the second most common weight loss procedure performed in the US and the most common in Europe and Australia. Simulation in surgical training is a rapidly advancing field that has been adopted by many to prepare surgeons for surgical techniques and procedures. The aim of our study was to determine face, construct, and content validity for a novel virtual reality laparoscopic adjustable gastric band simulator. Twenty-eight subjects were categorized into two groups (expert and novice), determined by their skill level in laparoscopic surgery. Experts consisted of subjects who had at least 4 years of laparoscopic training and operative experience. Novices consisted of subjects with medical training but with less than 4 years of laparoscopic training. The subjects used the virtual reality laparoscopic adjustable band surgery simulator. They were automatically scored according to various tasks. The subjects then completed a questionnaire to evaluate face and content validity. On a 5-point Likert scale (1 = lowest score, 5 = highest score), the mean score for visual realism was 4.00 ± 0.67 and the mean score for realism of the interface and tool movements was 4.07 ± 0.77 (face validity). There were significant differences in the performances of the two subject groups (expert and novice) based on total scores, supporting the construct validity of the virtual reality laparoscopic adjustable gastric band simulator. Our initial results demonstrate excellent face, construct, and content validity findings. To our knowledge, this is the first virtual reality simulator with haptic feedback for training residents and surgeons in the laparoscopic adjustable gastric banding procedure.
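
    Construct validity of this kind is commonly supported by showing that expert and novice scores differ significantly; a hedged sketch of such a comparison with a non-parametric test is given below, using invented scores rather than the study's data.

    ```python
    # Sketch of a construct-validity check: compare simulator scores of expert and
    # novice groups with a non-parametric test (scores below are invented).
    from scipy.stats import mannwhitneyu

    expert_scores = [88, 92, 85, 90, 95, 87, 91, 89, 93, 86, 90, 94, 88, 92]
    novice_scores = [60, 72, 65, 58, 70, 63, 67, 61, 69, 64, 66, 59, 71, 62]

    stat, p = mannwhitneyu(expert_scores, novice_scores, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p:.4f}")  # a small p-value would support construct validity
    ```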

  9. CFD validation experiments at the Lockheed-Georgia Company

    Science.gov (United States)

    Malone, John B.; Thomas, Andrew S. W.

    1987-01-01

    Information is given in viewgraph form on computational fluid dynamics (CFD) validation experiments at the Lockheed-Georgia Company. Topics covered include validation experiments on a generic fighter configuration, a transport configuration, and a generic hypersonic vehicle configuration; computational procedures; surface and pressure measurements on wings; laser velocimeter measurements of a multi-element airfoil system; the flowfield around a stiffened airfoil; laser velocimeter surveys of a circulation control wing; circulation control for high lift; and high angle of attack aerodynamic evaluations.

  10. Effect of Spike Lavender Lakhlakhe on Pain Intensity Due to Phlebotomy Procedure in Premature Infants Hospitalized in Neonatal Intensive Care Unit: A Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    Noushin Beheshtipoor

    2017-06-01

    Background: Premature infants undergo multiple painful procedures during treatment; thus, efforts must be made to limit the complications caused by diagnostic and treatment procedures using simple and practical methods. This study was performed to evaluate the effect of spike lavender lakhlakhe on pain intensity due to phlebotomy in hospitalized premature infants. Methods: This single-arm, randomized clinical trial was performed on 30 infants chosen through a convenience sampling method. Each newborn was considered as its own control. For the test condition, one drop of pure (100%) spike lavender lakhlakhe was taken with a standard dropper and diluted with 4 ml of warm distilled water by the research assistant. This mixture was stirred at a distance of 2-3 cm from the newborn's nose from 60 minutes before until 2 minutes after phlebotomy, such that it could be smelled by the newborn. In both groups, heart rate and blood oxygen saturation were measured by a standard portable device, and the corresponding data were recorded on data collection sheets. Moreover, the infants' facial expression changes were recorded by a camera and the intensity of pain was measured by the Premature Infant Pain Profile before and after the procedure. Finally, the data were analyzed by a paired comparison analysis test in SPSS, version 17. Results: Comparison of mean pain intensity caused by phlebotomy in the control and test groups showed a significant difference (7.667±0.311 vs. 4.882±0.311; P

  11. Validation of internal dosimetry protocols based on stochastic method

    International Nuclear Information System (INIS)

    Mendes, Bruno M.; Fonseca, Telma C.F.; Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R.

    2015-01-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated with the IDPs and compared with the reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAF values obtained in the IDP simulations and the RV was 2.3%. The largest SAF differences were found in situations involving low-energy photons at 30 keV. The adrenals and the thyroid, i.e. the lowest-mass organs, showed the highest SAF discrepancies from the RV, 7.2% and 3.8%, respectively. The statistical differences between the SAF values obtained with our IDPs and the reference values were considered acceptable at 30, 100 and 1000 keV. We believe that the main reason for the discrepancies found in the lower-mass organs was our source definition methodology. Improvements of the source spatial distribution in the voxels may provide outputs more consistent with the reference values for the lower-mass organs. (author)

  12. Validation of internal dosimetry protocols based on stochastic method

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Bruno M.; Fonseca, Telma C.F., E-mail: bmm@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R., E-mail: tprcampos@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2015-07-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated with the IDPs and compared with the reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAF values obtained in the IDP simulations and the RV was 2.3%. The largest SAF differences were found in situations involving low-energy photons at 30 keV. The adrenals and the thyroid, i.e. the lowest-mass organs, showed the highest SAF discrepancies from the RV, 7.2% and 3.8%, respectively. The statistical differences between the SAF values obtained with our IDPs and the reference values were considered acceptable at 30, 100 and 1000 keV. We believe that the main reason for the discrepancies found in the lower-mass organs was our source definition methodology. Improvements of the source spatial distribution in the voxels may provide outputs more consistent with the reference values for the lower-mass organs. (author)
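
    The intercomparison step described in the two records above reduces to computing relative differences between simulated and reference SAF values; a toy version is sketched below with placeholder numbers rather than the study's data.

    ```python
    # Illustrative comparison of simulated SAF values with published reference values
    # (numbers are placeholders, not the study's data).
    import numpy as np

    saf_idp = np.array([2.10e-2, 5.30e-3, 1.95e-4])   # simulated SAF (kg^-1), assumed
    saf_ref = np.array([2.05e-2, 5.45e-3, 2.02e-4])   # reference SAF (kg^-1), assumed

    rel_diff = 100.0 * (saf_idp - saf_ref) / saf_ref  # percent difference per organ/energy
    print("relative differences [%]:", np.round(rel_diff, 2))
    print("mean absolute difference [%]:", np.round(np.abs(rel_diff).mean(), 2))
    ```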

  13. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  14. Social validity in single-case research: A systematic literature review of prevalence and application.

    Science.gov (United States)

    Snodgrass, Melinda R; Chung, Moon Y; Meadan, Hedda; Halle, James W

    2018-03-01

    Single-case research (SCR) has been a valuable methodology in special education research. Montrose Wolf (1978), an early pioneer in single-case methodology, coined the term "social validity" to refer to the social importance of the goals selected, the acceptability of procedures employed, and the effectiveness of the outcomes produced in applied investigations. Since 1978, many contributors to SCR have included social validity as a feature of their articles and several authors have examined the prevalence and role of social validity in SCR. We systematically reviewed all SCR published in six highly-ranked special education journals from 2005 to 2016 to establish the prevalence of social validity assessments and to evaluate their scientific rigor. We found relatively low, but stable prevalence with only 28 publications addressing all three factors of the social validity construct (i.e., goals, procedures, outcomes). We conducted an in-depth analysis of the scientific rigor of these 28 publications. Social validity remains an understudied construct in SCR, and the scientific rigor of social validity assessments is often lacking. Implications and future directions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Contract Law, Due Process, and the NCAA.

    Science.gov (United States)

    Dickerson, Jaffe D.; Chapman, Mayer

    1978-01-01

    The NCAA has enjoyed almost total freedom from judicial scrutiny of its rules, procedures, and official acts in large part because of its private nature as an unincorporated association. The function of the NCAA, California State University, Hayward v NCAA, and due process of the student-athlete are discussed. (MLW)

  16. Sensitivity and uncertainty analyses applied to criticality safety validation. Volume 2

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies developed in Volume 1 to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the existing S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently in use by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The methods for application of S/U and generalized linear-least-squares methodology (GLLSM) tools to the criticality safety validation procedures were described in Volume 1 of this report. Volume 2 of this report presents the application of these procedures to the validation of criticality safety analyses supporting uranium operations where enrichments are greater than 5 wt %. Specifically, the traditional keff trending analyses are compared with newly developed keff trending procedures, utilizing the D and ck coefficients described in Volume 1. These newly developed procedures are applied to a family of postulated systems involving U(11)O2 fuel, with H/X values ranging from 0 to 1,000. These analyses produced a set of guidance and recommendations for the general usage of these various techniques. Recommendations for future work are also detailed
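
    For readers unfamiliar with the ck coefficient mentioned above, the sketch below computes a correlation-style similarity index between an application and a benchmark from their keff sensitivity vectors and a shared cross-section covariance matrix. The formula and all numbers are illustrative assumptions, not values or definitions taken from the report.

    ```python
    # Hedged sketch of a ck-style correlation coefficient between an application
    # and a benchmark: ck = S_a C S_b^T / sqrt((S_a C S_a^T)(S_b C S_b^T)),
    # where S are keff sensitivity vectors and C is a cross-section covariance
    # matrix (all numbers below are random placeholders).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 8                                   # number of nuclide-reaction groups (assumed)
    A = rng.random((n, n))
    C = A @ A.T / n                         # a symmetric positive-definite covariance
    S_app = rng.random(n) * 0.1             # application sensitivity profile
    S_ben = rng.random(n) * 0.1             # benchmark sensitivity profile

    num = S_app @ C @ S_ben
    ck = num / np.sqrt((S_app @ C @ S_app) * (S_ben @ C @ S_ben))
    print(f"ck = {ck:.3f}")                 # values near 1 indicate similar uncertainty sources
    ```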

  17. 40 CFR 53.66 - Test procedure: Volatility test.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Test procedure: Volatility test. 53.66... Characteristics of Class II Equivalent Methods for PM2.5 § 53.66 Test procedure: Volatility test. (a) Overview. This test is designed to ensure that the candidate method's losses due to volatility when sampling semi...

  18. Development of assessment procedures at the CEGB's nuclear power training centre

    International Nuclear Information System (INIS)

    Chapman, C.R.; Harris, N.D.C.

    1986-01-01

    The work of a power station engineer can be considered under four aspects: technology, diagnosis, action and communication. The development, validation and use of assessment procedures can successfully incorporate the same aspects. The purposes of assessment are reporting training achievement and giving feedback to course members and tutorial staff. The development of standardized procedures to produce, evaluate and mark assessments and to optimize feedback ensures objectivity and uniformity. This has been achieved at the Central Electricity Generating Board's Nuclear Power Training Centre by enlisting an educational consultant to provide guidance and assist in training the resident tutors in assessment procedures. (author)

  19. 34 CFR 682.207 - Due diligence in disbursing a loan.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Due diligence in disbursing a loan. 682.207 Section 682.207 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF... § 682.207 Due diligence in disbursing a loan. (a)(1) This section prescribes procedures for lenders to...

  20. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: it can be made with a DSA, differential sensitivity analysis, and with a MCSA, Monte Carlo sensitivity analysis. Finding the optimal domains of the input parameters: a procedure based on Monte Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model on buildings, Esp., is presented, studying the behavior of building components in a test cell of the LECE at CIEMAT. (Author) 17 refs
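
    As a rough illustration of the residual-analysis step described above, the sketch below compares a synthetic "measured" series with a simulated one in both the time domain (autocorrelation of the residuals) and the frequency domain (residual power spectrum). The signals and sampling interval are invented stand-ins, not test-cell data.

    ```python
    # Minimal residual-analysis sketch in time and frequency domain (synthetic data).
    import numpy as np

    t = np.arange(0, 24 * 7, 0.25)                     # one week, 15-min steps (hours)
    measured = 20 + 5 * np.sin(2 * np.pi * t / 24) \
               + np.random.default_rng(2).normal(0, 0.3, t.size)
    simulated = 20 + 4.6 * np.sin(2 * np.pi * t / 24)  # a slightly biased model output

    residual = measured - simulated

    # Time domain: autocorrelation of the residuals, normalized to the zero lag
    centered = residual - residual.mean()
    full = np.correlate(centered, centered, mode="full")
    acf = full[full.size // 2:]
    acf = acf / acf[0]

    # Frequency domain: residual power spectrum
    spectrum = np.abs(np.fft.rfft(residual)) ** 2
    freqs = np.fft.rfftfreq(residual.size, d=0.25)     # cycles per hour

    print("lag-1 autocorrelation:", round(acf[1], 3))
    print("dominant residual frequency [1/h]:", round(freqs[spectrum[1:].argmax() + 1], 4))
    ```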

  1. Experimental validation of Monte Carlo calculations for organ dose

    International Nuclear Information System (INIS)

    Yalcintas, M.G.; Eckerman, K.F.; Warner, G.G.

    1980-01-01

    The problem of validating estimates of absorbed dose due to photon energy deposition is examined, along with the computational approaches used for the estimation of photon energy deposition. The limited data available for validation of these approaches are discussed, and suggestions are made as to how better validation information might be obtained

  2. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
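
    A small, hedged example of the solution-verification idea discussed above: when an exact or highly accurate solution is unavailable, the observed order of accuracy can be estimated from three systematically refined grids and used for Richardson extrapolation. The grid values and refinement ratio below are illustrative assumptions, not results from the paper.

    ```python
    # Solution-verification sketch: observed order of accuracy and Richardson
    # extrapolation from three grids with a constant refinement ratio (values assumed).
    import math

    f1, f2, f3 = 0.97050, 0.96820, 0.95900   # fine, medium, coarse grid solutions (assumed)
    r = 2.0                                  # grid refinement ratio

    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order of accuracy
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)             # Richardson-extrapolated estimate
    print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.5f}")
    ```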

  3. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  4. Development of automated generation system of accidental operating procedures for a PWR

    International Nuclear Information System (INIS)

    Artaud, J.L.

    1991-06-01

    The aim of the ACACIA project is to develop an automated generation system for accident operating procedures for a PWR. This research and development study, carried out jointly by CEA and EDF, has two objectives: in the medium term, the realization of a tool for procedure validation and generation; in the long term, the dynamic generation of real-time procedures. This work is devoted to the realization of two prototypes. These prototypes and the techniques used are described in detail. The last chapter explores the perspectives offered by this type of tool [fr

  5. A Procedural Skills OSCE: Assessing Technical and Non-Technical Skills of Internal Medicine Residents

    Science.gov (United States)

    Pugh, Debra; Hamstra, Stanley J.; Wood, Timothy J.; Humphrey-Murto, Susan; Touchie, Claire; Yudkowsky, Rachel; Bordage, Georges

    2015-01-01

    Internists are required to perform a number of procedures that require mastery of technical and non-technical skills, however, formal assessment of these skills is often lacking. The purpose of this study was to develop, implement, and gather validity evidence for a procedural skills objective structured clinical examination (PS-OSCE) for internal…

  6. Martius procedure revisited for urethrovaginal fistula

    Directory of Open Access Journals (Sweden)

    N P Rangnekar

    2000-01-01

    Background: Urethrovaginal fistula is a dreadful complication of obstetric trauma due to prolonged labour or obstetric intervention, commonly seen in developing countries. Due to prolonged ischaemic changes, the fistula is resistant to healing. The strategic location of the fistula leads to postoperative impairment of the continence mechanism. Anatomical repair was previously the commonest mode of surgical management, but was associated with a miserable cumulative cure rate ranging from 16-60%. Hence we studied the efficacy of the Martius procedure in the management of urethrovaginal fistula. Material and Methods: We retrospectively studied the outcome of 12 urethrovaginal fistulae, all caused by obstetric trauma, treated surgically with the Martius procedure in 8 and with anatomical repair in 4. Nine patients had recurrent fistulae, while 1 patient had multiple fistulae. Patients were followed up for periods ranging from 6 months to 4½ years for fistula healing, continence and postoperative complications such as dyspareunia. Results: The cumulative cure rate of the Martius procedure was 87.5% with no postoperative stress incontinence, while the fistula healing rate of anatomical repair was only 25% (1 patient out of 4), which was also complicated by intrinsic sphincter deficiency (ISD). In the case of recurrent fistulae, the success rate of anatomical repair was 0%, compared to 83.33% with the Martius procedure. Conclusions: The Martius procedure has shown a much better overall cure rate compared to anatomical repair because (a) it provides better reinforcement of the urethral suture line, (b) it provides better blood supply and lymph drainage to the ischaemic fistulous area, (c) it provides a surface for epithelialization and (d) it helps to maintain continence. Hence we recommend the Martius procedure as a surgical modality for the treatment of urethrovaginal fistula.

  7. Application of a computational situation assessment model to human system interface design and experimental validation of its effectiveness

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Koh, Kwang-Yong; Seong, Poong-Hyun

    2013-01-01

    Highlights: ► We validate the effectiveness of a proposed procedure through an experiment. ► The proposed procedure addresses the salient coding of the key information. ► It was found that salience coding significantly affects operators' attention. ► The first observation of the key information quickly guided operators to correct situation awareness. ► It was validated that the proposed procedure is effective for better situation awareness. - Abstract: To evaluate the effects of human cognitive characteristics on situation awareness, a computational situation assessment model of nuclear power plant operators has been developed, as well as a procedure for applying the developed model to the design of human system interfaces (HSIs). The concept of the proposed procedure is to identify the key information source which is expected to guarantee fast and accurate diagnosis when operators attend to it. The developed computational model is used to search the diagnostic paths and the key information source. In this study, an experiment with twelve trained participants was executed to validate the effectiveness of the proposed procedure. Eighteen scenarios covering various accidents were administered twice for each subject, and experimental data were collected and analyzed. As a result of the data analysis, it was validated that the salience level of information sources significantly influences the attention of operators, and that the first observation of the key information sources leads operators to a quick and correct situation assessment. Therefore, we conclude that the proposed procedure for applying the developed model to HSI design is effective

  8. Ensuring due process in the IACUC and animal welfare setting: considerations in developing noncompliance policies and procedures for institutional animal care and use committees and institutional officials.

    Science.gov (United States)

    Hansen, Barbara C; Gografe, Sylvia; Pritt, Stacy; Jen, Kai-Lin Catherine; McWhirter, Camille A; Barman, Susan M; Comuzzie, Anthony; Greene, Molly; McNulty, Justin A; Michele, Daniel Eugene; Moaddab, Naz; Nelson, Randall J; Norris, Karen; Uray, Karen D; Banks, Ron; Westlund, Karin N; Yates, Bill J; Silverman, Jerald; Hansen, Kenneth D; Redman, Barbara

    2017-10-01

    Every institution that is involved in research with animals is expected to have in place policies and procedures for the management of allegations of noncompliance with the Animal Welfare Act and the U.S. Public Health Service Policy on the Humane Care and Use of Laboratory Animals. We present here a model set of recommendations for institutional animal care and use committees and institutional officials to ensure appropriate consideration of allegations of noncompliance with federal Animal Welfare Act regulations that carry a significant risk or specific threat to animal welfare. This guidance has 3 overarching aims: 1 ) protecting the welfare of research animals; 2 ) according fair treatment and due process to an individual accused of noncompliance; and 3 ) ensuring compliance with federal regulations. Through this guidance, the present work seeks to advance the cause of scientific integrity, animal welfare, and the public trust while recognizing and supporting the critical importance of animal research for the betterment of the health of both humans and animals.-Hansen, B. C., Gografe, S., Pritt, S., Jen, K.-L. C., McWhirter, C. A., Barman, S. M., Comuzzie, A., Greene, M., McNulty, J. A., Michele, D. E., Moaddab, N., Nelson, R. J., Norris, K., Uray, K. D., Banks, R., Westlund, K. N., Yates, B. J., Silverman, J., Hansen, K. D., Redman, B. Ensuring due process in the IACUC and animal welfare setting: considerations in developing noncompliance policies and procedures for institutional animal care and use committees and institutional officials. © FASEB.

  9. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  10. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
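
    As a generic illustration of out-of-sample validation for a logistic regression model (not the specific procedure proposed in the paper), the sketch below fits a model on a training split and reports discrimination and overall calibration-style measures on a holdout set, using a synthetic dataset.

    ```python
    # Minimal out-of-sample validation sketch for logistic regression (synthetic data).
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score, brier_score_loss

    X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    p_te = model.predict_proba(X_te)[:, 1]

    print("validation AUC:", round(roc_auc_score(y_te, p_te), 3))    # discrimination
    print("Brier score   :", round(brier_score_loss(y_te, p_te), 3)) # overall accuracy of probabilities
    ```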

  11. Validation of an online replanning technique for prostate adaptive radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Peng Cheng; Chen Guangpei; Ahunbay, Ergun; Wang Dian; Lawton, Colleen; Li, X Allen, E-mail: ali@mcw.edu [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, WI (United States)

    2011-06-21

    We have previously developed an online adaptive replanning technique to rapidly adapt the original plan according to daily CT. This paper reports the quality assurance (QA) developments in its clinical implementation for prostate cancer patients. A series of pre-clinical validation tests were carried out to verify the overall accuracy and consistency of the online replanning procedure. These tests include (a) phantom measurements of 22 individual patient adaptive plans to verify their accuracy and deliverability and (b) efficiency and applicability of the online replanning process. A four-step QA procedure was established to ensure the safe and accurate delivery of an adaptive plan, including (1) offline phantom measurement of the original plan, (2) online independent monitor unit (MU) calculation for a redundancy check, (3) online verification of plan-data transfer using an in-house software and (4) offline validation of actually delivered beam parameters. The pre-clinical validations demonstrate that the newly implemented online replanning technique is dosimetrically accurate and practically efficient. The four-step QA procedure is capable of identifying possible errors in the process of online adaptive radiotherapy and to ensure the safe and accurate delivery of the adaptive plans. Based on the success of this work, the online replanning technique has been used in the clinic to correct for interfractional changes during the prostate radiation therapy.

  12. Validation of an online replanning technique for prostate adaptive radiotherapy

    International Nuclear Information System (INIS)

    Peng Cheng; Chen Guangpei; Ahunbay, Ergun; Wang Dian; Lawton, Colleen; Li, X Allen

    2011-01-01

    We have previously developed an online adaptive replanning technique to rapidly adapt the original plan according to daily CT. This paper reports the quality assurance (QA) developments in its clinical implementation for prostate cancer patients. A series of pre-clinical validation tests were carried out to verify the overall accuracy and consistency of the online replanning procedure. These tests include (a) phantom measurements of 22 individual patient adaptive plans to verify their accuracy and deliverability and (b) efficiency and applicability of the online replanning process. A four-step QA procedure was established to ensure the safe and accurate delivery of an adaptive plan, including (1) offline phantom measurement of the original plan, (2) online independent monitor unit (MU) calculation for a redundancy check, (3) online verification of plan-data transfer using an in-house software and (4) offline validation of actually delivered beam parameters. The pre-clinical validations demonstrate that the newly implemented online replanning technique is dosimetrically accurate and practically efficient. The four-step QA procedure is capable of identifying possible errors in the process of online adaptive radiotherapy and to ensure the safe and accurate delivery of the adaptive plans. Based on the success of this work, the online replanning technique has been used in the clinic to correct for interfractional changes during the prostate radiation therapy.

  13. Validation of an ultra-fast UPLC-UV method for the separation of antituberculosis tablets.

    Science.gov (United States)

    Nguyen, Dao T-T; Guillarme, Davy; Rudaz, Serge; Veuthey, Jean-Luc

    2008-04-01

    A simple method using ultra performance LC (UPLC) coupled with UV detection was developed and validated for the determination of antituberculosis drugs in combined dosage form, i.e. isoniazid (ISN), pyrazinamide (PYR) and rifampicin (RIF). Drugs were separated on a short column (2.1 mm x 50 mm) packed with 1.7 μm particles, using a gradient elution procedure. At 30 °C, less than 2 min was necessary for the complete separation of the three antituberculosis drugs, while the original USP method was performed in 15 min. Further improvements were obtained with the combination of UPLC and high temperature (up to 90 °C), namely HT-UPLC, which allows the application of higher mobile phase flow rates. Therefore, the separation of ISN, PYR and RIF was performed in less than 1 min. After validation (selectivity, trueness, precision and accuracy), both methods (UPLC and HT-UPLC) have proven suitable for the routine quality control analysis of antituberculosis drugs in combined dosage form. Additionally, a large number of samples per day can be analysed due to the short analysis times.
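
    To make the validation criteria concrete, the sketch below computes two of the figures of merit named above, trueness (as percent recovery) and precision (as %RSD), from a set of replicate measurements; the nominal concentration and replicate values are invented, not data from the study.

    ```python
    # Illustrative trueness and precision calculation for a chromatographic method
    # (nominal concentration and replicates are invented placeholders).
    import numpy as np

    nominal = 50.0                                               # spiked concentration, ug/mL (assumed)
    replicates = np.array([49.2, 50.4, 49.8, 50.9, 49.5, 50.1])  # measured values, ug/mL (assumed)

    recovery = 100.0 * replicates.mean() / nominal               # trueness, %
    rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()     # repeatability, %RSD
    print(f"recovery = {recovery:.1f} %, RSD = {rsd:.1f} %")
    ```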

  14. Validation of IEEE P1547.1 Interconnection Test Procedures: ASCO 7000 Soft Load Transfer System

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Englebretson, S.; Pink, C.; Daley, J.; Siciliano, R.; Hinton, D.

    2003-09-01

    This report presents the preliminary results of testing the ASCO 7000 Soft Load Transfer System according to IEEE P1547.1 procedures. The ASCO system interconnects synchronous generators with the electric power system and provides monitoring and control for the generator and grid connection through extensive protective functions. The purpose of this testing is to evaluate and give feedback on the contents of IEEE Draft Standard P1547.1, Conformance Test Procedures for Equipment Interconnecting Distributed Resources with Electric Power Systems.

  15. CTF Void Drift Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gosdin, Chris [Pennsylvania State Univ., University Park, PA (United States); Avramova, Maria N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gergar, Marcus [Pennsylvania State Univ., University Park, PA (United States)

    2015-10-26

    This milestone report is a summary of work performed in support of expansion of the validation and verification (V&V) matrix for the thermal-hydraulic subchannel code CTF. The focus of this study is on validating the void drift modeling capabilities of CTF and verifying the supporting models that impact the void drift phenomenon. CTF uses a simple turbulent-diffusion approximation to model lateral cross-flow due to turbulent mixing and void drift. The void drift component of the model is based on the Lahey and Moody model. The models are a function of the two-phase mass, momentum, and energy distribution in the system; therefore, it is necessary to correctly model the flow distribution in rod bundle geometry as a first step toward correctly calculating the void distribution due to void drift.

  16. 42 CFR 478.15 - QIO review of changes resulting from DRG validation.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false QIO review of changes resulting from DRG validation... review of changes resulting from DRG validation. (a) General rules. (1) A provider or practitioner dissatisfied with a change to the diagnostic or procedural coding information made by a QIO as a result of DRG...

  17. Systematic and efficient side chain optimization for molecular docking using a cheapest-path procedure.

    Science.gov (United States)

    Schumann, Marcel; Armen, Roger S

    2013-05-30

    Molecular docking of small-molecules is an important procedure for computer-aided drug design. Modeling receptor side chain flexibility is often important or even crucial, as it allows the receptor to adopt new conformations as induced by ligand binding. However, the accurate and efficient incorporation of receptor side chain flexibility has proven to be a challenge due to the huge computational complexity required to adequately address this problem. Here we describe a new docking approach with a very fast, graph-based optimization algorithm for assignment of the near-optimal set of residue rotamers. We extensively validate our approach using the 40 DUD target benchmarks commonly used to assess virtual screening performance and demonstrate a large improvement using the developed side chain optimization over rigid receptor docking (average ROC AUC of 0.693 vs. 0.623). Compared to numerous benchmarks, the overall performance is better than nearly all other commonly used procedures. Furthermore, we provide a detailed analysis of the level of receptor flexibility observed in docking results for different classes of residues and elucidate potential avenues for further improvement. Copyright © 2013 Wiley Periodicals, Inc.
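
    The following is a heavily simplified sketch of the "cheapest path" idea: if the flexible residues are treated as consecutive layers of a graph, each holding candidate rotamers with self- and pairwise energies, a dynamic program can pick the minimum-energy assignment. The graph construction, energy terms and numbers are illustrative assumptions and do not reproduce the authors' algorithm.

    ```python
    # Toy cheapest-path rotamer assignment over a chain of residues (illustrative only).

    def cheapest_rotamer_path(self_energy, pair_energy):
        """self_energy[i][a]    : energy of rotamer a at residue i
           pair_energy[i][a][b] : interaction energy between rotamer a at residue i-1
                                  and rotamer b at residue i (for i >= 1)"""
        n = len(self_energy)
        cost = [list(self_energy[0])]          # best cost to reach each rotamer of residue 0
        back = [[None] * len(self_energy[0])]
        for i in range(1, n):
            layer_cost, layer_back = [], []
            for b, e_b in enumerate(self_energy[i]):
                best_a = min(range(len(self_energy[i - 1])),
                             key=lambda a: cost[i - 1][a] + pair_energy[i][a][b])
                layer_cost.append(cost[i - 1][best_a] + pair_energy[i][best_a][b] + e_b)
                layer_back.append(best_a)
            cost.append(layer_cost)
            back.append(layer_back)
        # trace back the cheapest assignment
        b = min(range(len(cost[-1])), key=lambda x: cost[-1][x])
        path = [b]
        for i in range(n - 1, 0, -1):
            b = back[i][b]
            path.append(b)
        return list(reversed(path)), min(cost[-1])

    # Tiny example: 3 residues with 2 candidate rotamers each (energies are arbitrary)
    self_e = [[1.0, 2.0], [0.5, 1.5], [2.0, 0.2]]
    pair_e = [None,
              [[0.1, 0.9], [0.4, 0.3]],
              [[0.2, 0.6], [0.8, 0.1]]]
    print(cheapest_rotamer_path(self_e, pair_e))   # -> ([0, 0, 1], 2.4)
    ```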

  18. Estimating Global Burden of Disease due to congenital anomaly

    DEFF Research Database (Denmark)

    Boyle, Breidge; Addor, Marie-Claude; Arriola, Larraitz

    2018-01-01

    OBJECTIVE: To validate the estimates of Global Burden of Disease (GBD) due to congenital anomaly for Europe by comparing infant mortality data collected by EUROCAT registries with the WHO Mortality Database, and by assessing the significance of stillbirths and terminations of pregnancy for fetal...... the burden of disease due to congenital anomaly, and thus declining YLL over time may obscure lack of progress in primary, secondary and tertiary prevention....

  19. A procedure for the determination of Po-210 in water samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In the case of 210Po, this started with the collection and review of about 130 papers from the scientific literature. Based on this review, two candidate methods for the chemical separation of 210Po from water samples were selected for testing, refinement and validation in accordance with ISO guidelines. A comprehensive methodology for the calculation of results, including quantification of measurement uncertainty, was also developed. This report presents the final procedure, which was developed based on that work
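
    As an illustration of the result-calculation and uncertainty-quantification step mentioned above, the sketch below converts net counts into an activity concentration with first-order uncertainty propagation; the counts, efficiency, recovery and sample mass are invented placeholders rather than values from the IAEA procedure.

    ```python
    # Illustrative activity calculation with simple uncertainty propagation
    # (all input values below are assumptions, not data from the procedure).
    import math

    gross, t_g = 420, 60000.0       # gross counts and counting time (s)
    bkg, t_b = 12, 60000.0          # background counts and counting time (s)
    eff, u_eff = 0.25, 0.01         # detection efficiency and its uncertainty
    rec, u_rec = 0.85, 0.03         # chemical recovery (e.g. from a Po tracer) and uncertainty
    mass = 1.0                      # sample mass or volume (kg or L)

    net_rate = gross / t_g - bkg / t_b
    u_rate = math.sqrt(gross / t_g**2 + bkg / t_b**2)      # Poisson counting uncertainty

    activity = net_rate / (eff * rec * mass)               # Bq per kg (or per L)
    u_act = activity * math.sqrt((u_rate / net_rate) ** 2 +
                                 (u_eff / eff) ** 2 +
                                 (u_rec / rec) ** 2)
    print(f"activity = {activity:.4f} +/- {u_act:.4f} Bq/kg")
    ```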

  20. Validation of the Boston Carpal Tunnel Questionnaire in Russia

    Directory of Open Access Journals (Sweden)

    D. G. Yusupova

    2018-01-01

    International scales and questionnaires have become widespread in Russian neurology. Validation is a top-priority procedure that is necessary before applying this kind of diagnostic instrument in a Russian-speaking population. In this article, the validation of the Boston Carpal Tunnel Questionnaire (BCTQ), intended for patients with this disease, is described. Use of the validated Russian version would allow the severity of the clinical manifestations of carpal tunnel syndrome to be evaluated objectively and patient dynamics to be followed. We present the official BCTQ version recommended for use in Russia, as well as data that showed high sensitivity and reliability of this instrument for the clinical evaluation of carpal tunnel syndrome.

  1. Integrated System Validation Usability Questionnaire: Computerized Procedures; Desarrollo del Cuestionario de Facilidad de Uso para la Validación de Sistemas Integrados: Procedimientos de Operacion

    Energy Technology Data Exchange (ETDEWEB)

    Garcés, Ma. I.; Torralba, B.

    2015-07-01

    The Research and Development (R&D) project on “Theoretical and Methodological Approaches to Integrated System Validation of Control Rooms, 2014-2015”, in which the research activities described in this report are framed, has two main objectives: to develop the items for a usability methodology conceived as part of the measurement framework for performance-based control room evaluation that the OECD Halden Reactor Project will test in the experiments planned for 2015; and to statistically analyze the data generated in the experimental activities of the Halden Man-Machine Laboratory (HAMMLAB) facility with previous usability questionnaires in 2010 and 2011. In this report, the procedure designed to meet the first goal of the project is described, in particular the process followed to identify the items related to operating procedures, both computerized and paper-based, one of the elements to be included in the usability questionnaire. Three phases are performed: in the first one, the approaches developed by the United States Nuclear Regulatory Commission (NRC) are reviewed, the models used by the nuclear industry and its technical support organizations, mainly the Electric Power Research Institute (EPRI), are analyzed, and scientific advances are also explored. In the remaining stages, general and specific guidelines for computerized and paper-based procedures are compared, and criteria for the preliminary selection of the items that should be incorporated into the usability questionnaire are defined. This proposal will be reviewed and adapted by the Halden Reactor Project to the design of the specific experiments performed in HAMMLAB.

  2. CosmoQuest:Using Data Validation for More Than Just Data Validation

    Science.gov (United States)

    Lehan, C.; Gay, P.

    2016-12-01

    It is often taken for granted that different scientists completing the same task (e.g. mapping geologic features) will get the same results, and data validation is often skipped or under-utilized due to time and funding constraints. Robbins et al. (2014), however, demonstrated that this is a needed step, as large variation can exist even among collaborating team members completing straightforward tasks like marking craters. Data validation should be much more than a simple post-project verification of results. The CosmoQuest virtual research facility employs regular data validation for a variety of benefits, including real-time user feedback, real-time tracking to observe user activity while it is happening, and using pre-solved data to analyze users' progress and to help them retain skills. Some creativity in this area can drastically improve project results. We discuss methods of validating data in citizen science projects and outline the variety of uses for validation, which, when used properly, improves the scientific output of the project and the user experience for the citizens doing the work. More than just a tool for scientists, validation can assist users in both learning and retaining important information and skills, improving the quality and quantity of data gathered. Real-time analysis of user data can give key information on the effectiveness of the project that a broad glance would miss, and properly presenting that analysis is vital. Training users to validate their own data, or the data of others, can significantly improve the accuracy of misinformed or novice users.
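
    A minimal sketch of the kind of validation against pre-solved data described above: a user's marks are compared with known ("gold") crater positions using a matching radius, yielding simple recall and false-mark counts. The coordinates and tolerance are invented for illustration.

    ```python
    # Sketch of validating citizen-science annotations against pre-solved "gold" data
    # (all coordinates and the matching radius are invented).
    import numpy as np

    gold = np.array([[10.0, 12.0], [40.0, 8.0], [25.0, 30.0]])      # known crater centres
    user = np.array([[10.5, 11.6], [26.0, 29.0], [60.0, 60.0]])     # one user's marks
    radius = 2.0                                                    # matching tolerance (px)

    d = np.linalg.norm(gold[:, None, :] - user[None, :, :], axis=2) # pairwise distances
    hits = (d.min(axis=1) <= radius).sum()          # gold craters recovered by the user
    false_marks = (d.min(axis=0) > radius).sum()    # user marks with no nearby gold crater

    print(f"recall = {hits / len(gold):.2f}, false marks = {false_marks}")
    ```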

  3. A software prototype development of human system interfaces for human factors engineering validation tests of SMART MCR

    International Nuclear Information System (INIS)

    Lim, Jong Tae; Han, Kwan Ho; Yang, Seung Won

    2011-02-01

    An integrated system validation test bed for use in human factors engineering (HFE) validation tests is being developed. The goal of this study is to develop a software prototype for HFE validation of the SMART MCR design. To achieve this, prototype specifications of the software were first developed. Software prototypes of the alarm reduction logic system, the Plant Protection System, the ESF-CCS, the Elastic Tile Alarm Indication, and EID-based HSIs were then implemented as code. Test procedures for the software prototypes were established to verify the completeness of the implemented code. Careful software testing was performed according to these test procedures, and the results were documented

  4. Validation of a clinical assessment tool for spinal anaesthesia.

    LENUS (Irish Health Repository)

    Breen, D

    2011-07-01

    There is a need for a procedure-specific means of assessment of clinical performance in anaesthesia. The aim of this study was to devise a tool for assessing the performance of spinal anaesthesia, which has both content and construct validity.

  5. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  6. 20 CFR 670.545 - How does Job Corps ensure that students receive due process in disciplinary actions?

    Science.gov (United States)

    2010-04-01

    ... receive due process in disciplinary actions? 670.545 Section 670.545 Employees' Benefits EMPLOYMENT AND... process in disciplinary actions? The center operator must ensure that all students receive due process in disciplinary proceedings according to procedures developed by the Secretary. These procedures must include, at...

  7. Validation of evaluated neutron standard cross sections

    International Nuclear Information System (INIS)

    Badikov, S.; Golashvili, T.

    2008-01-01

    Some steps of the validation and verification of the new version of the evaluated neutron standard cross sections were carried out. In particular: -) the evaluated covariance data were checked for physical consistency, -) energy-dependent evaluated cross sections were tested in the most important neutron benchmark field, the 252Cf spontaneous fission neutron field, -) a procedure for folding differential standard neutron data into group representation for the preparation of specialized libraries of the neutron standards was verified. The results of the validation and verification of the neutron standards can be summarized as follows: a) the covariance data of the evaluated neutron standards are physically consistent since all the covariance matrices of the evaluated cross sections are positive definite, b) the 252Cf spectrum-averaged standard cross sections are in agreement with the evaluated integral data (except for the 197Au(n,γ) reaction), c) a procedure for folding differential standard neutron data into group representation was tested; as a result, a specialized library of neutron standards in the ABBN 28-group structure was prepared for use in reactor applications. (authors)

  8. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
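
    As a small illustration of the code-verification idea discussed above (a generic sketch, not taken from the paper), the observed order of accuracy can be computed from discretization errors against a manufactured or exact solution on successively refined grids; all numbers below are hypothetical.

      # Minimal sketch of an observed-order-of-accuracy check, as used in code
      # verification with manufactured or exact solutions. Error values are illustrative.
      import math

      def observed_order(error_coarse, error_fine, refinement_ratio):
          """Observed order of accuracy p from errors on two grids (grid ratio r)."""
          return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

      # Hypothetical discretization errors on grids refined by a factor of 2
      errors = [4.0e-3, 1.1e-3, 2.9e-4]
      for e_coarse, e_fine in zip(errors, errors[1:]):
          p = observed_order(e_coarse, e_fine, 2.0)
          print(f"observed order ~ {p:.2f}")   # should approach the formal order of the scheme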

  9. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  10. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  11. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  12. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  13. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    Science.gov (United States)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support a better management of the marine environment (maritime security, environmental and resources protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system in the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger scale Mediterranean Forecasting System (MFS) with a spatial resolution of 1.5-2 km. WMOP aims at reproducing both the basin-scale ocean circulation and the mesoscale variability, which is known to play a crucial role due to its strong interaction with the large scale circulation in this region. An operational validation system has been developed to systematically assess the model outputs at daily, monthly and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures make it possible to monitor and certify the general realism of the daily production of the ocean forecasting system before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, transports in key sections) are computed every day both at the basin-scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.
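
    As an illustration of the kind of routine model-to-observation comparison such a validation system performs (a generic sketch, not SOCIB's actual code), bias, RMSE and correlation can be computed from co-located model and observed values; the SST values below are hypothetical.

      # Illustrative bias/RMSE/correlation computation for comparing a model field
      # (e.g., SST) against satellite or in situ observations; arrays are hypothetical.
      import numpy as np

      def validation_stats(model, obs):
          """Return bias, RMSE and correlation between co-located model/observation values."""
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          diff = model - obs
          bias = diff.mean()
          rmse = np.sqrt((diff ** 2).mean())
          corr = np.corrcoef(model, obs)[0, 1]
          return bias, rmse, corr

      model_sst = [18.2, 18.9, 19.4, 20.1, 21.0]   # degC, hypothetical daily values
      obs_sst   = [18.0, 19.1, 19.2, 20.4, 20.7]
      print(validation_stats(model_sst, obs_sst))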

  14. Development and Validation of the Meaning of Work Inventory among French Workers

    Science.gov (United States)

    Arnoux-Nicolas, Caroline; Sovet, Laurent; Lhotellier, Lin; Bernaud, Jean-Luc

    2017-01-01

    The purpose of this study was to validate a psychometric instrument among French workers for assessing the meaning of work. Following an empirical framework, a two-step procedure consisted of exploring and then validating the scale among distinctive samples. The consequent Meaning of Work Inventory is a 15-item scale based on a four-factor model,…

  15. 40 CFR 1054.501 - How do I run a valid emission test?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false How do I run a valid emission test... Procedures § 1054.501 How do I run a valid emission test? (a) Applicability. This subpart is addressed to you... provisions of 40 CFR 1065.405 describes how to prepare an engine for testing. However, you may consider...

  16. Italian version of Dyspnoea-12: cultural-linguistic validation, quantitative and qualitative content validity study.

    Science.gov (United States)

    Caruso, Rosario; Arrigoni, Cristina; Groppelli, Katia; Magon, Arianna; Dellafiore, Federica; Pittella, Francesco; Grugnetti, Anna Maria; Chessa, Massimo; Yorke, Janelle

    2018-01-16

    Dyspnoea-12 is a valid and reliable scale for assessing dyspnoeic symptoms, considering their severity and physical and emotional components. However, it was not available in Italian because it had not yet been translated and validated. For this reason, the aim of this study was to develop an Italian version of Dyspnoea-12, providing a cultural and linguistic validation supported by quantitative and qualitative content validity. This was a methodological study divided into two phases: phase one covered the cultural and linguistic validation; phase two tested the quantitative and qualitative content validity. Linguistic validation followed a standardized translation process. Quantitative content validity was assessed by computing the content validity ratio (CVR) and indices (I-CVIs and S-CVI) from the expert panellists' responses. Qualitative content validity was assessed by narrative analysis of the answers to three open-ended questions posed to the expert panellists, aimed at investigating the clarity and pertinence of the Italian items. The translation process found good agreement on the clarity of the items both among the six bilingual expert translators involved and among the ten voluntarily involved patients. CVR, I-CVIs and S-CVI were satisfactory for all the translated items. This study represents a pivotal step towards using Dyspnoea-12 among Italian patients. Future research is needed to investigate the construct validity and reliability of the Italian version of Dyspnoea-12, and to describe how the dyspnoea components (i.e. physical and emotional) affect the lives of patients with cardiorespiratory diseases.
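
    For readers unfamiliar with the content-validity indices mentioned above, the following minimal sketch shows how CVR (Lawshe), I-CVI and S-CVI/Ave are typically computed from expert panel ratings; the panel data and the six-expert panel size are hypothetical, not the study's.

      # Sketch of standard content-validity indices: Lawshe's CVR and item/scale CVI.
      # The expert ratings below are hypothetical.
      def cvr(n_essential, n_experts):
          """Lawshe content validity ratio: (n_e - N/2) / (N/2)."""
          return (n_essential - n_experts / 2) / (n_experts / 2)

      def i_cvi(relevance_ratings, threshold=3):
          """Proportion of experts rating the item >= threshold on a 1-4 relevance scale."""
          return sum(r >= threshold for r in relevance_ratings) / len(relevance_ratings)

      panel = {  # item -> ratings from a hypothetical 6-expert panel
          "item_1": [4, 4, 3, 4, 3, 4],
          "item_2": [3, 4, 4, 2, 4, 3],
      }
      i_cvis = {item: i_cvi(r) for item, r in panel.items()}
      s_cvi_ave = sum(i_cvis.values()) / len(i_cvis)      # S-CVI/Ave = mean of I-CVIs
      print(i_cvis, s_cvi_ave, cvr(n_essential=5, n_experts=6))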

  17. Age validation of canary rockfish (Sebastes pinniger) using two independent otolith techniques: lead-radium and bomb radiocarbon dating.

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, A H; Kerr, L A; Cailliet, G M; Brown, T A; Lundstrom, C C; Stanley, R D

    2007-11-04

    Canary rockfish (Sebastes pinniger) have long been an important part of recreational and commercial rockfish fishing from southeast Alaska to southern California, but localized stock abundances have declined considerably. Based on age estimates from otoliths and other structures, lifespan estimates vary from about 20 years to over 80 years. For the purpose of monitoring stocks, age composition is routinely estimated by counting growth zones in otoliths; however, age estimation procedures and lifespan estimates remain largely unvalidated. Typical age validation techniques have limited application for canary rockfish because they are deep dwelling and may be long lived. In this study, the unaged otolith of the pair from fish aged at the Department of Fisheries and Oceans Canada was used in one of two age validation techniques: (1) lead-radium dating and (2) bomb radiocarbon (¹⁴C) dating. Age estimate accuracy and the validity of age estimation procedures were validated based on the results from each technique. Lead-radium dating proved successful in determining that the minimum estimate of lifespan was 53 years and provided support for age estimation procedures up to about 50-60 years. These findings were further supported by Δ¹⁴C data, which indicated a minimum estimate of lifespan of 44 ± 3 years. Both techniques validate, to differing degrees, the age estimation procedures and provide support for inferring that canary rockfish can live more than 80 years.

  18. D0 Central Tracking Solenoid Energization, Controls, Interlocks and Quench Protection Initial Validation Procedures

    International Nuclear Information System (INIS)

    Jaskierny, W.; Hance, R.

    1998-01-01

    This note presents the inspection and tests to be performed on the DZERO solenoid energization, controls, interlocks and quench protection system before it is energized for the first time. This test is to be performed with a 5000A jumper at the end of the bus instead of the solenoid. This system is based in DZERO room 511. A copy of this note shall be annotated, signed and dated by the person coordinating the procedure; and filed with the system maintenance records. Annotations shall include comments about any aspect of the procedure that is abnormal or unsuccessful. The following inspections and tests shall be performed by persons knowledgeable about the system. Each individual test step should be reviewed and understood before proceeding with that step.

  19. Influence of the derivatization procedure on the results of the gaschromatographic fatty acid analysis of human milk and infant formulae.

    Science.gov (United States)

    Kohn, G; van der Ploeg, P; Möbius, M; Sawatzki, G

    1996-09-01

    Many different analytical procedures for the fatty acid analysis of infant formulae and human milk have been described. The objective was to study possible pitfalls in the use of different acid-catalyzed procedures compared to a base-catalyzed procedure based on sodium methoxide in methanol. The influence of the different methods on the relative fatty acid composition (wt% of total fatty acids) and the total fatty acid recovery rate (expressed as % of total lipids) was studied in two experimental LCP-containing formulae and a human milk sample. MeOH/HCl procedures were found to result in an incomplete transesterification of triglycerides if an additional nonpolar solvent like toluene or hexane is not added and a water-free preparation is not guaranteed. In infant formulae the low transesterification of triglycerides (up to only 37%) could result in a 100% overestimation of the relative amount of LCP if these fatty acids derive primarily from phospholipids. This is the case in infant formulae containing egg lipids as raw materials. In formulae containing fish oils and in human milk the efficacy of esterification results in incorrect absolute amounts of fatty acids but has no appreciable effect on the relative fatty acid distribution. This is due to the fact that in these samples LCP are primarily bound to triglycerides. Furthermore, in formulae based on butterfat the derivatization procedure should be designed so that losses of short-chain fatty acids due to evaporation steps are avoided. The procedure based on sodium methoxide was found to result in a satisfactory (about 90%) conversion of formula lipids and reliable contents of all individual fatty acids. Because human milk may contain a high amount of free fatty acids, which are not methylated by sodium methoxide, caution is expressed about the use of this reagent for fatty acid analysis of mother's milk. It is concluded that accurate fatty acid analysis of infant formulae and human milk requires a careful

  20. Does direct observation of procedural skills reflect trainee's progress in otolaryngology?

    Science.gov (United States)

    Awad, Z; Hayden, L; Muthuswamy, K; Ziprin, P; Darzi, A; Tolley, N S

    2014-06-01

    UK surgical trainees are required to undertake work-based assessments each year in order to progress in their training. Direct Observation of Procedural Skills (DOPS) is one of these assessments. We aim to investigate the validity of DOPS in assessing otolaryngology trainees at all levels. A retrospective search of the portfolios of all otolaryngology trainees in North Thames was carried out to identify otolaryngology-specific DOPS. A score (Cs) was calculated for each DOPS based on the percentage of satisfactorily-rated items. The overall performance rating (Ps) was analysed as a separate variable and compared with Cs. The Ps and Cs results were then compared across trainee grades and levels within each grade: core trainees (CT1-CT2) and specialty trainees (ST3-ST8). Seven hundred and sixty-seven otolaryngology DOPS were completed between August 2008 and September 2013. The tool was found to be reliable and internally consistent. Trainees in the ST grade had higher Cs and Ps scores than those in the CT grade. Otolaryngology DOPS is a useful tool in assessing otolaryngology trainees, especially from CT1 to ST3 level. DOPS can also differentiate between junior and senior trainees. However, it was not able to demonstrate progress at levels above ST3, most likely due to the simplicity of the procedures, which trainees tend to master in the first few years of training. © 2014 John Wiley & Sons Ltd.
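
    A minimal sketch of the Cs scoring described above (percentage of satisfactorily rated DOPS items); the rating labels and the example form are hypothetical, not the authors' actual data handling.

      # Hypothetical sketch: Cs = percentage of DOPS items rated satisfactory or better,
      # ignoring items marked as not observed.
      def completion_score(item_ratings, satisfactory=("satisfactory", "above expectations")):
          rated = [r for r in item_ratings if r is not None]        # drop 'not observed' items
          ok = sum(r in satisfactory for r in rated)
          return 100.0 * ok / len(rated)

      dops_form = ["satisfactory", "satisfactory", "above expectations",
                   "below expectations", None, "satisfactory"]
      print(completion_score(dops_form))   # 80.0 for this hypothetical form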

  1. Toward a CFD nose-to-tail capability - Hypersonic unsteady Navier-Stokes code validation

    Science.gov (United States)

    Edwards, Thomas A.; Flores, Jolen

    1989-01-01

    Computational fluid dynamics (CFD) research for hypersonic flows presents new problems in code validation because of the added complexity of the physical models. This paper surveys code validation procedures applicable to hypersonic flow models that include real gas effects. The current status of hypersonic CFD flow analysis is assessed with the Compressible Navier-Stokes (CNS) code as a case study. The methods of code validation discussed go beyond comparison with experimental data to include comparisons with other codes and formulations, component analyses, and estimation of numerical errors. Current results indicate that predictions of hypersonic flows of perfect gases and equilibrium air are well in hand. Pressure, shock location, and integrated quantities are relatively easy to predict accurately, while surface quantities such as heat transfer are more sensitive to the solution procedure. Modeling of transition to turbulence needs refinement, though preliminary results are promising.

  2. k0IAEA software validation at CDTN/CNEN, Brazil, using certified reference materials

    International Nuclear Information System (INIS)

    Menezes, M.A.B.C.; Jacimovic, R.

    2007-01-01

    The IAEA distributed the k0-IAEA software package to several laboratories. The Laboratory for Neutron Activation Analysis at CDTN/CNEN (Centro de Desenvolvimento da Tecnologia Nuclear/Comissão Nacional de Energia Nuclear), Belo Horizonte, Brazil, acquired the k0-IAEA software package during the Workshop on Nuclear Data for Activation Analysis, 2005, held at the Abdus Salam International Centre for Theoretical Physics, Trieste, Italy. This paper describes the validation procedure carried out at the local laboratory with the aim of validating the k0-IAEA software package. After the software was set up according to the guidelines, the procedure followed at CDTN/CNEN to validate the k0-IAEA software was to analyse several certified reference materials. The overall results indicate that the k0-IAEA software is working properly. (author)
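
    One common way to judge agreement between measured and certified values when validating against reference materials is the E_n score; this generic sketch is not part of the k0-IAEA package and uses hypothetical numbers.

      # Illustrative check of measured vs. certified values for a reference material,
      # using an E_n-style score with expanded uncertainties (coverage factor k).
      def e_n_score(measured, u_measured, certified, u_certified, k=2.0):
          """E_n = (x - X) / sqrt((k*u_x)^2 + (k*u_X)^2); |E_n| <= 1 is usually satisfactory."""
          return (measured - certified) / ((k * u_measured) ** 2 + (k * u_certified) ** 2) ** 0.5

      # Hypothetical elemental mass fraction (mg/kg) in a certified reference material
      print(e_n_score(measured=12.4, u_measured=0.5, certified=12.0, u_certified=0.3))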

  3. PolyNano M.6.1.1 Process validation state-of-the-art

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Calaon, Matteo

    2012-01-01

    Nano project. Methods for replication process validation are presented and will be further investigated in WP6 “Process Chain Validation” and applied to PolyNano study cases. Based on the available information, effective best practice standard process validation will be defined and implemented...... assessment methods, and presents measuring procedures/techniques suitable for replication fidelity studies. The report reviews state-of-the-art research results regarding replication obtained at different scales, tooling technologies based on surface replication, and process validation through design...

  4. A posteriori model validation for the temporal order of directed functional connectivity maps.

    Science.gov (United States)

    Beltz, Adriene M; Molenaar, Peter C M

    2015-01-01

    A posteriori model validation for the temporal order of neural directed functional connectivity maps is rare. This is striking because models that require sequential independence among residuals are regularly implemented. The aim of the current study was (a) to apply to directed functional connectivity maps of functional magnetic resonance imaging data an a posteriori model validation procedure (i.e., white noise tests of one-step-ahead prediction errors combined with decision criteria for revising the maps based upon Lagrange Multiplier tests), and (b) to demonstrate how the procedure applies to single-subject simulated, single-subject task-related, and multi-subject resting state data. Directed functional connectivity was determined by the unified structural equation model family of approaches in order to map contemporaneous and first order lagged connections among brain regions at the group- and individual-levels while incorporating external input, then white noise tests were run. Findings revealed that the validation procedure successfully detected unmodeled sequential dependencies among residuals and recovered higher order (greater than one) simulated connections, and that the procedure can accommodate task-related input. Findings also revealed that lags greater than one were present in resting state data: With a group-level network that contained only contemporaneous and first order connections, 44% of subjects required second order, individual-level connections in order to obtain maps with white noise residuals. Results have broad methodological relevance (e.g., temporal validation is necessary after directed functional connectivity analyses because the presence of unmodeled higher order sequential dependencies may bias parameter estimates) and substantive implications (e.g., higher order lags may be common in resting state data).
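
    A minimal sketch of a white-noise check on one-step-ahead prediction errors, in the spirit of the procedure described above; the residuals are simulated, and the Ljung-Box test from statsmodels stands in for whichever white-noise test the authors actually used.

      # White-noise check on (simulated) one-step-ahead prediction errors using the
      # Ljung-Box test; remaining serial dependence suggests the map needs revision.
      import numpy as np
      from statsmodels.stats.diagnostic import acorr_ljungbox

      rng = np.random.default_rng(0)
      residuals = rng.normal(size=200)                    # stand-in for model residuals

      lb = acorr_ljungbox(residuals, lags=[5, 10], return_df=True)
      print(lb)                                           # large p-values: no evidence of
                                                          # unmodeled sequential dependence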

  5. A posteriori model validation for the temporal order of directed functional connectivity maps

    Directory of Open Access Journals (Sweden)

    Adriene M. Beltz

    2015-08-01

    Full Text Available A posteriori model validation for the temporal order of neural directed functional connectivity maps is rare. This is striking because models that require sequential independence among residuals are regularly implemented. The aim of the current study was (a) to apply to directed functional connectivity maps of functional magnetic resonance imaging data an a posteriori model validation procedure (i.e., white noise tests of one-step-ahead prediction errors combined with decision criteria for revising the maps based upon Lagrange Multiplier tests), and (b) to demonstrate how the procedure applies to single-subject simulated, single-subject task-related, and multi-subject resting state data. Directed functional connectivity was determined by the unified structural equation model family of approaches in order to map contemporaneous and first order lagged connections among brain regions at the group- and individual-levels while incorporating external input, then white noise tests were run. Findings revealed that the validation procedure successfully detected unmodeled sequential dependencies among residuals and recovered higher order (greater than one) simulated connections, and that the procedure can accommodate task-related input. Findings also revealed that lags greater than one were present in resting state data: With a group-level network that contained only contemporaneous and first order connections, 44% of subjects required second order, individual-level connections in order to obtain maps with white noise residuals. Results have broad methodological relevance (e.g., temporal validation is necessary after directed functional connectivity analyses because the presence of unmodeled higher order sequential dependencies may bias parameter estimates) and substantive implications (e.g., higher order lags may be common in resting state data).

  6. Methods for decreasing population doses due to medical use of ionizing radiations

    International Nuclear Information System (INIS)

    Marej, A.N.

    1984-01-01

    The problem of the radiation safety of the population with regard to the irradiation of large numbers of people by diagnostic procedures carried out using X-ray and radiological methods of examination is considered. It is shown that excessive irradiation of the population due to X-ray diagnostic procedures can be prevented by implementing a complex of activities, including legislative, organizational, technical and other measures. Human exposure doses in diagnosis must not exceed the permissible ones, established on the basis of a cost-benefit criterion. The necessity of maximally limiting the exposure of pregnant women and children is emphasized

  7. Validation of comprehensive space radiation transport code

    International Nuclear Information System (INIS)

    Shinn, J.L.; Simonsen, L.C.; Cucinotta, F.A.

    1998-01-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The codes have been applied in the design of the SAGE-III instrument, resulting in material changes to control injurious neutron production, in the study of Space Shuttle single event upsets, and in validation with space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation

  8. Anaphylaxis due to head injury.

    Science.gov (United States)

    Bruner, Heather C; Bruner, David I

    2015-05-01

    Both anaphylaxis and head injury are often seen in the emergency department, but they are rarely seen in combination. We present a case of a 30-year-old woman who presented with anaphylaxis with urticaria and angioedema following a minor head injury. The patient responded well to intramuscular epinephrine without further complications or airway compromise. Prior case reports have reported angioedema from hereditary angioedema during dental procedures and maxillofacial surgery, but there have not been any cases of first-time angioedema or anaphylaxis due to head injury.

  9. Entropy Evaluation Based on Value Validity

    Directory of Open Access Journals (Sweden)

    Tarald O. Kvålseth

    2014-09-01

    Full Text Available Besides its importance in statistical physics and information theory, the Boltzmann-Shannon entropy S has become one of the most widely used and misused summary measures of various attributes (characteristics) in diverse fields of study. It has also been the subject of extensive and perhaps excessive generalizations. This paper introduces the concept of and criteria for value validity as a means of determining if an entropy takes on values that reasonably reflect the attribute being measured and that permit different types of comparisons to be made for different probability distributions. While neither S nor its relative entropy equivalent S* meets the value-validity conditions, certain power functions of S and S* do to a considerable extent. No parametric generalization offers any advantage over S in this regard. A measure based on Euclidean distances between probability distributions is introduced as a potential entropy that does comply fully with the value-validity requirements, and its statistical inference procedure is discussed.
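
    The quantities discussed above can be sketched as follows: the Shannon entropy S, its normalized form S* = S / ln(n), and a simple Euclidean distance from the uniform distribution. The last is only loosely in the spirit of the distance-based measure the paper introduces, not the author's exact formulation; the distribution is illustrative.

      # Shannon entropy, normalized entropy, and a Euclidean distance to the uniform
      # distribution for a discrete probability vector (illustrative only).
      import math

      def shannon_entropy(p):
          return -sum(pi * math.log(pi) for pi in p if pi > 0)

      def normalized_entropy(p):
          return shannon_entropy(p) / math.log(len(p))      # S* in [0, 1]

      def euclidean_distance_from_uniform(p):
          n = len(p)
          return math.sqrt(sum((pi - 1.0 / n) ** 2 for pi in p))

      p = [0.7, 0.2, 0.1]
      print(shannon_entropy(p), normalized_entropy(p), euclidean_distance_from_uniform(p))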

  10. Update on procedure-related risks for prenatal diagnosis techniques

    DEFF Research Database (Denmark)

    Tabor, Ann; Alfirevic, Zarko

    2010-01-01

    Introduction: As a consequence of the introduction of effective screening methods, the number of invasive prenatal diagnostic procedures is steadily declining. The aim of this review is to summarize the risks related to these procedures. Material and Methods: Review of the literature. Results: Data...... from randomised controlled trials as well as from systematic reviews and a large national registry study are consistent with a procedure-related miscarriage rate of 0.5-1.0% for amniocentesis as well as for chorionic villus sampling (CVS). In single-center studies performance may be remarkably good due...... not be performed before 15 + 0 weeks' gestation. CVS on the other hand should not be performed before 10 weeks' gestation due to a possible increase in risk of limb reduction defects. Discussion: Experienced operators have a higher success rate and a lower complication rate. The decreasing number of prenatal...

  11. 40 CFR 1048.501 - How do I run a valid emission test?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false How do I run a valid emission test... § 1048.501 How do I run a valid emission test? (a) Use the equipment and procedures for spark-ignition... 86.132-96(h) and then operate the engine for 60 minutes over repeat runs of the duty cycle specified...

  12. Mollusc reproductive toxicity tests - Development and validation of test guidelines

    DEFF Research Database (Denmark)

    Ducrot, Virginie; Holbech, Henrik; Kinnberg, Karin Lund

    The Organisation for Economic Cooperation and Development is promoting the development and validation of mollusc toxicity tests within its test guidelines programme, eventually aiming for the standardization of mollusc apical toxicity tests. Through collaborative work between academia, industry and stakeholders, this study aims to develop innovative partial life-cycle tests on the reproduction of the freshwater gastropods Potamopyrgus antipodarum and Lymnaea stagnalis, which are relevant candidate species for the standardization of mollusc apical toxicity tests assessing reprotoxic effects of chemicals. Draft standard operating procedures (SOPs) have been designed based upon literature and expert knowledge from project partners. Pre-validation studies have been implemented to validate the proposed test conditions and identify issues in performing the SOPs and analyzing test results.

  13. Ghanaian nurses' knowledge of invasive procedural pain and its effect on children, parents and nurses.

    Science.gov (United States)

    Anim-Boamah, Oboshie; Aziato, Lydia; Adabayeri, Victoria May

    2017-09-11

    To explore Ghanaian nurses' knowledge of invasive procedural pain in children who are in hospital and to identify the effect of unrelieved pain on children, parents and nurses. An exploratory, descriptive and qualitative design was adopted. A purposive sampling technique was used and individual face-to-face, semi-structured interviews were conducted with 16 registered nurses from four children's units at a hospital in the Eastern Region of Ghana. Thematic and content analyses were performed. Four themes emerged: types of invasive procedure; pain expression; pain assessment; and effects of unrelieved pain. Participants had adequate knowledge of painful invasive procedures, however, they were not aware of the range of available validated pain assessment tools, using observations and body language instead to assess pain. Ghanaian nurses require education on the use of validated rating scales to assess procedural pain in children. The inclusion of pain assessment and management in pre-registration curricula could improve knowledge. ©2012 RCN Publishing Company Ltd.

  14. Robot-assisted procedures in pediatric neurosurgery.

    Science.gov (United States)

    De Benedictis, Alessandro; Trezza, Andrea; Carai, Andrea; Genovese, Elisabetta; Procaccini, Emidio; Messina, Raffaella; Randi, Franco; Cossu, Silvia; Esposito, Giacomo; Palma, Paolo; Amante, Paolina; Rizzi, Michele; Marras, Carlo Efisio

    2017-05-01

    OBJECTIVE During the last 3 decades, robotic technology has rapidly spread across several surgical fields due to the continuous evolution of its versatility, stability, dexterity, and haptic properties. Neurosurgery pioneered the development of robotics, with the aim of improving the quality of several procedures requiring a high degree of accuracy and safety. Moreover, robot-guided approaches are of special interest in pediatric patients, who often have altered anatomy and challenging relationships between the diseased and eloquent structures. Nevertheless, the use of robots has been rarely reported in children. In this work, the authors describe their experience using the ROSA device (Robotized Stereotactic Assistant) in the neurosurgical management of a pediatric population. METHODS Between 2011 and 2016, 116 children underwent ROSA-assisted procedures for a variety of diseases (epilepsy, brain tumors, intra- or extraventricular and tumor cysts, obstructive hydrocephalus, and movement and behavioral disorders). Each patient received accurate preoperative planning of optimal trajectories, intraoperative frameless registration, surgical treatment using specific instruments held by the robotic arm, and postoperative CT or MR imaging. RESULTS The authors performed 128 consecutive surgeries, including implantation of 386 electrodes for stereo-electroencephalography (36 procedures), neuroendoscopy (42 procedures), stereotactic biopsy (26 procedures), pallidotomy (12 procedures), shunt placement (6 procedures), deep brain stimulation procedures (3 procedures), and stereotactic cyst aspiration (3 procedures). For each procedure, the authors analyzed and discussed accuracy, timing, and complications. CONCLUSIONS To the best of their knowledge, the authors present the largest reported series of pediatric neurosurgical cases assisted by robotic support. The ROSA system provided improved safety and feasibility of minimally invasive approaches, thus optimizing the surgical

  15. Pancreaticoduodenectomy: a rare procedure for the management of complex pancreaticoduodenal injuries.

    Science.gov (United States)

    Asensio, Juan A; Petrone, Patrizio; Roldán, Gustavo; Kuncir, Eric; Demetriades, Demetrios

    2003-12-01

    Pancreaticoduodenectomy (Whipple's procedure) is a formidable procedure when undertaken for severe pancreaticoduodenal injury. The purposes of this study were to review our experience with this procedure for trauma; to classify injury grades for both pancreatic and duodenal injuries in patients undergoing pancreaticoduodenectomy according to the American Association for the Surgery of Trauma-Organ Injury Scale for pancreatic and duodenal injury; and to validate existing indications for performance of this procedure. We performed a retrospective 126-month study (May 1992 to December 2002) of all patients admitted with proven complex pancreaticoduodenal injuries requiring pancreaticoduodenectomy. Eighteen patients were included; mean age was 32 +/- 12 years (SD), mean Revised Trauma Score was 6.84 +/- 2.13 (SD), and mean Injury Severity Score was 27 +/- 8 (SD). There were 17 penetrating injuries (94%) and 1 blunt injury (6%). One of 18 patients had an emergency department thoracotomy and died (100% mortality); 5 of the remaining 17 patients required operating room thoracotomies, and only 1 survived (80% mortality). There was 1 AAST-OIS pancreas grade IV injury, and there were 17 pancreas grade V injuries and 18 AAST-OIS duodenum grade V injuries. Indications for pancreaticoduodenectomy were: massive uncontrollable retropancreatic hemorrhage, 13 patients (72%); massive unreconstructable injury to the head of the pancreas/main pancreatic duct and intrapancreatic portion/distal common bile duct, 18 patients (100%); and massive unreconstructable injury, 18 patients (100%). Mean estimated blood loss was 6,888 +/- 7,866 mL, and overall survival was 67% (12 of 18 patients). Complex pancreaticoduodenal injuries requiring pancreaticoduodenectomy (Whipple's procedure) are uncommon but highly lethal; virtually all are classified as AAST-OIS grade V for both pancreas and duodenum. Current indications for performance of pancreaticoduodenectomy are valid and should be strictly

  16. The Procedural Queer: Substantive Due Process, "Lawrence v. Texas," and Queer Rhetorical Futures

    Science.gov (United States)

    Campbell, Peter Odell

    2012-01-01

    This essay discusses Justice Anthony M. Kennedy's choice to foreground arguments from due process rather than equal protection in the majority opinion in Lawrence v. Texas. Kennedy's choice can realize constitutional legal doctrine that is more consistent with radical queer politics than arguments from equal protection. Unlike some recent…

  17. A symptom based decision tree approach to boiling water reactor emergency operating procedures

    International Nuclear Information System (INIS)

    Knobel, R.C.

    1984-01-01

    This paper describes a decision tree approach to the development of BWR Emergency Operating Procedures for use by operators during emergencies. This approach utilizes the symptom-based Emergency Procedure Guidelines approved for implementation by the USNRC. Included in the paper is a discussion of the relative merits of the event-based Emergency Operating Procedures currently in use at US BWR plants. The body of the paper is devoted to a discussion of the decision tree approach to Emergency Operating Procedures soon to be implemented at two United States Boiling Water Reactor plants, why this approach solves many of the problems with procedures identified in the post-accident reviews of Three Mile Island procedures, and why only now is this approach both desirable and feasible. The paper discusses how nuclear plant simulators were involved in the development of the Emergency Operating Procedure decision trees, and in the verification and validation of these procedures. (orig./HP)

  18. Eye bank procedures: donor selection criteria.

    Science.gov (United States)

    Sousa, Sidney Júlio de Faria E; Sousa, Stella Barretto de Faria E

    2018-01-01

    Eye banks use sterile procedures to manipulate the eye, antiseptic measures for ocular surface decontamination, and rigorous criteria for donor selection to minimize the possibility of disease transmission due to corneal grafting. Donor selection focuses on analysis of medical records and specific post-mortem serological tests. To guide and standardize procedures, eye bank associations and government agencies provide lists of absolute and relative contraindications for use of the tissue based on donor health history. These lists are guardians of the Hippocratic principle "primum non nocere." However, each transplantation carries a risk of transmission of potentially harmful agents to the recipient. The aim of the procedures is not to eliminate risk, but to limit it to a reasonable level. The balance between safety and corneal availability needs to be maintained by exercising prudence without disproportionate rigor.

  19. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques ...... suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  20. Development of Small UAS Beyond-Visual-Line-of-Sight (BVLOS) Flight Operations: System Requirements and Procedures

    Directory of Open Access Journals (Sweden)

    Scott Xiang Fang

    2018-04-01

    Full Text Available Due to safety concerns about integrating small unmanned aircraft systems (UAS) into non-segregated airspace, aviation authorities have required a set of detect and avoid (DAA) systems to be equipped on small UAS for beyond-visual-line-of-sight (BVLOS) flight operations in civil airspace. However, the development of small UAS DAA systems also requires BVLOS flights for testing and validation. To mitigate operational risks for small UAS BVLOS flight operations, this paper proposes to initially test small UAS DAA systems in BVLOS flights in a restricted airspace with additional safety features. The paper then discusses the operating procedures and emergency action plans for small UAS BVLOS flight operations. The test results show that the safety systems developed can help improve operational safety for small UAS BVLOS flight operations.

  1. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.
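
    Two of the validation elements named above, linearity and detection/quantitation limits, can be estimated from a calibration curve as in the following sketch, which uses the common ICH-style estimates LOD = 3.3*s/slope and LOQ = 10*s/slope; the beryllium standards and instrument responses are hypothetical.

      # Linearity and LOD/LOQ estimation from a (hypothetical) beryllium calibration curve.
      import numpy as np

      conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])          # ug/L standards (hypothetical)
      signal = np.array([0.02, 0.26, 0.51, 1.03, 2.01])   # instrument response

      slope, intercept = np.polyfit(conc, signal, 1)       # least-squares calibration line
      residuals = signal - (slope * conc + intercept)
      s_res = residuals.std(ddof=2)                        # residual standard deviation

      r2 = np.corrcoef(conc, signal)[0, 1] ** 2
      lod = 3.3 * s_res / slope
      loq = 10.0 * s_res / slope
      print(f"r^2={r2:.4f}  LOD={lod:.3f} ug/L  LOQ={loq:.3f} ug/L")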

  2. Assessment of the MPACT Resonance Data Generation Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-26

    Currently, heterogeneous models are used to generate resonance self-shielded cross-section tables as a function of background cross section for important nuclides such as 235U and 238U, by performing CENTRM (Continuous Energy Transport Model) slowing-down calculations with MOC (Method of Characteristics) spatial discretization together with ESSM (Embedded Self-Shielding Method) calculations to obtain the background cross sections. The resonance self-shielded cross-section tables are then converted into subgroup data, which are used to estimate problem-dependent self-shielded cross sections in MPACT (Michigan Parallel Characteristics Transport Code). Although this procedure has been developed, and resonance data have been generated and validated by benchmark calculations, no assessment has been performed to review whether the resonance data are properly generated by the procedure and properly utilized in MPACT. This study focuses on assessing the procedure and its proper use in MPACT.

  3. Validating and comparing GNSS antenna calibrations

    Science.gov (United States)

    Kallio, Ulla; Koivula, Hannu; Lahtinen, Sonja; Nikkonen, Ville; Poutanen, Markku

    2018-03-01

    GNSS antennas have no fixed electrical reference point. The variation of the phase centre is modelled and tabulated in antenna calibration tables, which include the offset vector (PCO) and phase centre variation (PCV) for each frequency according to the elevations and azimuths of the incoming signal. Used together, PCV and PCO reduce the phase observations to the antenna reference point. The remaining biases, called the residual offsets, can be revealed by circulating and rotating the antennas on pillars. The residual offsets are estimated as additional parameters when combining the daily GNSS network solutions with full covariance matrix. We present a procedure for validating the antenna calibration tables. The dedicated test field, called Revolver, was constructed at Metsähovi. We used the procedure to validate the calibration tables of 17 antennas. Tables from the IGS and three different calibration institutions were used. The tests show that we were able to separate the residual offsets at the millimetre level. We also investigated the influence of the calibration tables from the different institutions on site coordinates by performing kinematic double-difference baseline processing of the data from one site with different antenna tables. We found small but significant differences between the tables.
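
    In simplified form, applying a calibration table means projecting the phase centre offset (PCO) onto the line of sight and adding the tabulated phase centre variation (PCV) for that direction; the sketch below uses a hypothetical offset and PCV value and ignores sign-convention details that differ between formats.

      # Simplified phase-centre correction for one signal direction:
      # PCO projected on the line of sight plus the tabulated PCV (values in mm).
      import math

      def phase_centre_correction(pco_neu, elevation_deg, azimuth_deg, pcv_mm):
          e, a = math.radians(elevation_deg), math.radians(azimuth_deg)
          # Unit vector to the satellite in the local north/east/up frame
          los = (math.cos(e) * math.cos(a), math.cos(e) * math.sin(a), math.sin(e))
          pco_term = sum(o * u for o, u in zip(pco_neu, los))
          return pco_term + pcv_mm

      # Hypothetical L1 offset (north, east, up) in mm and a tabulated PCV value
      print(phase_centre_correction((0.5, -0.3, 87.0), elevation_deg=40, azimuth_deg=120, pcv_mm=2.1))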

  4. The child-Langmuir limit for semiconductors: a numerical validation

    International Nuclear Information System (INIS)

    Caceres, M.J.; Carrillo, J.A.; Degond, P.

    2002-01-01

    The Boltzmann-Poisson system modeling the electron flow in semiconductors is used to discuss the validity of the Child-Langmuir asymptotics. The scattering kernel is approximated by a simple relaxation time operator. The Child-Langmuir limit gives an approximation of the current-voltage characteristic curves by means of a scaling procedure in which the ballistic velocity is much larger than the thermal one. We discuss the validity of the Child-Langmuir regime by performing detailed numerical comparisons between the simulation of the Boltzmann-Poisson system and the Child-Langmuir equations in test problems. (authors)
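
    For orientation, the classical planar vacuum-diode form of the Child-Langmuir space-charge-limited current density is J = (4/9)*eps0*sqrt(2q/m)*V^(3/2)/d^2; the short sketch below evaluates it numerically for illustrative values. The semiconductor scaling treated in the paper is analogous but not identical to this form.

      # Classical Child-Langmuir space-charge-limited current density (planar vacuum diode).
      import math

      EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
      Q_E  = 1.602176634e-19    # elementary charge, C
      M_E  = 9.1093837015e-31   # electron mass, kg

      def child_langmuir_j(voltage, gap):
          """J = (4/9) * eps0 * sqrt(2q/m) * V^(3/2) / d^2, in A/m^2."""
          return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * Q_E / M_E) * voltage ** 1.5 / gap ** 2

      print(child_langmuir_j(voltage=100.0, gap=1e-3))   # illustrative values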

  5. Radiation doses to patients in haemodynamic procedures

    International Nuclear Information System (INIS)

    Canadillas-Perdomo, B.; Catalan-Acosta, A.; Hernandez-Armas, J.; Perez-Martin, C.; Armas-Trujillo, D. de

    2001-01-01

    Interventional radio-cardiology delivers high doses to patients due to long fluoroscopy times and large series of radiographic images. The main objective of the present work is the determination of the dose-area product (DAP) in patients for three different types of cardiology procedure with X-rays. The effective doses were estimated through the organ dose values measured with thermoluminescent dosimeters (TLD-100), suitably calibrated, placed in a Rando-type phantom which was submitted to the same radiological conditions as the procedures performed on patients. The effective dose was 6.20 mSv on average for the CAD Seldinger procedures and 1.85 mSv for pacemaker implants. (author)
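
    Effective dose is assembled from organ equivalent doses as E = sum over tissues of w_T * H_T; the sketch below illustrates the bookkeeping with a small subset of ICRP tissue weighting factors and hypothetical TLD-derived organ doses, not the values measured in this work.

      # Effective dose from organ equivalent doses: E = sum_T w_T * H_T.
      # Only a few ICRP 103 tissue weighting factors are shown; doses are hypothetical.
      tissue_weights = {"lung": 0.12, "stomach": 0.12, "gonads": 0.08, "thyroid": 0.04}
      organ_dose_mSv = {"lung": 8.5, "stomach": 5.1, "gonads": 0.9, "thyroid": 1.7}

      effective_dose = sum(w * organ_dose_mSv[t] for t, w in tissue_weights.items())
      print(f"partial effective dose ~ {effective_dose:.2f} mSv")   # sum over the listed tissues only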

  6. Radiation doses to patients in haemodynamic procedures

    Energy Technology Data Exchange (ETDEWEB)

    Canadillas-Perdomo, B; Catalan-Acosta, A; Hernandez-Armas, J [Servicio de Fisica Medica, Hospital Universitario de Canarias, La Laguna, Tenerife (Spain); Perez-Martin, C [Servicio de Ingenieria Biomedica, Hospital Universitario de Canarias, La Laguna, Tenerife (Spain); Armas-Trujillo, D de [Servicio de Cardiologia, Hospital Universitario de Canarias, La Laguna, Tenerife (Spain)

    2001-03-01

    Interventional radio-cardiology delivers high doses to patients due to long fluoroscopy times and large series of radiographic images. The main objective of the present work is the determination of the dose-area product (DAP) in patients for three different types of cardiology procedure with X-rays. The effective doses were estimated through the organ dose values measured with thermoluminescent dosimeters (TLD-100), suitably calibrated, placed in a Rando-type phantom which was submitted to the same radiological conditions as the procedures performed on patients. The effective dose was 6.20 mSv on average for the CAD Seldinger procedures and 1.85 mSv for pacemaker implants. (author)

  7. A procedure to validate and correct the ¹³C chemical shift calibration of RNA datasets

    Energy Technology Data Exchange (ETDEWEB)

    Aeschbacher, Thomas; Schubert, Mario, E-mail: schubert@mol.biol.ethz.ch; Allain, Frederic H.-T., E-mail: allain@mol.biol.ethz.ch [ETH Zuerich, Institute for Molecular Biology and Biophysics (Switzerland)

    2012-02-15

    Chemical shifts reflect the structural environment of a certain nucleus and can be used to extract structural and dynamic information. Proper calibration is indispensable to extract such information from chemical shifts. Whereas a variety of procedures exist to verify the chemical shift calibration for proteins, no such procedure is available for RNAs to date. We present here a procedure to analyze and correct the calibration of ¹³C NMR data of RNAs. Our procedure uses five ¹³C chemical shifts as a reference, each of them found in a narrow shift range in most datasets deposited in the Biological Magnetic Resonance Bank. In 49 datasets we could evaluate the ¹³C calibration and detect errors or inconsistencies in RNA ¹³C chemical shifts based on these chemical shift reference values. More than half of the datasets (27 out of those 49) were found to be improperly referenced or contained inconsistencies. This large inconsistency rate possibly explains why no clear structure-¹³C chemical shift relationship has emerged for RNA so far. We were able to recalibrate or correct 17 datasets, resulting in 39 usable ¹³C datasets. Six new datasets from our lab were used to verify our method, increasing the database to 45 usable datasets. We can now search for structure-chemical shift relationships with this improved list of ¹³C chemical shift data. This is demonstrated by a clear relationship between ribose ¹³C shifts and the sugar pucker, which can be used to predict a C2'- or C3'-endo conformation of the ribose with high accuracy. The improved quality of the chemical shift data allows statistical analysis with the potential to facilitate assignment procedures, and the extraction of restraints for structure calculations of RNA.
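
    In outline, the check described above compares a handful of well-conserved reference ¹³C shifts with their observed positions and applies the mean deviation as a recalibration offset; the reference values in the sketch are placeholders, not the paper's actual table.

      # Illustrative recalibration check: compare observed values of a few conserved
      # 13C reference shifts with expected positions and apply the mean deviation.
      reference_ppm = {"ref_1": 93.5, "ref_2": 141.0, "ref_3": 152.5}   # hypothetical references
      observed_ppm  = {"ref_1": 94.1, "ref_2": 141.7, "ref_3": 153.0}

      offsets = [observed_ppm[k] - reference_ppm[k] for k in reference_ppm]
      correction = sum(offsets) / len(offsets)                # mean deviation in ppm
      recalibrated = {k: v - correction for k, v in observed_ppm.items()}
      print(f"apply -{correction:.2f} ppm to the 13C axis", recalibrated)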

  8. Study Protocol for the Preschooler Regulation of Emotional Stress (PRES) Procedure

    Directory of Open Access Journals (Sweden)

    Livio Provenzi

    2017-09-01

    Full Text Available Background: Emotional stress regulation (ESR) rapidly develops during the first months of age and includes different behavioral strategies which largely contribute to children's behavioral and emotional adjustment later in life. The assessment of ESR during the first years of life is critical to identify preschool children who are at developmental risk. Although ESR is generally included in larger temperament batteries [e.g., the Laboratory Temperament Assessment Battery (Lab-TAB)], there is no standardized observational procedure to specifically assess and measure ESR in preschool-aged children. Aim: Here, we describe the development of an observational procedure to assess ESR in preschool-aged children [i.e., the Preschooler Regulation of Emotional Stress (PRES) Procedure] and the related coding system. Methods: Four Lab-TAB emotional stress episodes (i.e., the Stranger, the Perfect Circle, the Missing Sticker, and the Transparent Box) have been selected. Independent coders developed a list of ESR codes resulting in two general indexes (i.e., active engagement and stress level) and five specific indexes (i.e., anger, control, fear, inhibition, sadness). Finally, specific actions have been planned to assess the validity and the coding-system reliability of the PRES procedure. Ethics and Dissemination: The study has been approved by the Ethical Committee of the Scientific Institute IRCCS Eugenio Medea, Bosisio Parini (Italy). The PRES validation and reliability assessment, as well as its use with healthy and at-risk populations of preschool children, will be the object of future scientific publications and international conference presentations.

  9. Validation of novel recipes for double-blind, placebo-controlled food challenges in children and adults

    NARCIS (Netherlands)

    Vlieg-Boerstra, B. J.; Herpertz, I.; Pasker, L.; van der Heide, S.; Kukler, J.; Jansink, C.; Vaessen, W.; Beusekamp, B. J.; Dubois, A. E. J.

    2011-01-01

    In double-blind, placebo-controlled food challenges (DBPCFCs), the use of challenge materials in which blinding is validated is a prerequisite for obtaining true blinded conditions during the test procedure. Therefore, the aim of this study was to enlarge the available range of validated recipes for

  10. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias G

    2011-01-01

    We have implemented a simple, inexpensive, and fast procedure for validation and verification of the performance of pipettes mounted on automated liquid handlers (ALHs), as necessary for laboratories accredited under ISO 17025. A six- or seven-step serial dilution of OrangeG was prepared in quadruplicate (...). (...) are freely available. In conclusion, we have set up a simple, inexpensive, and fast solution for the continuous validation of ALHs used for accredited work according to the ISO 17025 standard. The method is easy to use for aqueous solutions but requires a spectrophotometer that can read microtiter plates.
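
    A hedged sketch of the underlying acceptance test: absorbance readings from the serial dilution are regressed against the expected relative concentrations and the pipetting is accepted if the fit is sufficiently linear. The dilution factor, mock readings and acceptance limit are assumptions for illustration, not values from the published procedure.

```python
import numpy as np

# Illustrative sketch: check linearity of a serial dilution measured on a
# plate reader. Dilution factor, readings and acceptance limit are assumed,
# not taken from the published procedure.
dilution_factor = 2.0
n_steps = 7
expected = np.array([1.0 / dilution_factor**i for i in range(n_steps)])
absorbance = np.array([1.62, 0.83, 0.40, 0.21, 0.10, 0.051, 0.026])  # mock readings

slope, intercept = np.polyfit(expected, absorbance, 1)
predicted = slope * expected + intercept
ss_res = np.sum((absorbance - predicted) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"slope={slope:.3f}, intercept={intercept:.4f}, R^2={r_squared:.4f}")
print("PASS" if r_squared >= 0.999 else "FAIL")  # assumed acceptance criterion
```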

  11. The management of unimplantable stent during endovascular procedure:report of three cases

    International Nuclear Information System (INIS)

    Xiong Jiang; Wang Lijun; Guo Wei; Liu Xiaoping; Yin Tai; Jia Xin; Ma Xiaohui; Liu Meng; Zhang Hongpeng; Zhang Minhong

    2010-01-01

    Objective: To summarize the experience of dealing with difficulties in stent implantation encountered during endovascular procedures. Methods: The causes of failed stent implantation encountered during endovascular procedures included entrapment of the delivery system due to stenosis and shrinkage of a peripheral self-expandable stent, failure to implant or retrieve a balloon-expandable stent due to balloon rupture or opening of the stent edge, and entrapment of the delivery system of an aortic stent-graft due to aortic kinking. Balloon dilation of the stenosed and shrunken stent, use of a large-caliber introducer sheath to remove the ruptured balloon and the stent with an opened edge, and balloon-assisted retrieval of the expandable stent delivery system were used to solve these three problems encountered during the endovascular procedure. Results: All three stent implantation problems were solved by endovascular means, with little additional incisional injury to the patients. Conclusion: Endovascular techniques are the first-choice method for solving stent implantation problems; nevertheless, it is very important for the operator to be highly skilled in endovascular procedures. (authors)

  12. Quality assurance procedures for the CONTAIN severe reactor accident computer code

    International Nuclear Information System (INIS)

    Russell, N.A.; Washington, K.E.; Bergeron, K.D.; Murata, K.K.; Carroll, D.E.; Harris, C.L.

    1991-01-01

    The CONTAIN quality assurance program follows a strict set of procedures designed to ensure the integrity of the code, to avoid errors in the code, and to prolong the life of the code. The code itself is maintained under a code-configuration control system that provides a historical record of changes. All changes are incorporated using an update processor that allows separate identification of improvements made to each successive code version. Code modifications and improvements are formally reviewed and checked. An exhaustive, multilevel test program validates the theory and implementation of all code changes through assessment calculations that compare the code-predicted results to standard handbooks of idealized test cases. A document trail and archive establish the problems solved by the software, the verification and validation of the software, software changes and subsequent reverification and revalidation, and the tracking of software problems and actions taken to resolve those problems. This document describes in detail the CONTAIN quality assurance procedures. 4 refs., 21 figs., 4 tabs

  13. New procedure for departure formalities

    CERN Multimedia

    HR & GS Departments

    2011-01-01

    As part of the process of simplifying procedures and rationalising administrative processes, the HR and GS Departments have introduced new personalised departure formalities on EDH. These new formalities have applied to students leaving CERN since last year and from 17 October 2011 this procedure will be extended to the following categories of CERN personnel: Staff members, Fellows and Associates. It is planned to extend this electronic procedure to the users in due course. What purpose do departure formalities serve? The departure formalities are designed to ensure that members of the personnel contact all the relevant services in order to return any necessary items (equipment, cards, keys, dosimeter, electronic equipment, books, etc.) and are aware of all the benefits to which they are entitled on termination of their contract. The new departure formalities on EDH have the advantage of tailoring the list of services that each member of the personnel must visit to suit his individual contractual and p...

  14. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large-scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems together with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  15. Internal Control Organization Procedure

    OpenAIRE

    Radu Dorin Lenghel

    2013-01-01

    Internal control represents the totality of policies and procedures adopted by management, which contribute to the fulfilment of managerial objectives, to the prevention and detection of fraud or errors, to the accuracy and completeness of accounting entries, as well as to the timely preparation of financial accounting information. Internal control represents a managerial instrument which assures the fulfilment of the objectives of the entity, being an ongoing process in which administ...

  16. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  17. Toward feasible, valid, and reliable video-based assessments of technical surgical skills in the operating room

    DEFF Research Database (Denmark)

    Aggarwal, R.; Grantcharov, T.; Moorthy, K.

    2008-01-01

    Objective: To determine the feasibility, validity, inter-rater, and intertest reliability of 4 previously published video-based rating scales for technical skills assessment on a benchmark laparoscopic procedure. Summary Background Data: Assessment of technical skills is crucial to the demonstration and maintenance of competent healthcare practitioners. Traditional assessment methods are prone to subjectivity through a lack of proven validity and reliability. Methods: Nineteen surgeons (6 novice and 13 experienced) performed a median of 2 laparoscopic cholecystectomies each (range 1-5) on 53... (.72). Conclusions: Video-based technical skills evaluation in the operating room is feasible, valid and reliable. Global rating scales hold promise for summative assessment, though further work is necessary to elucidate the value of procedural rating scales. Publication date: 2008/2

  18. Approaches for accounting and prediction of fast neutron fluence on WWER pressure vessels and results of validation of calculational procedure

    International Nuclear Information System (INIS)

    Borodkin, P.G.; Khrennikov, N.N.; Ryabinin, Yu.A.; Adeev, V.A.

    2015-01-01

    A description is given of the universal procedure for calculating the fast neutron fluence (FNF) on WWER vessels. Approbation of the calculation procedure was carried out by comparing its results with measurements on the outer surface of WWER-440 and WWER-1000 vessels. In addition, an estimation of the uncertainty of the calculation procedure was made in accordance with the requirements of regulatory documents. The developed procedure is applied at the Kola NPP for independent fast neutron fluence estimates on the WWER-440 reactor vessels when planning core loads, taking into account the introduction of new fuels. The results of the pilot operation of the procedure for calculating FNF at the Kola NPP were taken into account when improving the procedure and applying it to calculations of FNF on WWER-1000 vessels

  19. Selenium speciation in phosphate mine soils and evaluation of a sequential extraction procedure using XAFS

    Energy Technology Data Exchange (ETDEWEB)

    Favorito, Jessica E.; Luxton, Todd P.; Eick, Matthew J.; Grossl, Paul R. (VP); (Utah SU); (EPA)

    2017-10-01

    Selenium is a trace element found in western US soils, where ingestion of Se-accumulating plants has resulted in livestock fatalities. Therefore, a reliable understanding of Se speciation and bioavailability is critical for effective mitigation. Sequential extraction procedures (SEP) are often employed to examine Se phases and speciation in contaminated soils but may be limited by experimental conditions. We examined the validity of a SEP using X-ray absorption spectroscopy (XAS) for both whole and a sequence of extracted soils. The sequence included removal of soluble, PO4-extractable, carbonate, amorphous Fe-oxide, crystalline Fe-oxide, organic, and residual Se forms. For whole soils, XANES analyses indicated Se(0) and Se(-II) predominated, with lower amounts of Se(IV) present, related to carbonates and Fe-oxides. Oxidized Se species were more elevated and residual/elemental Se was lower than previous SEP results from ICP-AES suggested. For soils from the SEP sequence, XANES results indicated only partial recovery of carbonate, Fe-oxide and organic Se. This suggests Se was incompletely removed during designated extractions, possibly due to lack of mineral solubilization or reagent specificity. Selenium fractions associated with Fe-oxides were reduced in amount or removed after using hydroxylamine HCl for most soils examined. XANES results indicate partial dissolution of solid-phases may occur during extraction processes. This study demonstrates why precautions should be taken to improve the validity of SEPs. Mineralogical and chemical characterizations should be completed prior to SEP implementation to identify extractable phases or mineral components that may influence extraction effectiveness. Sequential extraction procedures can be appropriately tailored for reliable quantification of speciation in contaminated soils.

  20. EXTRAPOLATING THE SUITABILITY OF SOILS AS NATURAL REACTORS USING AN EXISTING SOIL MAP: APPLICATION OF PEDOTRANSFER FUNCTIONS, SPATIAL INTEGRATION AND VALIDATION PROCEDURES

    Directory of Open Access Journals (Sweden)

    Yameli Guadalupe Aguilar Duarte

    2011-04-01

    Full Text Available The aim of this study was the spatial identification of the suitability of soils as reactors in the treatment of swine wastewater in the Mexican state of Yucatan, as well as the development of a map with validation procedures. Pedotransfer functions were applied to the existing soils database. A methodological approach was adopted that allowed the spatialization of pedotransfer function data points. A map of the suitability of soil associations as reactors was produced, as well as a map of the level of accuracy of the associations using numerical classification techniques such as discriminant analysis. Soils with the highest suitability indices were found to be Vertisols, Stagnosols, Nitisols and Luvisols. Some 83.9% of the area of Yucatan is marginally suitable for the reception of swine wastewater, 6.5% is moderately suitable, while 6% is suitable. The percentages of spatial accuracy of the pedotransfer functions range from 62% to 95%, with an overall value of 71.5%. The methodological approach proved to be practical, accurate and inexpensive.
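
    The accuracy assessment step (numerical classification with discriminant analysis) can be sketched generically as below; the soil covariates, suitability labels and cross-validation scheme are random stand-ins, not the Yucatan database used in the study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Generic sketch of checking map-unit suitability classes with discriminant
# analysis. Features and labels are random stand-ins for the soil database.
rng = np.random.default_rng(7)
n_profiles = 300
X = np.column_stack([
    rng.normal(30, 10, n_profiles),    # e.g. clay content (%)
    rng.normal(6.5, 1.0, n_profiles),  # e.g. pH
    rng.normal(2.0, 0.8, n_profiles),  # e.g. organic carbon (%)
])
suitability = rng.integers(0, 3, n_profiles)  # 0 marginal, 1 moderate, 2 suitable

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, suitability, cv=5)
print(f"cross-validated classification accuracy: {scores.mean():.1%}")
```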

  1. Service Providers’ Willingness to Change as Innovation Inductor in Services: Validating a Scale

    Directory of Open Access Journals (Sweden)

    Marina Figueiredo Moreir

    2016-12-01

    Full Text Available This study explores the willingness of service providers to incorporate changes suggested by clients that alter previously planned services during their delivery, herein named Willingness to Change in Services [WCS]. We apply qualitative research techniques to map seven dimensions related to this phenomenon: Client relationship management; Organizational conditions for change; Software characteristics and development; Conditions affecting teams; Administrative procedures and decision-making conditions; Entrepreneurial behavior; Interaction with supporting organizations. These dimensions were converted into variables composing a WCS scale, which was then submitted to theoretical and semantic validation. The resulting 26-variable scale was applied in a large survey of 351 typical Brazilian software development service companies operating all over the country. Data from our sample were submitted to multivariate statistical analysis to provide validation for the scale. After factorial analysis procedures, 24 items were validated and assigned to three factors representative of WCS: Organizational Routines and Values (12 variables); Organizational Structure for Change (6 variables); and Service Specificities (6 variables). As future contributions, we expect to see further testing of the WCS scale on other service activities to provide evidence about its limits and contributions to general service innovation theory.
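
    The factorial-analysis step reported above can be illustrated with a generic exploratory factor analysis extracting three factors from a 24-item response matrix; the random Likert-style data and the particular estimator are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Illustrative sketch of the factorial-analysis step: extract 3 factors from
# a 24-item response matrix. Random data stand in for the survey responses;
# the real study used its own items and estimation choices.
rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 351, 24, 3
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)  # 5-point scale

fa = FactorAnalysis(n_components=n_factors, random_state=0)
fa.fit(responses)

loadings = fa.components_.T  # items x factors
for item in range(n_items):
    dominant = int(np.argmax(np.abs(loadings[item])))
    print(f"item {item + 1:2d} -> factor {dominant + 1} "
          f"(loading {loadings[item, dominant]:+.2f})")
```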

  2. Exposure to Surgery and Anesthesia After Concussion Due to Mild Traumatic Brain Injury.

    Science.gov (United States)

    Abcejo, Arnoley S; Savica, Rodolfo; Lanier, William L; Pasternak, Jeffrey J

    2017-07-01

    To describe the epidemiology of surgical and anesthetic procedures in patients recently diagnosed as having a concussion due to mild traumatic brain injury. Study patients presented to a tertiary care center after a concussion due to mild traumatic brain injury from July 1, 2005, through June 30, 2015, and underwent a surgical procedure and anesthesia support under the direct or indirect care of a physician anesthesiologist. During the study period, 1038 patients met all the study inclusion criteria and subsequently received 1820 anesthetics. In this population of anesthetized patients, rates of diagnosed concussions due to sports injuries, falls, and assaults, but not motor vehicle accidents, increased during 2010-2011. Concussions were diagnosed in 965 patients (93%) within 1 week after injury. In the 552 patients who had surgery within 1 week after concussive injury, 29 (5%) had anesthesia and surgical procedures unrelated to their concussion-producing traumatic injury. The highest use of surgery occurred early after injury and most frequently required general anesthesia. Orthopedic and general surgical procedures accounted for 57% of procedures. Nine patients received 29 anesthetics before a concussion diagnosis, and all of these patients had been involved in motor vehicle accidents and received at least 1 anesthetic within 1 week of injury. Surgical and anesthesia use are common in patients after concussion. Clinicians should have increased awareness for concussion in patients who sustain a trauma and may need to take measures to avoid potentially injury-augmenting cerebral physiology in these patients. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  3. A pilot study on the validity of using pictures and videos for individualized symptom provocation in obsessive-compulsive disorder.

    Science.gov (United States)

    Simon, Daniela; Kischkel, Eva; Spielberg, Rüdiger; Kathmann, Norbert

    2012-06-30

    Distressing symptom-related anxiety is difficult to study in obsessive-compulsive disorder (OCD) due to the disorder's heterogeneity. Our aim was to develop and validate a set of pictures and films comprising a variety of prominent OCD triggers that can be used for individually tailored symptom provocation in experimental studies. In a two-staged production procedure a large pool of OCD triggers and neutral contents was produced and preselected by three psychotherapists specialized in OCD. A sample of 13 OCD patients and 13 controls rated their anxiety, aversiveness and arousal during exposure to OCD-relevant, aversive and neutral control stimuli. Our findings demonstrate differences between the responses of patients and controls to OCD triggers only. Symptom-related anxiety was stronger in response to dynamic compared with static OCD-relevant stimuli. Due to the small number of 13 patients included in the study, only tentative conclusions can be drawn and this study merely provides a first step of validation. These standardized sets constitute valuable tools that can be used in experimental studies on the brain correlates of OCD symptoms and for the study of therapeutic interventions in order to contribute to future developments in the field. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Validated measurements of microbial loads on environmental surfaces in intensive care units before and after disinfecting cleaning.

    Science.gov (United States)

    Frickmann, H; Bachert, S; Warnke, P; Podbielski, A

    2018-03-01

    Preanalytic aspects can make results of hygiene studies difficult to compare. Efficacy of surface disinfection was assessed with an evaluated swabbing procedure. A validated microbial screening of surfaces was performed in the patients' environment and from hands of healthcare workers on two intensive care units (ICUs) prior to and after a standardized disinfection procedure. From a pure culture, the recovery rate of the swabs for Staphylococcus aureus was 35%-64% and dropped to 0%-22% from a mixed culture with 10-times more Staphylococcus epidermidis than S. aureus. Microbial surface loads 30 min before and after the cleaning procedures were indistinguishable. The quality-ensured screening procedure proved that adequate hygiene procedures are associated with a low overall colonization of surfaces and skin of healthcare workers. Unchanged microbial loads before and after surface disinfection demonstrated the low additional impact of this procedure in the endemic situation when the pathogen load prior to surface disinfection is already low. Based on a validated screening system ensuring the interpretability and reliability of the results, the study confirms the efficiency of combined hand and surface hygiene procedures to guarantee low rates of bacterial colonization. © 2017 The Society for Applied Microbiology.

  5. Procedures for sampling radium-contaminated soils

    International Nuclear Information System (INIS)

    Fleischhauer, H.L.

    1985-10-01

    Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a ''cookie cutter'' fashioned from pipe or steel plate is driven to the desired depth by means of a slide hammer, and the sample is extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon. Sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils, and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described
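
    As a quick worked check of the recommended minimum volume, a cylindrical cookie cutter with an assumed inner diameter of 10 cm driven to the full 15 cm sampling depth already satisfies the 1000 cubic centimeter requirement:

```latex
V = \pi r^{2} h = \pi\,(5\,\mathrm{cm})^{2}\,(15\,\mathrm{cm}) \approx 1178\ \mathrm{cm^{3}} \;\geq\; 1000\ \mathrm{cm^{3}}
```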

  6. Reliability and Validity of Curriculum-Based Informal Reading Inventories.

    Science.gov (United States)

    Fuchs, Lynn; And Others

    A study was conducted to explore the reliability and validity of three prominent procedures used in informal reading inventories (IRIs): (1) choosing a 95% word recognition accuracy standard for determining student instructional level, (2) arbitrarily selecting a passage to represent the difficulty level of a basal reader, and (3) employing…

  7. Face, content, and construct validity of human placenta as a haptic training tool in neurointerventional surgery.

    Science.gov (United States)

    Ribeiro de Oliveira, Marcelo Magaldi; Nicolato, Arthur; Santos, Marcilea; Godinho, Joao Victor; Brito, Rafael; Alvarenga, Alexandre; Martins, Ana Luiza Valle; Prosdocimi, André; Trivelato, Felipe Padovani; Sabbagh, Abdulrahman J; Reis, Augusto Barbosa; Maestro, Rolando Del

    2016-05-01

    OBJECT The development of neurointerventional treatments of central nervous system disorders has resulted in the need for adequate training environments for novice interventionalists. Virtual simulators offer anatomical definition but lack adequate tactile feedback. Animal models, which provide more lifelike training, require an appropriate infrastructure base. The authors describe a training model for neurointerventional procedures using the human placenta (HP), which affords haptic training with significantly fewer resource requirements, and discuss its validation. METHODS Twelve HPs were prepared for simulated endovascular procedures. Training exercises performed by interventional neuroradiologists and novice fellows were placental angiography, stent placement, aneurysm coiling, and intravascular liquid embolic agent injection. RESULTS The endovascular training exercises proposed can be easily reproduced in the HP. Face, content, and construct validity were assessed by 6 neurointerventional radiologists and 6 novice fellows in interventional radiology. CONCLUSIONS The use of HP provides an inexpensive training model for the training of neurointerventionalists. Preliminary validation results show that this simulation model has face and content validity and has demonstrated construct validity for the interventions assessed in this study.

  8. Maternal Near-Miss Due to Unsafe Abortion and Associated Short ...

    African Journals Online (AJOL)

    AJRH Managing Editor

    On average, treatment of MNM due to abortion costs six times more than induced abortion procedures. .... rooms and intensive care units to check whether there have been .... four others went to a pharmacy as soon as health problems arose.

  9. Behavior changes after minor emergency procedures.

    Science.gov (United States)

    Brodzinski, Holly; Iyer, Srikant

    2013-10-01

    Procedures are common in pediatric emergency departments and frequently cause distress from pain and/or anxiety. The objective of this study was to describe the incidence, types, and magnitude of long-term behavior changes after procedures in the emergency setting. This is a descriptive pilot study to determine if children display negative behavioral changes after a minor emergency department procedure (abscess drainage or laceration repair). Behavior change was measured at 1 week by telephone follow-up using the 27-item Post Hospitalization Behavior Questionnaire, a well-validated instrument that measures behavior changes across 6 categories: general anxiety, separation anxiety, anxiety about sleep, eating disturbances, aggression toward authority, and apathy/withdrawal. Significant behavior change was defined as 5 or more negative behavior changes on the 27-item questionnaire. Twenty percent of children who underwent abscess drainage (n = 30) and 20% who underwent laceration repair (n = 30) displayed significant negative behavior change at 1 week. Children who displayed significant negative behavior change tended to be younger (3.6 vs 5.9 years) and trended toward being more likely to have received anxiolysis or sedation (16.7% vs 8.3%). Separation anxiety, sleep difficulties, and aggression toward authority were the most common behavior changes. In this pilot study, a significant percentage of children undergoing common emergency procedures exhibited an appreciable burden of negative behavior change at 1 week; these results demonstrate the need for further rigorous investigation of predictors of these changes and interventions, which can ameliorate these changes.

  10. A procedure for the rapid determination of Pu isotopes and Am-241 in soil and sediment samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In this report, a rapid procedure for the determination of Pu and Am radionuclides in soil and sediment samples is described that can be used in emergency situations. The method provides accurate and reliable results for the activity concentrations of elevated levels of 239,240 Pu, 238 Pu and 241 Am in soil and sediment samples over the course of 24 hours. The procedure has been validated in accordance with ISO guidelines
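
    For orientation only, the final step of such alpha-spectrometric procedures is typically an activity-concentration calculation of the generic form below; the symbols are standard textbook quantities and are not taken from the IAEA report itself:

```latex
a = \frac{N_{\mathrm{net}}}{\varepsilon \cdot R \cdot t \cdot m} \quad \left[\mathrm{Bq\,kg^{-1}}\right]
```

    where N_net is the net count in the alpha peak of interest, epsilon the detector counting efficiency, R the chemical recovery determined from an added tracer, t the counting time in seconds and m the sample mass in kilograms.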

  11. French validation of the Foot Function Index (FFI).

    Science.gov (United States)

    Pourtier-Piotte, C; Pereira, B; Soubrier, M; Thomas, E; Gerbaud, L; Coudeyre, E

    2015-10-01

    French validation of the Foot Function Index (FFI), a self-questionnaire designed to evaluate the rheumatoid foot according to 3 domains: pain, disability and activity restriction. The first step consisted of translation/back-translation and cultural adaptation according to the validated methodology. The second stage was a prospective validation on 53 patients with rheumatoid arthritis who filled out the FFI. The following data were collected: pain (Visual Analog Scale), disability (Health Assessment Questionnaire) and activity restrictions (McMaster Toronto Arthritis questionnaire). A test/retest procedure was performed 15 days later. The statistical analyses focused on acceptability, internal consistency (Cronbach's alpha and Principal Component Analysis), test-retest reproducibility (concordance coefficients), external validity (correlation coefficients) and responsiveness to change. The FFI-F is a culturally acceptable version for French patients with rheumatoid arthritis. The Cronbach's alpha ranged from 0.85 to 0.97. Reproducibility was correct (correlation coefficients>0.56). External validity and responsiveness to change were good. The use of a rigorous methodology allowed the validation of the FFI in the French language (FFI-F). This tool can be used in routine practice and clinical research for evaluating the rheumatoid foot. The FFI-F could be used in other pathologies with foot-related functional impairments. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
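
    The internal-consistency figures quoted (Cronbach's alpha between 0.85 and 0.97) follow from the standard formula, sketched below on mock item scores; the data are synthetic and do not represent the FFI-F responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects x n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Mock data: 53 patients answering a 9-item subscale on a 0-10 scale
rng = np.random.default_rng(1)
latent = rng.normal(5, 2, size=(53, 1))                       # shared trait
scores = np.clip(latent + rng.normal(0, 1, size=(53, 9)), 0, 10)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```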

  12. 31 CFR 501.806 - Procedures for unblocking funds believed to have been blocked due to mistaken identity.

    Science.gov (United States)

    2010-07-01

    ... believed to have been blocked due to mistaken identity. 501.806 Section 501.806 Money and Finance: Treasury... funds believed to have been blocked due to mistaken identity. When a transaction results in the blocking... party to the transaction believes the funds have been blocked due to mistaken identity, that party may...

  13. A generic validation methodology and its application to a set of multi-axial creep damage constitutive equations

    International Nuclear Information System (INIS)

    Xu Qiang

    2005-01-01

    A generic validation methodology for a set of multi-axial creep damage constitutive equations is proposed and its use is illustrated with 0.5Cr0.5Mo0.25V ferritic steel, which features brittle or intergranular rupture. The objective of this research is to develop a methodology that guides the systematic assessment of the quality of a set of multi-axial creep damage constitutive equations in order to ensure its general applicability. This work adopted a total quality assurance approach, expanded as a four-stage procedure (Theories and Fundamentals, Parameter Identification, Proportional Load, and Non-proportional Load). Its use is illustrated with 0.5Cr0.5Mo0.25V ferritic steel; this material was chosen due to its industrial importance, the popular use of the KRH type of constitutive equations, and the available qualitative experimental data including the damage distribution from a notched bar test. The validation exercise clearly revealed the deficiencies in the KRH formulation (in terms of the mathematics and physics of damage mechanics) and its inability to predict creep deformation accurately. Consequently, its use should be treated with caution, which is particularly important given its wide use as indicated in the literature. This work contributes to the understanding of the rationale for the formulation and the quality assurance of a set of constitutive equations in creep damage mechanics as well as in general damage mechanics. (authors)

  14. On the validity of localized approximation for an on-axis zeroth-order Bessel beam

    International Nuclear Information System (INIS)

    Gouesbet, Gérard; Lock, J.A.; Ambrosio, L.A.; Wang, J.J.

    2017-01-01

    Localized approximation procedures are efficient ways to evaluate beam shape coefficients of laser beams, and are particularly useful when other methods are ineffective or inefficient. Several papers in the literature have reported the use of such procedures to evaluate the beam shape coefficients of Bessel beams. Examining the specific case of an on-axis zeroth-order Bessel beam, we demonstrate that localized approximation procedures are valid only for small axicon angles. - Highlights: • The localized approximation has been widely used to evaluate the Beam Shape Coefficients (BSCs) of Bessel beams. • The validity of this approximation is examined in the case of an on-axis zeroth-order Bessel beam. • It is demonstrated, in this specific example, that the localized approximation is efficient only for small enough axicon angles. • It is easily argued that this result must remain true for any kind of Bessel beams.
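
    For context, the ideal scalar field referred to here is commonly written (a standard textbook form, not quoted from the paper) as an on-axis zeroth-order Bessel beam with axicon angle alpha_0:

```latex
E(\rho, z) = E_{0}\, J_{0}\!\left(k\rho \sin\alpha_{0}\right)\, e^{\, i k z \cos\alpha_{0}}
```

    so the small-axicon-angle regime discussed in the abstract is the one in which the transverse wavenumber k sin(alpha_0) is small compared with k.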

  15. Anaphylaxis Due to Head Injury

    Directory of Open Access Journals (Sweden)

    Bruner, Heather C.

    2015-05-01

    Full Text Available Both anaphylaxis and head injury are often seen in the emergency department, but they are rarely seen in combination. We present the case of a 30-year-old woman who presented with anaphylaxis with urticaria and angioedema following a minor head injury. The patient responded well to intramuscular epinephrine without further complications or airway compromise. Prior case reports have described angioedema from hereditary angioedema during dental procedures and maxillofacial surgery, but there have not been any cases of first-time angioedema or anaphylaxis due to head injury. [West J Emerg Med. 2015;16(3):435–437.]

  16. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (VandV) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model VandV procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model VandV program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model VandV is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define VandV methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for VandV applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of VandV procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  17. Conception and validation of the Behavioral Intentions Scale of Organizational Citizenship (BISOC)

    Directory of Open Access Journals (Sweden)

    Ana Cristina Passos Gomes Menezes

    2016-01-01

    Full Text Available Abstract This study aimed to construct and validate the Behavioral Intentions of Organizational Citizenship Scale (BISOC. Organizational citizenship consists of measures of voluntary behaviors, which are beneficial to organizations and are not explicit in employment contracts. To investigate the psychometric properties of BISOC, we selected 767 employees in different cities from the states of Bahia and Pernambuco (Brazil. The validation procedures adopted, which used techniques from both Classical Test Theory and Item Response Theory, showed that the BISOC has a unidimensional structure. From the initial set of 42 items, 35 items met the validation criteria. By presenting suitable psychometric parameters, BISOC is the first measure of organizational citizenship behaviors developed and validated to assess behavioral intentions.

  18. Construct validation of teacher portfolio assessment : Procedures for improving teacher competence assessment illustrated by teaching students research skills

    NARCIS (Netherlands)

    Schaaf, M.F. van der

    2005-01-01

    The study aims to design and test procedures for teacher portfolio assessments. What are suitable procedures to assess teachers' competencies in developing students' research skills? We first searched into the tasks teachers have in teaching students research skills and the competencies needed to

  19. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012; its scope of work covers code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation work is organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, medium and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting the experimental data related to containment phenomena and constructing the corresponding database is one of the major tasks of the second stage of this project. From the validation of fundamental phenomena, the current capabilities and the necessary future improvements of the CAP code can be identified. For this purpose, simple but significant problems, which have exact analytical solutions, were selected and calculated for validation of fundamental phenomena. In this paper, some results of validation problems for the selected fundamental phenomena are summarized and discussed briefly

  20. Using qualitative methods to improve questionnaires for Spanish speakers: assessing face validity of a food behavior checklist.

    Science.gov (United States)

    Banna, Jinan C; Vera Becerra, Luz E; Kaiser, Lucia L; Townsend, Marilyn S

    2010-01-01

    Development of outcome measures relevant to health nutrition behaviors requires a rigorous process of testing and revision. Whereas researchers often report performance of quantitative data collection to assess questionnaire validity and reliability, qualitative testing procedures are often overlooked. This report outlines a procedure for assessing face validity of a Spanish-language dietary assessment tool. Reviewing the literature produced no rigorously validated Spanish-language food behavior assessment tools for the US Department of Agriculture's food assistance and education programs. In response to this need, this study evaluated the face validity of a Spanish-language food behavior checklist adapted from a 16-item English version of a food behavior checklist shown to be valid and reliable for limited-resource English speakers. The English version was translated using rigorous methods involving initial translation by one party and creation of five possible versions. Photos were modified based on client input and new photos were taken as necessary. A sample of low-income, Spanish-speaking women completed cognitive interviews (n=20). Spanish translation experts (n=7) fluent in both languages and familiar with both cultures made minor modifications but essentially approved client preferences. The resulting checklist generated a readability score of 93, indicating low reading difficulty. The Spanish-language checklist has adequate face validity in the target population and is ready for further validation using convergent measures. At the conclusion of testing, this instrument may be used to evaluate nutrition education interventions in California. These qualitative procedures provide a framework for designing evaluation tools for low-literate audiences participating in the US Department of Agriculture food assistance and education programs. Copyright 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  1. Developments in the preparation of operating procedures for emergency conditions of nuclear power plants

    International Nuclear Information System (INIS)

    1985-06-01

    In recent years a substantial effort has been devoted by the nuclear community to extend Emergency Operating Procedures (EOPs) to cover all conceivable events and to develop procedure formats that transmit the essential guidance to operators in an optimum way. The information given in this report is based upon the most recent developments in formulating and applying EOPs. It should therefore provide guidance to those involved in preparing or reviewing EOPs on the scope, technical basis, organization and format of such procedures. It also outlines the actions required to validate the adequacy and applicability of these procedures so that the correct operator actions are achieved. Examples are given to illustrate the developments in some Member States

  2. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    Science.gov (United States)

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    The validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. In order to decide whether there are any trends in the time variation of the analytical signal, the Neumann trend test was also applied and evaluated. Finally, a comparison with similar validation data for the ETV-ICP-OES method was carried out.
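
    The trend check mentioned above can be sketched with the von Neumann ratio, which compares the mean squared successive difference of a series with its sample variance; for a trend-free random series the ratio is close to 2, while markedly smaller values point to drift. The mock signal and the decision cut-off below are illustrative assumptions.

```python
import numpy as np

def von_neumann_ratio(x) -> float:
    """Ratio of mean squared successive difference to sample variance.

    For independent, trend-free observations the expected value is about 2;
    markedly smaller values indicate a trend or drift.
    """
    x = np.asarray(x, dtype=float)
    msd = np.sum(np.diff(x) ** 2) / (len(x) - 1)
    var = np.var(x, ddof=1)
    return msd / var

# Mock calibration signal measured over successive days (arbitrary units)
signal = [101.2, 100.8, 101.5, 100.9, 101.1, 100.7, 101.3, 101.0]
eta = von_neumann_ratio(signal)
print(f"von Neumann ratio = {eta:.2f}")
print("no evidence of trend" if eta > 1.3 else "possible trend")  # illustrative cut-off
```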

  3. Radiocontamination of agricultural workers due to nuclear accidents

    International Nuclear Information System (INIS)

    Petrovic, B.; Smelcerovic, M.; Djuric, G.; Popovic, D.

    1989-01-01

    In the radiocontamination of the environment due to nuclear accidents, agricultural workers should be considered a critical group of the population. The present paper discusses this problem from the aspect of fodder production. The values of the effective dose equivalent are estimated for different phases of the production process and certain procedures aimed at reducing the radiation risk are proposed (author)

  4. Radiocontamination of agricultural workers due to nuclear accidents

    Energy Technology Data Exchange (ETDEWEB)

    Petrovic, B [Faculty of Veterinary Medicine, Beograd, (Serbia and Montenegro); Smelcerovic, M; Djuric, G; Popovic, D [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1989-07-01

    In the radiocontamination of the environment due to nuclear accidents, agricultural workers should be considered a critical group of the population. The present paper discusses this problem from the aspect of fodder production. The values of the effective dose equivalent are estimated for different phases of the production process and certain procedures aimed at reducing the radiation risk are proposed (author)

  5. Validation of the sterile manufacture of the AAEC MARK III molybdenum-99/technetium-99m generator

    International Nuclear Information System (INIS)

    Saunders, M.T.; Drummond, C.M.; Harrison, M.A.

    1982-07-01

    The Mark II molybdenum-99/technetium-99m generator now supplied to hospitals by the Australian Atomic Energy Commission is a non-sterile elution system. The Mark III version will be supplied as a sterile elution system. A validation study has been undertaken to assess the capability of the new production facility, to evaluate up-to-date procedures for manufacturing sterile generators and to demonstrate that a sterile radionuclide generator can be made. Generator manufacturing procedures and a time study of the validation are described. Microbiological methods for monitoring in-process aspects of manufacture, disinfectant efficacy and generator sterility are defined

  6. Development of a diagnosis- and procedure-based risk model for 30-day outcome after pediatric cardiac surgery.

    Science.gov (United States)

    Crowe, Sonya; Brown, Kate L; Pagel, Christina; Muthialu, Nagarajan; Cunningham, David; Gibbs, John; Bull, Catherine; Franklin, Rodney; Utley, Martin; Tsang, Victor T

    2013-05-01

    The study objective was to develop a risk model incorporating diagnostic information to adjust for case-mix severity during routine monitoring of outcomes for pediatric cardiac surgery. Data from the Central Cardiac Audit Database for all pediatric cardiac surgery procedures performed in the United Kingdom between 2000 and 2010 were included: 70% for model development and 30% for validation. Units of analysis were 30-day episodes after the first surgical procedure. We used logistic regression for 30-day mortality. Risk factors considered included procedural information based on Central Cardiac Audit Database "specific procedures," diagnostic information defined by 24 "primary" cardiac diagnoses and "univentricular" status, and other patient characteristics. Of the 27,140 30-day episodes in the development set, 25,613 were survivals, 834 were deaths, and 693 were of unknown status (mortality, 3.2%). The risk model includes procedure, cardiac diagnosis, univentricular status, age band (neonate, infant, child), continuous age, continuous weight, presence of non-Down syndrome comorbidity, bypass, and year of operation 2007 or later (because of decreasing mortality). A risk score was calculated for 95% of cases in the validation set (weight missing in 5%). The model discriminated well; the C-index for validation set was 0.77 (0.81 for post-2007 data). Removal of all but procedural information gave a reduced C-index of 0.72. The model performed well across the spectrum of predicted risk, but there was evidence of underestimation of mortality risk in neonates undergoing operation from 2007. The risk model performs well. Diagnostic information added useful discriminatory power. A future application is risk adjustment during routine monitoring of outcomes in the United Kingdom to assist quality assurance. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
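
    A minimal sketch of the modelling approach described (logistic regression for 30-day mortality with discrimination summarized by the C-index, i.e. the area under the ROC curve), using a 70/30 development/validation split as in the study; the predictors and data below are synthetic placeholders, not Central Cardiac Audit Database variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for case-mix data: the real model used procedure,
# diagnosis, univentricular status, age, weight, comorbidity, bypass and era.
rng = np.random.default_rng(42)
n = 20000
X = np.column_stack([
    rng.integers(0, 3, n),           # age band (0=neonate, 1=infant, 2=child)
    rng.normal(10, 8, n).clip(2),    # weight (kg)
    rng.integers(0, 2, n),           # non-Down comorbidity flag
    rng.integers(0, 2, n),           # bypass flag
])
logit = -4.5 - 0.8 * X[:, 0] - 0.03 * X[:, 1] + 0.9 * X[:, 2] + 0.4 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated 30-day deaths

# 70/30 development/validation split, mirroring the study design
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
risk = model.predict_proba(X_val)[:, 1]
print(f"validation C-index = {roc_auc_score(y_val, risk):.2f}")
```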

  7. Validation gets underway on Sizewell ''Incredibility of Failure'' components

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The Inspection Validation Centre (IVC) of AEA Reactor Services in the UK has begun an eighteen month programme to validate the procedures and personnel of OIS plc, the inspection agents chosen by Nuclear Electric to carry out the pre-service ultrasonic inspection of the Sizewell B Pressurized Water Reactor components assigned to the ''Incredibility of Failure'' (IoF) category. The work involves several Sizewell B primary circuit components - the steam generators, pressurizer, and primary pumps - and will consider the inspections to be applied to the circumferential and nozzle-to-shell welds, nozzle inner radii and the pump fly-wheel forging. The validation will provide independent confirmation that OIS personnel are capable of using manual and automated methods to find and size any flaws of structural concern in these components. (author)

  8. The Closing of the Insolvency Procedure

    Directory of Open Access Journals (Sweden)

    Cornelia Lefter

    2007-12-01

    Full Text Available The achievement of a balance between supply and demand in a market economy means that some merchants win and others lose. Losing in business is a normal risk, usually assumed by any merchant. But when a merchant records losses, the issue is one of engaging his responsibility towards all those who may be damaged by his negative results. Faced with this reality, commercial legislation has tried, by way of the collective procedure, to find the most adequate means of minimizing the negative influence that the losses borne by a merchant may have on his creditors. Accordingly, this paper analyses, under the new law on the collective procedure (Law no. 85/2006), those cases of closing the procedure and their effects which have already raised problems in practice and aroused interesting doctrinal controversies.

  9. Detailed validation in PCDDF analysis. ISO17025 data from Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Kernick Carvalhaes, G.; Azevedo, J.A.; Azevedo, G.; Machado, M.; Brooks, P. [Analytical Solutions, Rio de Janeiro (Brazil)

    2004-09-15

    When we define method validation we can refer to ISO standard 8402: 'validation' is the 'confirmation by examination and the supplying of objective evidence that the particular requirements for a specific intended use are fulfilled'. This concept is extremely important for guaranteeing the quality of results. Method validation is based on the combined use of different validation procedures, but in selecting them we have to analyze the cost-benefit conditions. We must focus on the critical elements, and these critical factors must be the essential elements for providing good properties and results. If we have a solid validation methodology and investigate the sources of uncertainty of our analytical method, we can generate results with confidence and veracity. In analyzing these two considerations, method validation and uncertainty calculation, we found that there are very few articles and papers on these subjects, and it is even more difficult to find such material on dioxins and furans. This short paper describes a validation and uncertainty calculation methodology based on traditional studies with a few adaptations, and it presents a new idea of the recovery study as a source of uncertainty.

  10. The risk of sequelae due to pneumococcal meningitis in high-income countries: a systematic review and meta-analysis.

    Science.gov (United States)

    Jit, Mark

    2010-07-01

    To determine the risk of various kinds of sequelae in survivors of meningitis due to Streptococcus pneumoniae, as well as the influence of co-factors such as study design, study population and treatment on this risk. MEDLINE, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched from 1 September 1991 to 18 June 2009 for original articles on pneumococcal meningitis sequelae. Prevalence of sequelae was pooled using random effects meta-analysis. Studies were appraised for the influence of referral bias, external validity of study populations, testing procedure and publication bias. Data were extracted from 63 studies involving 3408 pneumococcal meningitis survivors. The pooled prevalence of any reported sequelae from 48 studies was 31.7% (95% confidence interval 27.2-36.3%) using a random effects model (Cochran-Q = 277, p < 0.01). Differences in studies due to design, study population and treatment were not significant. The pooled prevalence of hearing loss, seizures, hydrocephalus, spasticity/paresis, cranial nerve palsies and visual impairment was 20.9% (17.1-24.7%), 6.5% (3.3-9.7%), 6.8% (3.3-10.2%), 8.7% (6.4-11.0%), 12.2% (5.3-19.1%) and 2.4% (0-5.7%) respectively. The burden of sequelae due to pneumococcal meningitis remains high in the reviewed studies.
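
    The pooling step can be sketched with DerSimonian-Laird random-effects weighting of study-level proportions, one common way of pooling prevalences; the study counts below are invented for illustration and the published analysis may have pooled on a transformed scale.

```python
import numpy as np

def dersimonian_laird(p: np.ndarray, n: np.ndarray):
    """Random-effects pooled proportion with a 95% CI (DerSimonian-Laird)."""
    v = p * (1 - p) / n                      # within-study variances
    w = 1.0 / v
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)       # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)  # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * p) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, q, tau2

# Invented example: survivors with any sequela / survivors in five studies
events = np.array([30, 12, 45, 8, 60])
sizes = np.array([90, 50, 140, 30, 200])
pooled, lo, hi, q, tau2 = dersimonian_laird(events / sizes, sizes)
print(f"pooled prevalence = {pooled:.1%} (95% CI {lo:.1%}-{hi:.1%}), "
      f"Q = {q:.1f}, tau^2 = {tau2:.4f}")
```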

  11. Procedure for validating the life extension for the WWER-440 internals at NV NPP unit 3

    International Nuclear Information System (INIS)

    Filatov, V.M.; Evropin, S.V.

    2001-01-01

    The design lifetime (30 years) of the first-generation WWER-440 reactor facilities in Russia is nearing its end. One of the major problems is to validate the life extension (LE) of the reactor internals, which ensure the core arrangement and the free passage of the control and protection system components during different operating modes, emergency modes included. The internals of the first-generation units are designed so as to enable their replacement, but replacement requires considerable funds and time. Work has been carried out to demonstrate that the internals may continue to be operated safely without replacement of their components, provided their strength, longevity and serviceability are sufficiently validated. (author)

  12. Due Process in the Realm of Higher Education: Considerations for Dealing with Students Rights

    Science.gov (United States)

    Fishner, Jason T.

    2006-01-01

    Court decisions have laid out expectations of what due process procedures need to be followed in student disciplinary cases and in academic dismissal cases due to poor academic performance. This paper will show where due process comes from and how it found its way into higher education. It will show that there are differences in the ways public…

  13. 42 CFR 476.94 - Notice of QIO initial denial determination and changes as a result of a DRG validation.

    Science.gov (United States)

    2010-10-01

    ... changes as a result of a DRG validation. 476.94 Section 476.94 Public Health CENTERS FOR MEDICARE... changes as a result of a DRG validation. (a) Notice of initial denial determination—(1) Parties to be... retrospective review, (excluding DRG validation and post procedure review), within 3 working days of the initial...

  14. 42 CFR 476.85 - Conclusive effect of QIO initial denial determinations and changes as a result of DRG validations.

    Science.gov (United States)

    2010-10-01

    ... determinations and changes as a result of DRG validations. 476.85 Section 476.85 Public Health CENTERS FOR... denial determinations and changes as a result of DRG validations. A QIO initial denial determination or change as a result of DRG validation is final and binding unless, in accordance with the procedures in...

  15. Development of a tool to support holistic generic assessment of clinical procedure skills.

    Science.gov (United States)

    McKinley, Robert K; Strand, Janice; Gray, Tracey; Schuwirth, Lambert; Alun-Jones, Tom; Miller, Helen

    2008-06-01

    The challenges of maintaining comprehensive banks of valid checklists make context-specific checklists for assessment of clinical procedural skills problematic. This paper reports the development of a tool which supports generic holistic assessment of clinical procedural skills. We carried out a literature review, focus groups and non-participant observation of assessments with interview of participants, participant evaluation of a pilot objective structured clinical examination (OSCE), a national modified Delphi study with prior definitions of consensus and an OSCE. Participants were volunteers from a large acute teaching trust, a teaching primary care trust and a national sample of National Health Service staff. In total, 86 students, trainees and staff took part in the focus groups, observation of assessments and pilot OSCE, 252 in the Delphi study and 46 candidates and 50 assessors in the final OSCE. We developed a prototype tool with 5 broad categories amongst which were distributed 38 component competencies. There was > 70% agreement (our prior definition of consensus) at the first round of the Delphi study for inclusion of all categories and themes and no consensus for inclusion of additional categories or themes. Generalisability was 0.76. An OSCE based on the instrument has a predicted reliability of 0.79 with 12 stations and 1 assessor per station or 10 stations and 2 assessors per station. This clinical procedural skills assessment tool enables reliable assessment and has content and face validity for the assessment of clinical procedural skills. We have designated it the Leicester Clinical Procedure Assessment Tool (LCAT).

  16. Implementing Distributed Algorithms using Remote Procedure Call

    NARCIS (Netherlands)

    Bal, H.E.; van Renesse, R.; Tanenbaum, A.S.

    1987-01-01

    Remote procedure call (RPC) is a simple yet powerful primitive for communication and synchronization between distributed processes. A problem with RPC is that it tends to decrease the amount of parallelism in an application due to its synchronous nature. This paper shows how light-weight processes
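
    The synchronous behaviour described above can be seen in a few lines of Python using the standard library's xmlrpc modules. This is a generic RPC illustration, not the light-weight process mechanism discussed in the paper; the port number and the compute function are arbitrary choices for the example.

      import threading
      from xmlrpc.server import SimpleXMLRPCServer
      from xmlrpc.client import ServerProxy

      # A trivial remote procedure.
      def compute(x, y):
          return x * y + 1

      # Start the RPC server in a background thread so the example is self-contained.
      server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False, allow_none=True)
      server.register_function(compute, "compute")
      threading.Thread(target=server.serve_forever, daemon=True).start()

      # The client call is synchronous: the caller blocks until the reply arrives,
      # which is exactly the loss of parallelism the abstract refers to.
      proxy = ServerProxy("http://localhost:8000/")
      print(proxy.compute(6, 7))   # prints 43 once the remote call returns
      server.shutdown()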

  17. Static validation of licence conformance policies

    DEFF Research Database (Denmark)

    Hansen, Rene Rydhof; Nielson, Flemming; Nielson, Hanne Riis

    2008-01-01

    Policy conformance is a security property gaining importance due to commercial interests such as Digital Rights Management. It is well known that static analysis can be used to validate a number of more classical security policies, such as discretionary and mandatory access control policies, as well as communication protocols using symmetric and asymmetric cryptography. In this work we show how to develop a Flow Logic for validating the conformance of client software with respect to a licence conformance policy. Our approach is sufficiently flexible that it extends to fully open systems that can admit new...

  18. COVERS Neonatal Pain Scale: Development and Validation

    Directory of Open Access Journals (Sweden)

    Ivan L. Hand

    2010-01-01

    Newborns and infants are often exposed to painful procedures during hospitalization. Several different scales have been validated to assess pain in specific populations of pediatric patients, but no single scale can easily and accurately assess pain in all newborns and infants regardless of gestational age and disease state. A new pain scale was developed, the COVERS scale, which incorporates 6 physiological and behavioral measures for scoring. Newborns admitted to the Neonatal Intensive Care Unit or Well Baby Nursery were evaluated for pain/discomfort during two procedures, a heel prick and a diaper change. Pain was assessed using indicators from three previously established scales (CRIES, the Premature Infant Pain Profile, and the Neonatal Infant Pain Scale), as well as the COVERS Scale, depending upon gestational age. Premature infant testing resulted in similar pain assessments using the COVERS and PIPP scales with an r=0.84. For the full-term infants, the COVERS scale and NIPS scale resulted in similar pain assessments with an r=0.95. The COVERS scale is a valid pain scale that can be used in the clinical setting to assess pain in newborns and infants and is universally applicable to all neonates, regardless of their age or physiological state.
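
    For readers unfamiliar with the reported r values, the following minimal sketch computes a Pearson correlation between two sets of paired pain scores with scipy. The scores are hypothetical and are not the study data.

      import numpy as np
      from scipy.stats import pearsonr

      # Hypothetical paired pain scores for the same infants on two scales
      covers = np.array([3, 5, 8, 2, 6, 7, 4, 9, 1, 5])
      pipp   = np.array([4, 6, 9, 2, 5, 8, 4, 10, 2, 6])

      r, p_value = pearsonr(covers, pipp)
      print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")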

  19. A simple but accurate procedure for solving the five-parameter model

    International Nuclear Information System (INIS)

    Mares, Oana; Paulescu, Marius; Badescu, Viorel

    2015-01-01

    Highlights: • A new procedure for extracting the parameters of the one-diode model is proposed. • Only the basic information listed in the datasheet of PV modules is required. • Results demonstrate a simple, robust and accurate procedure. - Abstract: The current–voltage characteristic of a photovoltaic module is typically evaluated by using a model based on the solar cell equivalent circuit. The complexity of the procedure applied for extracting the model parameters depends on the data available in the manufacturer's datasheet. Since the datasheet is often not detailed enough, simplified models have to be used in many cases. This paper proposes a new procedure for extracting the parameters of the one-diode model in standard test conditions, using only the basic data listed by all manufacturers in the datasheet (short circuit current, open circuit voltage and maximum power point). The procedure is validated by using manufacturers' data for six commercial crystalline silicon photovoltaic modules. Comparing the computed and measured current–voltage characteristics, the determination coefficient is in the range 0.976–0.998. Thus, the proposed procedure represents a feasible tool for solving the five-parameter model applied to crystalline silicon photovoltaic modules. The procedure is described in detail, to guide potential users to derive similar models for other types of photovoltaic modules.
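
    To make the five-parameter (one-diode) model concrete, the sketch below solves the implicit current-voltage equation I = Iph - I0*(exp((V + I*Rs)/a) - 1) - (V + I*Rs)/Rsh for a set of voltages. The parameter values are assumed for illustration and are not those extracted in the paper, and the bracketing root finder is just one workable choice.

      import numpy as np
      from scipy.optimize import brentq

      # Assumed (illustrative) parameters of the one-diode model at standard test conditions
      I_ph = 8.21      # photogenerated current [A]
      I_0  = 1.0e-9    # diode saturation current [A]
      a    = 1.6       # modified ideality factor n*Ns*Vt [V]
      R_s  = 0.35      # series resistance [ohm]
      R_sh = 250.0     # shunt resistance [ohm]

      def current(V):
          """Solve the implicit equation I = Iph - I0*(exp((V+I*Rs)/a) - 1) - (V+I*Rs)/Rsh."""
          f = lambda I: I_ph - I_0 * (np.exp((V + I * R_s) / a) - 1.0) - (V + I * R_s) / R_sh - I
          return brentq(f, -1.0, I_ph + 1.0)   # f is monotone in I, so the bracket contains one root

      for V in np.linspace(0.0, 37.0, 8):
          print(f"V = {V:5.1f} V  ->  I = {current(V):6.3f} A")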

  20. Validating Teacher Commitment Scale Using a Malaysian Sample

    Directory of Open Access Journals (Sweden)

    Lei Mee Thien

    2014-05-01

    This study attempts to validate an integrative Teacher Commitment scale using rigorous scale validation procedures. An adapted questionnaire with 17 items was administered to 600 primary school teachers in Penang, Malaysia. Data were analyzed using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) with SPSS 19.0 and AMOS 19.0, respectively. The results support Teacher Commitment as a multidimensional construct with four underlying dimensions: Commitment to Student, Commitment to Teaching, Commitment to School, and Commitment to Profession. The validated 13-item Teacher Commitment scale can be used as an evaluative tool to assess the extent to which teachers are committed to their students’ learning, teaching, school, and profession. The Teacher Commitment scale would also facilitate the identification of factors that influence teachers’ quality of work life and school effectiveness. The practical implications, school cultural influence, and methodological limitations are discussed.

  1. Logistic Costs of Privileged Procedures in the Republic of Croatia

    Directory of Open Access Journals (Sweden)

    Čedomir Ivaković

    2006-07-01

    Logistic processes increasingly condition the rationalization of the time required for the handling of goods (loading, unloading, storage). The customs representation costs that result from time lost because customs procedures are carried out exclusively during the working hours of the customs office also affect the total logistic costs, and they may be significantly reduced by applying privileged procedures in import and export.

  2. Validation and practical implementation of a multidisciplinary cancer distress screening questionnaire

    Energy Technology Data Exchange (ETDEWEB)

    Kirchheiner, K.; Czajka, A.; Komarek, E.; Hohenberg, G.; Poetter, R. [Medical University of Vienna (Austria). Dept. of Radiation Oncology; Ponocny-Seliger, E. [Sigmund Freud Private University, Vienna (Austria). Dept. of Psychology; Doerr, W. [Medical University of Vienna (Austria). Dept. of Radiation Oncology; Medical University of Vienna (Austria). Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology

    2013-07-15

    Background: In order to identify cancer patients with psychosocial needs during radiotherapy, a routine screening questionnaire is widely recommended in the literature. Several tools focusing mainly on psychological issues have been developed during the past decade. However, problems with their implementation into clinical routine have been repeatedly reported, due to a lack of practicability for clinicians and nurses. This study reports the compilation of a multidisciplinary screening questionnaire and an analysis of the effectiveness of its implementation into clinical routine at the Department of Radiotherapy, Medical University of Vienna. Materials and methods: The screening questionnaire is based on a compilation of several subscales from established and validated assessment tools. It focuses on comprehensive information with high clinical relevance for all professions. In a pilot study, patients' acceptance was assessed qualitatively. Analysis of missing screening data in consecutively admitted patients reflects the effectiveness of implementation and the representativeness of the data. A validation analysis of the psychological subscales was performed using external criteria, and their internal consistency was tested with Cronbach's α. Results: Qualitative patient acceptance of the screening questionnaire is good. The overall response rate in the screening procedure was 75 %. Missing patient screening data sets arose randomly - mainly due to organizational problems - and did not result in systematic errors. The psychological subscales identify highly distressed patients with a sensitivity of 89 and 78 %, and an internal consistency of 0.843 and 0.617. Conclusion: The multidisciplinary screening questionnaire compiled in this study has high patient acceptance, provides reliable and representative data and identifies highly distressed patients with excellent sensitivity. Although requiring additional personnel resources, it can be implemented
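
    Since this record reports internal consistency as Cronbach's α, the self-contained sketch below shows one common way to compute α from an item-by-respondent matrix. The responses are invented and the four-item subscale has nothing to do with the actual questionnaire items.

      import numpy as np

      def cronbach_alpha(items):
          """items: 2-D array, rows = respondents, columns = questionnaire items."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
          total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
          return k / (k - 1) * (1.0 - item_var / total_var)

      # Hypothetical responses of 8 patients to a 4-item psychological subscale
      responses = np.array([
          [3, 4, 3, 4],
          [1, 2, 1, 2],
          [4, 4, 5, 4],
          [2, 3, 2, 2],
          [5, 5, 4, 5],
          [2, 2, 3, 2],
          [4, 3, 4, 4],
          [1, 1, 2, 1],
      ])
      print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")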

  3. Validation and practical implementation of a multidisciplinary cancer distress screening questionnaire

    International Nuclear Information System (INIS)

    Kirchheiner, K.; Czajka, A.; Komarek, E.; Hohenberg, G.; Poetter, R.; Ponocny-Seliger, E.; Doerr, W.; Medical University of Vienna

    2013-01-01

    Background: In order to identify cancer patients with psychosocial needs during radiotherapy, a routine screening questionnaire is widely recommended in the literature. Several tools focusing mainly on psychological issues have been developed during the past decade. However, problems with their implementation into clinical routine have been repeatedly reported, due to a lack of practicability for clinicians and nurses. This study reports the compilation of a multidisciplinary screening questionnaire and an analysis of the effectiveness of its implementation into clinical routine at the Department of Radiotherapy, Medical University of Vienna. Materials and methods: The screening questionnaire is based on a compilation of several subscales from established and validated assessment tools. It focuses on comprehensive information with high clinical relevance for all professions. In a pilot study, patients' acceptance was assessed qualitatively. Analysis of missing screening data in consecutively admitted patients reflects the effectiveness of implementation and the representativeness of the data. A validation analysis of the psychological subscales was performed using external criteria, and their internal consistency was tested with Cronbach's α. Results: Qualitative patient acceptance of the screening questionnaire is good. The overall response rate in the screening procedure was 75 %. Missing patient screening data sets arose randomly - mainly due to organizational problems - and did not result in systematic errors. The psychological subscales identify highly distressed patients with a sensitivity of 89 and 78 %, and an internal consistency of 0.843 and 0.617. Conclusion: The multidisciplinary screening questionnaire compiled in this study has high patient acceptance, provides reliable and representative data and identifies highly distressed patients with excellent sensitivity. Although requiring additional personnel resources, it can be implemented successfully in

  4. How to develop a Standard Operating Procedure for sorting unfixed cells

    Science.gov (United States)

    Schmid, Ingrid

    2012-01-01

    Written Standard Operating Procedures (SOPs) are an important tool to assure that recurring tasks in a laboratory are performed in a consistent manner. When the procedure covered in the SOP involves a high-risk activity such as sorting unfixed cells using a jet-in-air sorter, safety elements are critical components of the document. The details on sort sample handling, sorter set-up, validation, operation, troubleshooting, and maintenance, personal protective equipment (PPE), and operator training, outlined in the SOP are to be based on careful risk assessment of the procedure. This review provides background information on the hazards associated with sorting of unfixed cells and the process used to arrive at the appropriate combination of facility design, instrument placement, safety equipment, and practices to be followed. PMID:22381383

  5. How to develop a standard operating procedure for sorting unfixed cells.

    Science.gov (United States)

    Schmid, Ingrid

    2012-07-01

    Written standard operating procedures (SOPs) are an important tool to assure that recurring tasks in a laboratory are performed in a consistent manner. When the procedure covered in the SOP involves a high-risk activity such as sorting unfixed cells using a jet-in-air sorter, safety elements are critical components of the document. The details on sort sample handling, sorter set-up, validation, operation, troubleshooting, and maintenance, personal protective equipment (PPE), and operator training, outlined in the SOP are to be based on careful risk assessment of the procedure. This review provides background information on the hazards associated with sorting of unfixed cells and the process used to arrive at the appropriate combination of facility design, instrument placement, safety equipment, and practices to be followed. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. 25 CFR 42.6 - When does due process require a formal disciplinary hearing?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false When does due process require a formal disciplinary... RIGHTS § 42.6 When does due process require a formal disciplinary hearing? Unless local school policies and procedures provide for less, a formal disciplinary hearing is required before a suspension in...

  7. Design and validation of a CT-guided robotic system for lung cancer brachytherapy.

    Science.gov (United States)

    Dou, Huaisu; Jiang, Shan; Yang, Zhiyong; Sun, Luqing; Ma, Xiaodong; Huo, Bin

    2017-09-01

    Currently, lung brachytherapy in the clinical setting is a complex procedure. Operation accuracy depends on accurate positioning of the template; however, it is difficult to guarantee positioning accuracy manually. Application of robotic-assisted systems can simplify the procedure and improve on manual positioning accuracy. Therefore, a novel CT-guided robotic system was developed to assist lung cancer brachytherapy. A four degree-of-freedom (DOF) robot, controlled by lung brachytherapy treatment planning system (TPS) software, was designed and manufactured to assist template positioning. The target position of the template can be obtained from the treatment plan, and the robot is driven to the target position automatically. The robotic system was validated in both the laboratory and the CT environment. In the laboratory environment, a 3D laser tracker and an inertial measurement unit (IMU) were used to measure the mechanical accuracy in air, which includes positioning accuracy and position repeatability. Working reliability was also validated in this procedure by observing the response reliability and calculating the position repeatability. Imaging artifacts and the accuracy of robot registration were validated in the CT environment by using an artificial phantom with fiducial markers. CT images were obtained and used to assess image artifacts and calculate the registration accuracy. Phantom experiments were conducted to test the accuracy of needle insertion by using a transparent hydrogel phantom with a high-imitation artificial phantom. The efficiency was also validated in this procedure by comparing the time costs of manual positioning with those of robotic positioning under the same experimental conditions. The robotic system achieved a positioning accuracy of 0.28 ± 0.25 mm and a position repeatability of 0.09 ± 0.11 mm. Experimental results showed that the robot was CT-compatible and responded reliably to the control commands. The mean registration accuracy
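
    Registration of fiducial markers between robot and CT coordinates is commonly done with a least-squares rigid transform; the sketch below uses the SVD-based Kabsch/Horn solution on hypothetical marker positions and reports a fiducial registration error. This is a generic illustration, not the registration algorithm of the system described in the record.

      import numpy as np

      def rigid_register(A, B):
          """Least-squares rigid transform (R, t) mapping point set A onto B (Kabsch/Horn)."""
          cA, cB = A.mean(axis=0), B.mean(axis=0)
          H = (A - cA).T @ (B - cB)
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
          R = Vt.T @ D @ U.T
          t = cB - R @ cA
          return R, t

      # Hypothetical fiducial positions in robot coordinates (A) and CT coordinates (B)
      A = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50], [30, 30, 10]], float)
      rng = np.random.default_rng(0)
      R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
      B = A @ R_true.T + np.array([10.0, -5.0, 20.0]) + rng.normal(0, 0.2, A.shape)

      R, t = rigid_register(A, B)
      fre = np.sqrt(np.mean(np.sum((A @ R.T + t - B) ** 2, axis=1)))
      print(f"fiducial registration error = {fre:.2f} mm")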

  8. CANDU radiotoxicity inventories estimation: A calculated experiment cross-check for data verification and validation

    International Nuclear Information System (INIS)

    Pavelescu, Alexandru Octavian; Cepraga, Dan Gabriel

    2007-01-01

    This paper is related to the Clearance Potential Index and the Ingestion and Inhalation Hazard Factors of nuclear spent fuel and radioactive wastes. This study required a complex activity that consisted of various phases such as: the acquisition, setting up, validation and application of procedures, codes and libraries. The paper reflects the validation phase of this study. Its objective was to compare the measured inventories of selected actinide and fission product radionuclides in an element from a Pickering CANDU reactor with inventories predicted using a recent version of ORIGEN-ARP from SCALE 5 coupled with the time-dependent cross-section library, CANDU 28.lib, produced by the sequence SAS2H of SCALE 4.4a. In this way, the procedures, codes and libraries for the characterization of radioactive material in terms of radioactive inventories, clearance, and biological hazard factors are being qualified and validated, in support of the safety management of radioactive wastes. (authors)

  9. Using Cell Phone Technology for Self-Monitoring Procedures in Inclusive Settings

    Science.gov (United States)

    Bedesem, Pena L.

    2012-01-01

    The purpose of this study was to determine the effects and social validity of an innovative method of self-monitoring for middle school students with high-incidence disabilities in inclusive settings. An updated self-monitoring procedure, called CellF-Monitoring, utilized a cell phone as an all-inclusive self-monitoring device. The study took…

  10. Image-guidance for surgical procedures

    International Nuclear Information System (INIS)

    Peters, Terry M

    2006-01-01

    Contemporary imaging modalities can now provide the surgeon with high quality three- and four-dimensional images depicting not only normal anatomy and pathology, but also vascularity and function. A key component of image-guided surgery (IGS) is the ability to register multi-modal pre-operative images to each other and to the patient. The other important component of IGS is the ability to track instruments in real time during the procedure and to display them as part of a realistic model of the operative volume. Stereoscopic, virtual- and augmented-reality techniques have been implemented to enhance the visualization and guidance process. For the most part, IGS relies on the assumption that the pre-operatively acquired images used to guide the surgery accurately represent the morphology of the tissue during the procedure. This assumption may not necessarily be valid, and so intra-operative real-time imaging using interventional MRI, ultrasound, video and electrophysiological recordings are often employed to ameliorate this situation. Although IGS is now in extensive routine clinical use in neurosurgery and is gaining ground in other surgical disciplines, there remain many drawbacks that must be overcome before it can be employed in more general minimally-invasive procedures. This review overviews the roots of IGS in neurosurgery, provides examples of its use outside the brain, discusses the infrastructure required for successful implementation of IGS approaches and outlines the challenges that must be overcome for IGS to advance further. (topical review)

  11. The operators' non-compliance behavior to conduct emergency operating procedures--comparing with the work experience and the complexity of procedural steps

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea

    2003-01-01

    Many kinds of procedures have been used to reduce operators' workload across various industries, such as aviation, the chemical industry and the nuclear industry. It is remarkable, however, that a significant portion of accidents or incidents has been caused by procedure-related human error due to non-compliance with procedures. In this study, to investigate the operators' non-compliance behavior, emergency-training records were collected using a full scope simulator. Three types of operator behavior (strict adherence, skipping redundant actions and modifying action sequences) observed in the collected emergency training records were compared with both the operators' work experience and the complexity of the procedural steps. As a result, three notable relationships were obtained: (1) operators with an intermediate level of work experience seem to adopt non-compliance behavior frequently when conducting procedural steps, (2) operators seem to adopt non-compliance behavior frequently when conducting procedural steps of intermediate procedural complexity, and (3) senior reactor operators seem to adapt their non-compliance behavior to the complexity of the procedural steps. It is therefore expected that these relationships can be used as meaningful clues not only to scrutinize the reasons for non-compliance behavior but also to suggest appropriate remedies for reducing non-compliance behavior that can result in procedure-related human error

  12. Towards Validating Risk Indicators Based on Measurement Theory (Extended version)

    NARCIS (Netherlands)

    Morali, A.; Wieringa, Roelf J.

    Due to the lack of quantitative information and for cost-efficiency, most risk assessment methods use partially ordered values (e.g. high, medium, low) as risk indicators. In practice it is common to validate risk indicators by asking stakeholders whether they make sense. This way of validation is

  13. Evaluation of the Validity and Reliability of the Waterlow Pressure Ulcer Risk Assessment Scale.

    Science.gov (United States)

    Charalambous, Charalambos; Koulori, Agoritsa; Vasilopoulos, Aristidis; Roupa, Zoe

    2018-04-01

    Prevention is the ideal strategy to tackle the problem of pressure ulcers. Pressure ulcer risk assessment scales are among the most pivotal measures applied to tackle the problem, but much criticism has been raised regarding their validity and reliability. To investigate the validity and reliability of the Waterlow pressure ulcer risk assessment scale. The methodology used is a narrative literature review; the bibliography was reviewed through Cinahl, Pubmed, EBSCO, Medline and Google Scholar, and 26 scientific articles were identified. The articles were chosen due to their direct correlation with the objective under study and their scientific relevance. The construct and face validity of the Waterlow appear adequate, but with regard to content validity, changes in the age and gender category could be beneficial. The concurrent validity cannot be assessed. The predictive validity of the Waterlow is characterized by high specificity and low sensitivity. The inter-rater reliability has been shown to be inadequate; this may be due to a lack of clear definitions within the categories and differing levels of knowledge among users. Due to the limitations presented regarding the validity and reliability of the Waterlow pressure ulcer risk assessment scale, the scale should be used in conjunction with clinical assessment to provide optimum results.
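
    Sensitivity and specificity, the quantities discussed above for predictive validity, can be read off a 2x2 table as in the sketch below. The counts are hypothetical and merely chosen to mimic the 'high specificity, low sensitivity' pattern reported; they are not data from the reviewed studies.

      # Hypothetical 2x2 table: scale flagged "at risk" vs. pressure ulcer outcome
      tp, fn = 10, 12      # ulcer developed: flagged / missed
      fp, tn = 15, 163     # no ulcer:        flagged / correctly cleared

      sensitivity = tp / (tp + fn)          # proportion of true cases detected
      specificity = tn / (tn + fp)          # proportion of non-cases correctly cleared
      ppv = tp / (tp + fp)                  # positive predictive value
      npv = tn / (tn + fn)                  # negative predictive value

      print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
      print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")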

  14. Evaluation of the Validity and Reliability of the Waterlow Pressure Ulcer Risk Assessment Scale

    Science.gov (United States)

    Charalambous, Charalambos; Koulori, Agoritsa; Vasilopoulos, Aristidis; Roupa, Zoe

    2018-01-01

    Introduction: Prevention is the ideal strategy to tackle the problem of pressure ulcers. Pressure ulcer risk assessment scales are among the most pivotal measures applied to tackle the problem, but much criticism has been raised regarding their validity and reliability. Objective: To investigate the validity and reliability of the Waterlow pressure ulcer risk assessment scale. Method: The methodology used is a narrative literature review; the bibliography was reviewed through Cinahl, Pubmed, EBSCO, Medline and Google Scholar, and 26 scientific articles were identified. The articles were chosen due to their direct correlation with the objective under study and their scientific relevance. Results: The construct and face validity of the Waterlow appear adequate, but with regard to content validity, changes in the age and gender category could be beneficial. The concurrent validity cannot be assessed. The predictive validity of the Waterlow is characterized by high specificity and low sensitivity. The inter-rater reliability has been shown to be inadequate; this may be due to a lack of clear definitions within the categories and differing levels of knowledge among users. Conclusion: Due to the limitations presented regarding the validity and reliability of the Waterlow pressure ulcer risk assessment scale, the scale should be used in conjunction with clinical assessment to provide optimum results. PMID:29736104

  15. Validation Test Results for Orthogonal Probe Eddy Current Thruster Inspection System

    Science.gov (United States)

    Wincheski, Russell A.

    2007-01-01

    Recent nondestructive evaluation efforts within NASA have focused on an inspection system for the detection of intergranular cracking originating in the relief radius of Primary Reaction Control System (PRCS) thrusters. Of particular concern is deep cracking in this area, which could lead to combustion leakage in the event of through-wall cracking from the relief radius into an acoustic cavity of the combustion chamber. In order to reliably detect such defects while ensuring minimal false positives during inspection, the Orthogonal Probe Eddy Current (OPEC) system has been developed and an extensive validation study performed. This report describes the validation procedure, sample set, and inspection results, as well as comparing validation flaws with the response from naturally occurring damage.

  16. Non-destructive measurements of nuclear wastes. Validation and industrial operating experience

    International Nuclear Information System (INIS)

    Saas, A.; Tchemitciieff, E.

    1993-01-01

    After a short survey of the means employed for the non-destructive measurement of specific activities (γ and X-ray) in waste packages and raw waste, the performance of the device and the ANDRA requirements are presented. The validation of the γ and X-ray measurements on packages is obtained by determining, by destructive means, the same activity on coring samples. The same procedure is used for validating the homogeneity measurements on packages (either homogeneous or heterogeneous). Different operating experiences are then presented for several kinds of packages and waste. Up to now, about twenty different types of packages have been examined and more than 200 packages have allowed the calibration, validation, and control

  17. Development and validation of a gas chromatography/mass spectrometry procedure for confirmation of para-toluenesulfonamide in edible fish fillet tissue.

    Science.gov (United States)

    Idowu, Olutosin R; Kijak, Philip J; Meinertz, Jeffery R; Schmidt, Larry J

    2004-01-01

    Chloramine-T is a disinfectant being developed as a treatment for bacterial gill disease in cultured fish. As part of the drug approval process, a method is required for the confirmation of chloramine-T residues in edible fish tissue. The marker residue that will be used to determine the depletion of chloramine-T residues from the edible tissue of treated fish is para-toluenesulfonamide (p-TSA), a metabolite of chloramine-T. The development and validation of a procedure for the confirmation of p-TSA are described. Homogenized fish tissue is dried by mixing with anhydrous sodium sulfate, and the mixture is extracted with methylene chloride. The extract is passed through a silica gel solid-phase extraction column, from which p-TSA is subsequently eluted with acetonitrile. The acetonitrile extract is evaporated, and the oily residue is dissolved in hexane. The hexane solution is shaken with fresh acetonitrile. The acetonitrile solution is evaporated and the residue is redissolved in dilute potassium hydroxide solution. The aqueous solution is extracted with methylene chloride to further remove more of the fat co-extractive. The aqueous solution is reacted with pentafluorobenzyl bromide in the presence of tetrabutylammonium hydrogensulfate. The resulting di-(pentafluorobenzyl) derivative of p-TSA is analyzed by gas chromatography/mass spectrometry. This method permits the confirmation of p-TSA in edible fish tissue at 20 ppb.

  18. False Negatives, Canter's Background Interference Procedure, the Trail Making Test, and Epileptics.

    Science.gov (United States)

    McKinzey, Ronald K.; And Others

    1985-01-01

    Results of correlation studies of 141 adult epileptics' scores on the Background Interference Procedure (BIP) indicated that the BIP often does not agree with abnormal neurological diagnoses but often does agree with psychiatric diagnoses of Organic Brain Syndrome (OBS). Suggests that future BIP validity studies include a behavioral measure of OBS…

  19. Validating the Changes to Self-identity After Total Laryngectomy.

    Science.gov (United States)

    Bickford, Jane; Coveney, John; Baker, Janet; Hersh, Deborah

    2018-05-25

    A total laryngectomy often prolongs life but results in long-term disablement, disfigurement, and complex care needs. Current clinical practice addresses the surgical options, procedures, and immediate recovery. Less support is available longer-term despite significant changes to aspects of personhood and ongoing medical needs. The aim of this study was to explore the experience of living with and/or supporting individuals with a laryngectomy at least 1 year after surgery. Constructivist grounded theory methods and symbolic interactionism were used to guide collection and analysis of interview data from 28 participants (12 individuals with a laryngectomy, 9 primary supporters, and 7 health professionals). The phenomenon of "validating the altered self after total laryngectomy" highlighted how individuals, postlaryngectomy, navigate and negotiate interactions due to the disruption of their self-expression, related competencies, and roles. Several reframing patterns representing validation of the self emerged from the narratives. They were as follows: destabilized, resigned, resolute, and transformed. The data describe the influence of the processes of developing competence and building resilience, combined with contextual factors, for example, timing and turning points; being supported; and personal factors on these reframing patterns. The findings further our understanding of the long-term subjective experience of identity change after laryngectomy and call attention to the persisting need for psychosocial support. This research provides important evidence for evaluating and strengthening the continuum of services (specialist to community) and supporting social participation, regardless of communication method, and for competency training for all involved to optimize person-centered practices.

  20. Adaptation and validation of the patient assessment of chronic illness care in the French context.

    Science.gov (United States)

    Krucien, Nicolas; Le Vaillant, Marc; Pelletier-Fleury, Nathalie

    2014-06-19

    Chronic diseases are major causes of disability worldwide, with rising prevalence. Patients suffering from chronic conditions do not always receive optimal care. The Chronic Care Model (CCM) has been developed to help general practitioners make quality improvements. The Patient Assessment of Chronic Illness Care (PACIC) questionnaire is increasingly used in several countries to appraise the implementation of the CCM from the patients' perspective. The objective of this study was to adapt the PACIC questionnaire to the French context and to test the validity of this adaptation in a sample of patients with multiple chronic conditions. The PACIC was translated into French using a forward/backward procedure. The French version was validated using a sample of 150 patients treated for obstructive sleep apnea syndrome (OSAS) and having multiple chronic co-morbidities. Several forms of validity were analysed: content, face, construct, and internal consistency. The construct validity was investigated with an exploratory factor analysis. The French version of the PACIC consisted of 18 items, after merging two pairs of items due to redundancy. The high number of items exhibiting floor/ceiling effects and the non-normality of the ratings suggested that a 5-point rating scale was somewhat inappropriate for assessing the patients' experience of care. The construct validity of the French PACIC was verified and resulted in a bi-dimensional structure. Overall, this structure showed a high level of internal consistency. The PACIC score appeared to be significantly related to the age and self-reported health of the patients. A French version of the PACIC questionnaire is now available to evaluate patients' experience of care and to monitor the quality improvements realised by medical structures. This study also pointed out some methodological issues with the PACIC questionnaire, related to the format of the rating scale and to the structure of the

  1. Validation and evaluation of epistemic uncertainty in rainfall thresholds for regional scale landslide forecasting

    Science.gov (United States)

    Gariano, Stefano Luigi; Brunetti, Maria Teresa; Iovine, Giulio; Melillo, Massimo; Peruccacci, Silvia; Terranova, Oreste Giuseppe; Vennari, Carmela; Guzzetti, Fausto

    2015-04-01

    Prediction of rainfall-induced landslides can rely on empirical rainfall thresholds. These are obtained from the analysis of past rainfall events that have (or have not) resulted in slope failures. Accurate prediction requires reliable thresholds, which need to be validated before their use in operational landslide warning systems. Despite the clear relevance of validation, only a few studies have addressed the problem, and have proposed and tested robust validation procedures. We propose a validation procedure that allows for the definition of optimal thresholds for early warning purposes. The validation is based on contingency table, skill scores, and receiver operating characteristic (ROC) analysis. To establish the optimal threshold, which maximizes the correct landslide predictions and minimizes the incorrect predictions, we propose an index that results from the linear combination of three weighted skill scores. Selection of the optimal threshold depends on the scope and the operational characteristics of the early warning system. The choice is made by selecting the weights appropriately, and by searching for the optimal (maximum) value of the index. We discuss weaknesses in the validation procedure caused by the inherent lack of information (epistemic uncertainty) on landslide occurrence typical of large study areas. When working at the regional scale, landslides may have occurred and may have not been reported. This results in biases and variations in the contingencies and the skill scores. We introduce two parameters to represent the unknown proportion of rainfall events (above and below the threshold) for which landslides occurred and went unreported. We show that even a very small underestimation in the number of landslides can result in a significant decrease in the performance of a threshold measured by the skill scores. We show that the variations in the skill scores are different for different uncertainty of events above or below the threshold. This
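
    A minimal sketch of the kind of contingency-table skill scores and weighted index described above is given below. The particular scores (POD, POFA, POFD, Hanssen-Kuipers), the weights and the counts are illustrative assumptions, not the exact definitions and data used by the authors.

      import numpy as np

      # Hypothetical contingency table for one candidate rainfall threshold
      TP, FN = 42, 18      # rainfall events with landslides: above / below threshold
      FP, TN = 110, 830    # rainfall events without landslides: above / below threshold

      POD = TP / (TP + FN)          # probability of detection (hit rate)
      POFA = FP / (TP + FP)         # probability of false alarms
      POFD = FP / (FP + TN)         # probability of false detection
      HK = POD - POFD               # Hanssen-Kuipers skill score (true skill statistic)

      # Weighted linear combination of skill scores used to rank candidate thresholds;
      # the weights are illustrative and would be tuned to the warning system's purpose.
      w = np.array([0.5, 0.3, 0.2])
      index = w @ np.array([POD, 1.0 - POFA, HK])

      print(f"POD = {POD:.2f}, POFA = {POFA:.2f}, HK = {HK:.2f}, index = {index:.2f}")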

  2. Analisys of Book i of the New Code of Civil Procedure: A Reflex of the Phenomenon of Constitucionalization of Civil Procedure

    Directory of Open Access Journals (Sweden)

    Yvete Flavio da Costa

    2016-10-01

    This article aims to analyze Book I of the New Code of Civil Procedure, entitled "The civil procedural rules", linking it to the process of constitutionalization that civil procedure has been undergoing since the enactment of the 1988 Federal Constitution. Since its enactment, the democratic Constitution has become the maximum vector of the entire legal system, so that all laws must comply with its principles and rules, under penalty of being considered unconstitutional. The previous Code, enacted in 1973, before the current Constitution, had no such concern because, at that time, the maximum vector was the Civil Code of 1916. That text therefore lacked some of the fundamental guarantees for the valid and regular development of civil procedure, such as the adversarial principle. With that in mind, the legislator included in this book a kind of introductory law to the civil procedure rules, regulating the application of procedural law in time and space, and also stating constitutional principles that were not explicitly present in the previous codified text. The present article is justified by the need to carry out a deeper study of the constitutionalization of civil procedure in light of the current context. To enable this thematic deepening, the logical-deductive and inductive methods were employed, since the research was based on deductive analysis of the new legislation.

  3. Recent changes in French flaw evaluation procedures: RSE-M

    International Nuclear Information System (INIS)

    Faidy, C.

    2001-01-01

    After a general presentation of the RSE-M, the French Code which describes the rules for in-service inspection of nuclear power plant components, this paper will be focused on the major new developments of the flaw evaluation procedure: critical crack size evaluation, material properties, safety factors and the major validation tasks done to support the RSE-M, edition 2000. The paper will conclude on on-going development in this area. (author)

  4. Symptom-based emergency operating procedures development for Ignalina NPP

    International Nuclear Information System (INIS)

    Kruglov, Y.

    1999-01-01

    This paper and lecture present: (1) Introduction; (2) EOP project work stages and documentation; (3) Selection and justification of accident management strategy; (4) Content of EOP package; (5) Development of EOP package; (6) EOP package verification; (7) EOP package validation; (8) EOP training; (9) EOP implementation; (10) Conditions of symptom-based emergency operating procedures package application and its interconnection with event-based emergency operating procedures; (11) Rules of EOP application; EOP maintenance

  5. Recent changes in French flaw evaluation procedures: RSE-M

    Energy Technology Data Exchange (ETDEWEB)

    Faidy, C. [Electricite de France (EDF-SEPTEN), 69 - Villeurbanne (France)

    2001-07-01

    After a general presentation of the RSE-M, the French Code which describes the rules for in-service inspection of nuclear power plant components, this paper will be focused on the major new developments of the flaw evaluation procedure: critical crack size evaluation, material properties, safety factors and the major validation tasks done to support the RSE-M, edition 2000. The paper will conclude on on-going development in this area. (author)

  6. Development and validation of trauma surgical skills metrics: Preliminary assessment of performance after training.

    Science.gov (United States)

    Shackelford, Stacy; Garofalo, Evan; Shalin, Valerie; Pugh, Kristy; Chen, Hegang; Pasley, Jason; Sarani, Babak; Henry, Sharon; Bowyer, Mark; Mackenzie, Colin F

    2015-07-01

    Maintaining trauma-specific surgical skills is an ongoing challenge for surgical training programs. An objective assessment of surgical skills is needed. We hypothesized that a validated surgical performance assessment tool could detect differences following a training intervention. We developed surgical performance assessment metrics based on discussion with expert trauma surgeons, video review of 10 experts and 10 novice surgeons performing three vascular exposure procedures and lower extremity fasciotomy on cadavers, and validated the metrics with interrater reliability testing by five reviewers blinded to level of expertise and a consensus conference. We tested these performance metrics in 12 surgical residents (Year 3-7) before and 2 weeks after vascular exposure skills training in the Advanced Surgical Skills for Exposure in Trauma (ASSET) course. Performance was assessed in three areas as follows: knowledge (anatomic, management), procedure steps, and technical skills. Time to completion of procedures was recorded, and these metrics were combined into a single performance score, the Trauma Readiness Index (TRI). Wilcoxon matched-pairs signed-ranks test compared pretraining/posttraining effects. Mean time to complete procedures decreased by 4.3 minutes (from 13.4 minutes to 9.1 minutes). The performance component most improved by the 1-day skills training was procedure steps, completion of which increased by 21%. Technical skill scores improved by 12%. Overall knowledge improved by 3%, with 18% improvement in anatomic knowledge. TRI increased significantly from 50% to 64% with ASSET training. Interrater reliability of the surgical performance assessment metrics was validated with single intraclass correlation coefficient of 0.7 to 0.98. A trauma-relevant surgical performance assessment detected improvements in specific procedure steps and anatomic knowledge taught during a 1-day course, quantified by the TRI. ASSET training reduced time to complete vascular
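
    A paired pre/post comparison such as the one reported above (Wilcoxon matched-pairs signed-ranks test) can be sketched as follows; the Trauma Readiness Index values are invented for the example and are not the study data.

      import numpy as np
      from scipy.stats import wilcoxon

      # Hypothetical Trauma Readiness Index scores (%) for 12 residents before and after training
      pre  = np.array([48, 52, 55, 44, 50, 47, 53, 49, 51, 46, 54, 50])
      post = np.array([62, 66, 70, 58, 65, 60, 68, 63, 64, 59, 69, 66])

      stat, p_value = wilcoxon(pre, post)    # paired, non-parametric comparison
      print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.4f}")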

  7. Development and empirical validation of symmetric component measures of multi-dimensional constructs

    DEFF Research Database (Denmark)

    Sørensen, Hans Eibe; Slater, Stanley F.

    2008-01-01

    Atheoretical measure purification may lead to construct deficient measures. The purpose of this paper is to provide a theoretically driven procedure for the development and empirical validation of symmetric component measures of multi-dimensional constructs. We place particular emphasis on establ...

  8. Parent-Implemented Procedural Modification of Escape Extinction in the Treatment of Food Selectivity in a Young Child with Autism

    Science.gov (United States)

    Tarbox, Jonathan; Schiff, Averil; Najdowski, Adel C.

    2010-01-01

    Food selectivity is characterized by the consumption of an inadequate variety of foods. The effectiveness of behavioral treatment procedures, particularly nonremoval of the spoon, is well validated by research. The role of parents in the treatment of feeding disorders and the feasibility of behavioral procedures for parent implementation in the…

  9. Maternal Near-Miss Due to Unsafe Abortion and Associated Short ...

    African Journals Online (AJOL)

    AJRH Managing Editor

    of Obstetrics and Gynaecology, Olabisi Onabanjo University Teaching Hospital, Sagamu, ... induced abortion procedures. ... cases of MNM were identified through a ... system was put in place in each hospital to help ... that were due to unsafe abortion we used a variant ... based on 13 cases that had provider's information.

  10. Emergency Kausch-Whipple procedure: indications and experiences.

    Science.gov (United States)

    Standop, Jens; Glowka, Tim; Schmitz, Volker; Schaefer, Nico; Hirner, Andreas; Kalff, Jörg C

    2010-03-01

    Pancreaticoduodenectomy is a demanding procedure even in selected patients but becomes formidable when performed in cases of emergency. This study analyzed our experience with urgent pancreatoduodenectomies; special emphasis was placed on the evaluation of diagnostic means and the validation of existing indications for performing this procedure. Three hundred one patients who underwent pancreatoduodenectomy between 1989 and 2008 were identified from a pancreatic resection database and reviewed for emergency indications. Six patients (2%) underwent emergency pancreatoduodenectomy. Indications included endoscopy-related perforation, postoperative complications, and uncontrollable intraduodenal tumor bleeding. Length of stay and the occurrence of nonsurgical complications were increased in emergency compared with elective pancreatoduodenectomies. Although also increased, mortality and surgery-related complications showed no significant differences. Indications for emergency pancreatoduodenectomies were based on clinical decisions rather than on radiologic diagnostics. Urgent pancreatic head resections may be considered as an option in selected patients if handling of local complications by interventional measures or limited surgery seems unsafe.

  11. 3. report of the Management Advisory Committee of the Inspection Validation Centre

    International Nuclear Information System (INIS)

    1986-07-01

    The Inspection Validation Centre (IVC) has been established at the UKAEA Risley Nuclear Power Development Laboratories for the purpose of validating procedures, equipment and personnel proposed by the CEGB for use in the ultrasonic inspection at different stages of the fabrication, erection and operation of the CEGB's proposed PWR pressure vessel and such other components as are identified by the CEGB. Technical progress since May 1985 is reported. The number of operators receiving certificates for detection and sizing of flaws is given. (author)

  12. Development and Validation of a Mobile Device-based External Ventricular Drain Simulator.

    Science.gov (United States)

    Morone, Peter J; Bekelis, Kimon; Root, Brandon K; Singer, Robert J

    2017-10-01

    Multiple external ventricular drain (EVD) simulators have been created, yet their cost, bulky size, and nonreusable components limit their accessibility to residency programs. To create and validate an animated EVD simulator that is accessible on a mobile device. We developed a mobile-based EVD simulator that is compatible with iOS (Apple Inc., Cupertino, California) and Android-based devices (Google, Mountain View, California) and can be downloaded from the Apple App and Google Play Store. Our simulator consists of a learn mode, which teaches users the procedure, and a test mode, which assesses users' procedural knowledge. Twenty-eight participants, who were divided into expert and novice categories, completed the simulator in test mode and answered a postmodule survey. This was graded using a 5-point Likert scale, with 5 representing the highest score. Using the survey results, we assessed the module's face and content validity, whereas construct validity was evaluated by comparing the expert and novice test scores. Participants rated individual survey questions pertaining to face and content validity at a median score of 4 out of 5. When comparing test scores generated by the participants completing the test mode, the experts scored higher than the novices (mean, 71.5; 95% confidence interval, 69.2 to 73.8 vs mean, 48; 95% confidence interval, 44.2 to 51.6). We developed a mobile-based EVD simulator that is inexpensive, reusable, and accessible. Our results demonstrate that this simulator is face, content, and construct valid. Copyright © 2017 by the Congress of Neurological Surgeons
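
    Construct validity checks of this kind compare scores between expert and novice groups. The sketch below does so with an independent-samples t test on hypothetical scores, which may differ from the exact test used in the study; the group sizes and values are invented.

      import numpy as np
      from scipy.stats import ttest_ind

      # Hypothetical simulator test scores for independent expert and novice groups
      experts = np.array([70, 73, 72, 69, 74, 71, 75, 68])
      novices = np.array([46, 50, 44, 52, 47, 49, 51, 45])

      t_stat, p_value = ttest_ind(experts, novices)
      print(f"expert mean = {experts.mean():.1f}, novice mean = {novices.mean():.1f}")
      print(f"t = {t_stat:.2f}, p = {p_value:.4f}")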

  13. Computerized Italian criticality guide, description and validation

    International Nuclear Information System (INIS)

    Carotenuto, M.; Landeyro, P.A.

    1988-10-01

    Our group is developing an 'expert system' for collecting engineering know-how on back-end nuclear plant design; an expert system is the most suitable software tool for this problem. During the analysis, the design process was divided into different branches, and the expert system associates a computerized design procedure with each branch of the design process. Each design procedure is composed of a set of design methods, together with their conditions of application and reliability limits. In the framework of this expert system, the nuclear criticality safety analysis procedure was developed in the form of a computerized criticality guide, attempting to reproduce the designer's normal 'reasoning' process. The criticality guide is composed of two parts: a computerized text, including theory, a description of accidents that occurred in the past and a description of the Italian design experience; and an interactive computer-aided calculation module, containing a graphical facility for critical parameter curves. This report presents the criticality guide (the computerized Italian Criticality Guide) and its validation test. (author)

  14. Computerized Italian criticality guide, description and validation

    Energy Technology Data Exchange (ETDEWEB)

    Carotenuto, M; Landeyro, P A [ENEA - Dipartimento Ciclo del Combustibile, Centro Ricerche Energia, Casaccia (Italy)

    1988-10-15

    Our group is developing an 'expert system' for collecting engineering know-how on back-end nuclear plant design; an expert system is the most suitable software tool for this problem. During the analysis, the design process was divided into different branches, and the expert system associates a computerized design procedure with each branch of the design process. Each design procedure is composed of a set of design methods, together with their conditions of application and reliability limits. In the framework of this expert system, the nuclear criticality safety analysis procedure was developed in the form of a computerized criticality guide, attempting to reproduce the designer's normal 'reasoning' process. The criticality guide is composed of two parts: a computerized text, including theory, a description of accidents that occurred in the past and a description of the Italian design experience; and an interactive computer-aided calculation module, containing a graphical facility for critical parameter curves. This report presents the criticality guide (the computerized Italian Criticality Guide) and its validation test. (author)

  15. New non-cognitive procedures for medical applicant selection: a qualitative analysis in one school.

    Science.gov (United States)

    Katz, Sara; Vinker, Shlomo

    2014-11-07

    Recent data have called into question the reliability and predictive validity of standard admission procedures to medical schools. Eliciting non-cognitive attributes of medical school applicants using qualitative tools and methods has thus become a major challenge. 299 applicants aged 18-25 formed the research group. A set of six research tools was developed in addition to the two existing ones. These included: a portfolio task, an intuitive task, a cognitive task, a personal task, an open self-efficacy questionnaire and field notes. The criteria-based methodology design used constant comparative analysis and grounded theory techniques to produce a personal attributes profile per participant, scored on a 5-point holistic rubric. Qualitative validity of data gathering was checked by comparing the profiles elicited from the existing interview against the profiles elicited from the other tools, and by comparing the two profiles of each applicant who handed in two portfolio tasks. Qualitative validity of data analysis was checked by comparing researcher results with those of an external rater (n = 10). Differences between aggregated profile groups were checked with the non-parametric Wilcoxon Signed Ranks Test and the Spearman Rank Order Correlation Test. All subjects gave written informed consent to their participation. Privacy was protected by using code numbers. A concept map of 12 personal attributes emerged, the core constructs of which were motivation, sociability and cognition. A personal profile was elicited. Inter-rater agreement was 83.3%. Differences between groups by aggregated profiles were found significant (p < .05, p < .01, p < .001). A random sample of sixth-year students (n = 12) underwent the same admission procedure as the research group. The rank order was different, and arrogance was a new construct elicited in the sixth-year group. This study suggests a broadening of the methodology for selecting medical school applicants. This methodology

  16. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety

  17. Rates and risk factors of unplanned 30-day readmission following general and thoracic pediatric surgical procedures.

    Science.gov (United States)

    Polites, Stephanie F; Potter, Donald D; Glasgow, Amy E; Klinkner, Denise B; Moir, Christopher R; Ishitani, Michael B; Habermann, Elizabeth B

    2017-08-01

    Postoperative unplanned readmissions are costly and decrease patient satisfaction; however, little is known about this complication in pediatric surgery. The purpose of this study was to determine rates and predictors of unplanned readmission in a multi-institutional cohort of pediatric surgical patients. Unplanned 30-day readmissions following general and thoracic surgical procedures in children were identified, and rates of readmission per 30 person-days were determined to account for varied postoperative length of stay (pLOS). Patients were randomly divided into 70% derivation and 30% validation cohorts, which were used for creation and validation of a risk model for readmission. Readmission occurred in 1948 (3.6%) of 54,870 children, for a rate of 4.3% per 30 person-days. Adjusted predictors of readmission included hepatobiliary procedures, increased wound class, operative duration, complications, and pLOS. The predictive model discriminated well in the derivation and validation cohorts (AUROC 0.710 and 0.701) with good calibration between observed and expected readmission events in both cohorts (p>.05). Unplanned readmission occurs less frequently in pediatric surgery than what is described in adults, calling into question its use as a quality indicator in this population. Factors that predict readmission, including type of procedure, complications, and pLOS, can be used to identify at-risk children and develop prevention strategies. III. Copyright © 2017 Elsevier Inc. All rights reserved.
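
    The derivation/validation split and AUROC evaluation described above can be sketched with scikit-learn on synthetic data, as below. The predictors, coefficients and sample size are fabricated stand-ins, not the study data; only the 70/30 split and the AUROC metric mirror the record.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      # Synthetic stand-in data: predictors such as operative duration, wound class,
      # complications, and postoperative length of stay; outcome = unplanned readmission.
      rng = np.random.default_rng(42)
      n = 5000
      X = np.column_stack([
          rng.normal(90, 30, n),     # operative duration (min)
          rng.integers(1, 5, n),     # wound class (1-4)
          rng.binomial(1, 0.1, n),   # any complication
          rng.exponential(3, n),     # postoperative length of stay (days)
      ])
      logit = -4.0 + 0.01 * X[:, 0] + 0.3 * X[:, 1] + 1.2 * X[:, 2] + 0.1 * X[:, 3]
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      # 70% derivation / 30% validation split, mirroring the study design
      X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

      auc_dev = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
      auc_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
      print(f"AUROC derivation = {auc_dev:.3f}, validation = {auc_val:.3f}")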

  18. A Computerized Procedure linked to Virtual Equipment

    International Nuclear Information System (INIS)

    Jung, Yeon Sub; Song, Tae Young

    2011-01-01

    Digital, information, and communication technologies have changed human behavior, because humans have a limited capacity to memorize and process information and must access additional, real-time information to make important decisions. These technologies therefore play important roles, and nuclear power plants are no exception. Many accidents in nuclear power plants result from absent or incorrect information. The information nuclear personnel need is context sensitive, but they do not have enough time to verify it; skipping this verification can result in incidents. Nuclear personnel usually carry static paper procedures during local task performance. The procedure guides them through the steps to follow, but the paper contains no dynamic, context-sensitive information. The effect of the work is evaluated once, when permission for the work is granted, and afterward the workers are not kept informed. The static paper is generally simplified, so it does not show details of the equipment being manipulated. Novice workers in particular find the procedure difficult to understand due to this lack of detail, and pictures of equipment inserted in the procedure are not enough for comprehension. A computerized procedure linked with virtual equipment is one of the best solutions to increase the level of detail in a procedure. Virtual equipment, however, still has the limitation that it cannot provide real-time information, because it is not synchronized with the real plant

  19. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    Directory of Open Access Journals (Sweden)

    Mark R. Lafave

    2015-01-01

    Utilization of conceptual models in field-based emergency care currently borrows from existing standards of medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury spanning to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline.

  20. Validation of internet-based self-reported anthropometric, demographic data and participant identity in the Food4Me study

    Science.gov (United States)

    BACKGROUND In e-health intervention studies, there are concerns about the reliability of internet-based, self-reported (SR) data and about the potential for identity fraud. This study introduced and tested a novel procedure for assessing the validity of internet-based, SR identity and validated anth...

  1. An expert system-based aid for analysis of Emergency Operating Procedures in NPPs

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Beraha, D.

    1996-01-01

    Emergency Operating Procedures (EOPs) in general, and accident management (AM) in particular, have played a significant part in the safety philosophy of NPPs for many years. A better methodology for the development and validation of EOPs is desired. A prototype of an Emergency Operating Procedures Analysis System (EOPAS), which has been developed at GRS, is presented in the paper. The hardware configuration and software organisation of the system are briefly reviewed. The main components of the system, such as the knowledge base of an expert system and the engineering simulator, are described. (author)

  2. A validated RP-HPLC method for the determination of Irinotecan hydrochloride residues for cleaning validation in production area

    Directory of Open Access Journals (Sweden)

    Sunil Reddy

    2013-03-01

    Introduction: cleaning validation is an integral part of current good manufacturing practices in the pharmaceutical industry. The main purpose of cleaning validation is to prove the effectiveness and consistency of cleaning of a given piece of pharmaceutical production equipment, to prevent cross contamination and adulteration of drug product with other active ingredients. Objective: a rapid, sensitive and specific reverse phase HPLC method was developed and validated for the quantitative determination of irinotecan hydrochloride in cleaning validation swab samples. Method: the method was validated using a Waters Symmetry Shield RP-18 (250 mm x 4.6 mm, 5 µm) column with an isocratic mobile phase containing a mixture of 0.02 M potassium di-hydrogen ortho-phosphate (pH adjusted to 3.5 with ortho-phosphoric acid), methanol and acetonitrile (60:20:20 v/v/v). The flow rate of the mobile phase was 1.0 mL/min with a column temperature of 25°C and detection wavelength at 220 nm. The sample injection volume was 100 µL. Results: the calibration curve was linear over a concentration range from 0.024 to 0.143 µg/mL with a correlation coefficient of 0.997. The intra-day and inter-day precision expressed as relative standard deviation were below 3.2%. The recoveries obtained from stainless steel, PCGI, epoxy, glass and Dacron cloth surfaces were more than 85%, and there was no interference from the cotton swab. The detection limit (DL) and quantitation limit (QL) were 0.008 and 0.023 µg/mL, respectively. Conclusion: the developed method was validated with respect to specificity, linearity, limit of detection and quantification, accuracy, precision and solution stability. The overall procedure can be used as part of a cleaning validation program in pharmaceutical manufacture of irinotecan hydrochloride.
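
    Figures like the linearity, DL and QL reported above are typically derived from a calibration curve. The sketch below uses invented concentration/peak-area pairs and the common ICH-style formulas (DL = 3.3·sigma/slope, QL = 10·sigma/slope); the abstract does not state which exact formulas were used, so this is a generic illustration rather than the paper's calculation.

        # Sketch of linearity and detection/quantitation-limit calculations for a
        # calibration curve. Data points are invented placeholders; the DL/QL
        # formulas follow the common ICH Q2 convention (assumed, not confirmed).
        import numpy as np
        from scipy import stats

        conc = np.array([0.024, 0.048, 0.072, 0.096, 0.120, 0.143])  # µg/mL (placeholder)
        area = np.array([1210, 2435, 3640, 4890, 6050, 7190])        # peak areas (placeholder)

        fit = stats.linregress(conc, area)
        residuals = area - (fit.slope * conc + fit.intercept)
        sigma = residuals.std(ddof=2)        # residual standard deviation of the fit

        print(f"slope = {fit.slope:.1f}, r = {fit.rvalue:.4f}")
        print(f"DL = {3.3 * sigma / fit.slope:.4f} µg/mL")
        print(f"QL = {10  * sigma / fit.slope:.4f} µg/mL")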

  3. Assessment of juveniles testimonies’ validity

    Directory of Open Access Journals (Sweden)

    Dozortseva E.G.

    2015-12-01

    The article presents a review of English-language publications concerning the history and current state of differential psychological assessment of the validity of testimonies produced by child and adolescent victims of crimes. The topicality of the problem in Russia is high due to the tendency of Russian specialists to use methodological tools and instruments developed abroad in this sphere for forensic assessments of witness testimony veracity. A system of Statement Validity Analysis (SVA) by means of Criteria-Based Content Analysis (CBCA) and a Validity Checklist is described. The results of laboratory and field studies of the validity of CBCA criteria on the basis of child and adult witnesses are discussed. The data display a good differentiating capacity of the method, but also a high probability of error. The researchers recommend implementation of SVA in the criminal investigation process, but not in forensic assessment. New promising developments in the field of methods for differentiating witness statements based on real experience from fictional ones are noted. The conclusion is drawn that empirical studies and special work on the adaptation and development of new approaches should precede their implementation into Russian criminal investigation and forensic assessment practice

  4. HPLC method validated for the simultaneous analysis of cichoric acid and alkamides in Echinacea purpurea plants and products

    DEFF Research Database (Denmark)

    Mølgaard, Per; Johnsen, Søren; Christensen, Peter

    2003-01-01

    phenolics as well as the lipophilic alkamides are released from the samples, followed by the analytical HPLC procedure for quantitative determination of these compounds. The method is the first one validated for the determination of these two groups of compounds in the same procedure. Naringenin has been...

  5. A rare cause for Hartmann’s procedure due to biliary stent migration: A case report

    Directory of Open Access Journals (Sweden)

    Petros Siaperas

    2017-01-01

    Conclusion: In cases of non-complicated stent migration, endoscopic retrieval is the indicated treatment. In patients who suffer serious complications due to stent dislocation, emergency surgery may be the proper treatment option.

  6. Validity and Reliability of Internalized Stigma of Mental Illness (Cantonese)

    Science.gov (United States)

    Young, Daniel Kim-Wan; Ng, Petrus Y. N.; Pan, Jia-Yan; Cheng, Daphne

    2017-01-01

    Purpose: This study aims to translate and test the reliability and validity of the Internalized Stigma of Mental Illness-Cantonese (ISMI-C). Methods: The original English version of ISMI is translated into the ISMI-C by going through forward and backward translation procedure. A cross-sectional research design is adopted that involved 295…

  7. Administrative Procedure Act and mass procedures (illustrated by the nuclear licensing procedure)

    International Nuclear Information System (INIS)

    Naumann, R.

    1977-01-01

    The report deals with the administrative procedure law of 25.5.76 of the Federal Government, especially with its significance for the administrative licensing procedures for nuclear power plants, as far as so-called mass procedures are concerned. (UN) [de

  8. Validation of KENO-based criticality calculations at Rocky Flats

    International Nuclear Information System (INIS)

    Felsher, P.D.; McKamy, J.N.; Monahan, S.P.

    1992-01-01

    In the absence of experimental data, it is necessary to rely on computer-based computational methods in evaluating the criticality condition of a nuclear system. The validity of the computer codes is established in a two-part procedure as outlined in ANSI/ANS 8.1. The first step, usually the responsibility of the code developer, involves verification that the algorithmic structure of the code is performing the intended mathematical operations correctly. The second step involves an assessment of the code's ability to realistically portray the governing physical processes in question. This is accomplished by determining the code's bias, or systematic error, through a comparison of computational results to accepted values obtained experimentally. In this paper, the authors discuss the validation process for KENO and the Hansen-Roach cross sections in use at EG&G Rocky Flats. The validation process at Rocky Flats consists of both global and local techniques. The global validation resulted in a maximum k-effective limit of 0.95 for the limiting-accident scenarios of a criticality evaluation
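
    A generic sketch of how a code bias and an acceptance limit might be derived from benchmark comparisons is given below. The k-effective values and the simple mean-minus-margin treatment are illustrative assumptions only; they are not the specific statistical method or data used at Rocky Flats.

        # Generic sketch: estimate a code bias from benchmark experiments that are
        # critical by definition (k = 1), then form an upper subcritical limit.
        # Values and margins are invented; not the Rocky Flats methodology.
        import statistics

        k_calc = [0.9981, 1.0012, 0.9974, 0.9968, 1.0005, 0.9990]   # calculated k-eff (placeholder)

        bias = statistics.mean(k_calc) - 1.0      # systematic error of the code
        sigma = statistics.stdev(k_calc)          # spread of the benchmark results
        admin_margin = 0.05                       # administrative safety margin (assumed)

        # Credit a negative bias but never a positive one
        usl = 1.0 + min(bias, 0.0) - 2 * sigma - admin_margin
        print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, upper subcritical limit = {usl:.4f}")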

  9. A case of disseminated peritoneal leiomyomatosis after two laparoscopic procedures due to uterine fibroids.

    Science.gov (United States)

    Ciebiera, Michał; Słabuszewska-Jóźwiak, Aneta; Zaręba, Kornelia; Jakiel, Grzegorz

    2017-01-01

    Disseminated peritoneal leiomyomatosis (DPL) is a rare disorder characterized by the presence of multifocal nodules and tumors composed of proliferating smooth muscle tissue, spread throughout the peritoneum. Estrogens and progesterone are considered to be the main factors initiating the formation of disseminated leiomyomatosis. Disseminated peritoneal leiomyomatosis is often asymptomatic; acyclic vaginal bleeding or pain in the lower abdomen is usually associated with a uterine corpus remodeled by leiomyomas. Disseminated peritoneal leiomyomatosis can also have other, ambiguous presentations. The difficulty in DPL diagnosis is that it is not always accompanied by scattered leiomyomas and can occur after menopause. Some cases of DPL are associated with surgical procedures on uterine fibroids, especially with the use of a morcellator. We present the case of a 39-year-old woman with DPL who underwent laparoscopic myomectomy and laparoscopic supracervical hysterectomy before the final diagnosis of DPL. After the complete surgical treatment performed in our center, the patient is free of symptoms.

  10. Emergency operation procedure navigation to avoid commission errors

    International Nuclear Information System (INIS)

    Gofuku, Akio; Ito, Koji

    2004-01-01

    New types of operation control systems equipped with a large screen and CRT-based operation panels have been installed in newly constructed nuclear power plants. The operators can share important information on plant conditions via the large screen, and the operation control system can track operator actions through the computers connected to the operation panels. The software switches placed on the CRT-based operation panels, however, carry the risk that operators may mistakenly manipulate a software switch irrelevant to their current operation. This study develops an operation procedure navigation technique to avoid this kind of commission error. The system sits between the CRT-based operation panels and the plant control systems and checks whether each operator action follows the operation procedure of the operation manuals. When the operation is correct, it is executed as if the operation command had been transmitted directly to the control systems. If the operation does not follow the procedure, the system warns the operators of the commission error. This paper describes the operation navigation technique, the format of the base operation model, and a prototype operation navigation system for a three-loop pressurized water reactor plant. The validity of the prototype system is demonstrated by operation procedure navigation for a steam generator tube rupture accident. (author)
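
    The checking logic described above (intercept an operator command, compare it against the expected step of the loaded procedure, and either pass it through or raise a commission-error warning) can be sketched in a few lines. The class, step, and command names below are hypothetical; the paper's actual base operation model format is not reproduced here.

        # Minimal sketch of procedure navigation: commands from the CRT-based panels
        # are forwarded only if they match the current step of the loaded procedure.
        # All names are hypothetical placeholders.
        class ProcedureNavigator:
            def __init__(self, steps):
                self.steps = steps          # ordered list of expected commands
                self.index = 0              # pointer to the current procedure step

            def handle(self, command, send_to_plant):
                """Forward a valid command to the plant control system, else warn."""
                expected = self.steps[self.index] if self.index < len(self.steps) else None
                if command == expected:
                    self.index += 1
                    send_to_plant(command)
                    return "executed"
                return f"WARNING: '{command}' does not follow the procedure (expected '{expected}')"

        # Example with a fragment of a hypothetical SGTR response procedure
        nav = ProcedureNavigator(["trip_reactor", "isolate_ruptured_sg", "start_aux_feed"])
        print(nav.handle("trip_reactor", send_to_plant=print))      # in order -> executed
        print(nav.handle("start_aux_feed", send_to_plant=print))    # out of order -> warning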

  11. Development of a 3-dimensional flow analysis procedure for axial pump impellers

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Kim, Jong In; Park, Jin Seok; Huh, Houng Huh; Chang, Moon Hee

    1999-06-01

    A fluid dynamic analysis procedure was developed using the three-dimensional solid model of an axial pump impeller which was theoretically designed using I-DEAS CAD/CAM/CAE software. The CFD software FLUENT was used in the flow field analysis. The steady-state flow regime in the MCP impeller and diffuser was simulated using the developed procedure. The results of the calculation were analyzed to confirm whether the design requirements were properly implemented in the impeller model. The validity of the developed procedure was demonstrated by comparing the calculation results with the experimental data available. The pump performance at the design point could be effectively predicted using the developed procedure. The computed velocity distributions showed good agreement with the experimental data except for the regions near the wall. The computed head, however, was over-predicted compared with the experiment. The design period and cost required for the development of an axial pump impeller can be significantly reduced by applying the proposed methodology. (author). 7 refs., 2 tabs

  12. Do coder characteristics influence validity of ICD-10 hospital discharge data?

    Directory of Open Access Journals (Sweden)

    Beck Cynthia A

    2010-04-01

    Background: Administrative data are widely used to study health systems and make important health policy decisions. Yet little is known about the influence of coder characteristics on administrative data validity in these studies. Our goal was to describe the relationship between several measures of validity in coded hospital discharge data and (1) coders' volume of coding (≥13,000 records annually vs. fewer), (2) coders' employment status, and (3) hospital type. Methods: This descriptive study examined 6 indicators of face validity in ICD-10 coded discharge records from 4 hospitals in Calgary, Canada between April 2002 and March 2007. Specifically, mean number of coded diagnoses, procedures, complications, Z-codes, and codes ending in 8 or 9 were compared by coding volume and employment status, as well as hospital type. The mean number of diagnoses was also compared across coder characteristics for 6 major conditions of varying complexity. Next, kappa statistics were computed to assess agreement between discharge data and linked chart data reabstracted by nursing chart reviewers. Kappas were compared across coder characteristics. Results: 422,618 discharge records were coded by 59 coders during the study period. The mean number of diagnoses per record decreased from 5.2 in 2002/2003 to 3.9 in 2006/2007, while the number of records coded annually increased from 69,613 to 102,842. Coders at the tertiary hospital coded the most diagnoses (5.0 compared with 3.9 and 3.8 at other sites). There was no variation by coder or site characteristics for any other face validity indicator. The mean number of diagnoses increased from 1.5 to 7.9 with increasing complexity of the major diagnosis, but did not vary with coder characteristics. Agreement (kappa) between coded data and chart review did not show any consistent pattern with respect to coder characteristics. Conclusions: This large study suggests that coder characteristics do not influence the validity of hospital discharge data. Other jurisdictions might benefit from
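
    The agreement statistic used above, Cohen's kappa between coded discharge data and reabstracted chart data, can be computed as in the sketch below. The label lists are invented placeholders, not the Calgary data.

        # Sketch of a chance-corrected agreement (Cohen's kappa) calculation between
        # two raters of the same records. Labels are invented placeholders.
        from sklearn.metrics import cohen_kappa_score

        coded_data   = ["AMI", "AMI", "none", "CHF", "none", "AMI", "CHF", "none"]
        chart_review = ["AMI", "none", "none", "CHF", "none", "AMI", "CHF", "CHF"]

        kappa = cohen_kappa_score(coded_data, chart_review)
        print(f"Cohen's kappa = {kappa:.2f}")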

  13. Assessment of validity with polytrauma Veteran populations.

    Science.gov (United States)

    Bush, Shane S; Bass, Carmela

    2015-01-01

    Veterans with polytrauma have suffered injuries to multiple body parts and organ systems, including the brain. The injuries can generate a triad of physical, neurologic/cognitive, and emotional symptoms. Accurate diagnosis is essential for the treatment of these conditions and for fair allocation of benefits. To accurately diagnose polytrauma disorders and their related problems, clinicians take into account the validity of reported history and symptoms, as well as clinical presentations. The purpose of this article is to describe the assessment of validity with polytrauma Veteran populations. A review of scholarly and other relevant literature, together with clinical experience, is utilized. A multimethod approach to validity assessment that includes objective, standardized measures increases the confidence that can be placed in the accuracy of self-reported symptoms and physical, cognitive, and emotional test results. Due to the multivariate nature of polytrauma and the multiple disciplines that play a role in diagnosis and treatment, an ideal model of validity assessment with polytrauma Veteran populations utilizes neurocognitive, neurological, neuropsychiatric, and behavioral measures of validity. An overview of these validity assessment approaches as applied to polytrauma Veteran populations is presented. Veterans, the VA, and society are best served when accurate diagnoses are made.

  14. Validation of the XLACS code related to contribution of resolved and unresolved resonances and background cross sections

    International Nuclear Information System (INIS)

    Anaf, J.; Chalhoub, E.S.

    1990-01-01

    The procedures for calculating the contributions of resolved and unresolved resonances and background cross sections in the XLACS code were revised. A constant weighting function and zero Kelvin temperature were considered. The discrepancies found were corrected, and the validated XLACS code now generates results that are correct and in accordance with its originally established procedures. (author)

  15. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    OpenAIRE

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-01-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test-retest reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensio...
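
    Internal consistency of the kind reported for such scales is usually expressed as Cronbach's alpha. The sketch below shows the standard formula on an invented response matrix (rows = respondents, columns = items); the value it prints is meaningless because the data are random placeholders.

        # Sketch of a Cronbach's alpha calculation for a multi-item scale.
        # Placeholder data only; not the HK-LS responses.
        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]                              # number of items
            item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
            return k / (k - 1) * (1 - item_vars / total_var)

        responses = np.random.default_rng(0).integers(0, 2, size=(30, 22))  # 30 people, 22 items
        print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")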

  16. Thermal-hydraulic codes validation for safety analysis of NPPs with RBMK

    International Nuclear Information System (INIS)

    Brus, N.A.; Ioussoupov, O.E.

    2000-01-01

    This work is devoted to the validation of western thermal-hydraulic codes (RELAP5/MOD3.2 and ATHLET 1.1 Cycle C) in application to Russian-designed light water reactors. Such validation is needed due to the features of the RBMK reactor design and thermal-hydraulics in comparison with the PWR and BWR reactors for which these codes were developed and validated. The validation studies conclude with a comparison of results calculated with the thermal-hydraulic codes against experimental data obtained earlier at thermal-hydraulic test facilities. (authors)

  17. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  18. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  19. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  20. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data

  1. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data

  2. Differentiation of ileostomy from colostomy procedures: assessing the accuracy of current procedural terminology codes and the utility of natural language processing.

    Science.gov (United States)

    Vo, Elaine; Davila, Jessica A; Hou, Jason; Hodge, Krystle; Li, Linda T; Suliburk, James W; Kao, Lillian S; Berger, David H; Liang, Mike K

    2013-08-01

    Large databases provide a wealth of information for researchers, but identifying patient cohorts often relies on the use of current procedural terminology (CPT) codes. In particular, studies of stoma surgery have been limited by the accuracy of CPT codes in identifying and differentiating ileostomy procedures from colostomy procedures. It is important to make this distinction because the prevalence of complications associated with stoma formation and reversal differs dramatically between types of stoma. Natural language processing (NLP) is a process that allows text-based searching. The Automated Retrieval Console is an NLP-based software tool that allows investigators to design and perform NLP-assisted document classification. In this study, we evaluated the role of CPT codes and NLP in differentiating ileostomy from colostomy procedures. Using CPT codes, we conducted a retrospective study that identified all patients undergoing a stoma-related procedure at a single institution between January 2005 and December 2011. All operative reports during this time were reviewed manually to abstract the following variables: formation or reversal and ileostomy or colostomy. Sensitivity and specificity for validation of the CPT codes against the master surgery schedule were calculated. Operative reports were evaluated by use of NLP to differentiate ileostomy- from colostomy-related procedures. Sensitivity and specificity for identifying patients with ileostomy or colostomy procedures were calculated for CPT codes and NLP for the entire cohort. CPT codes performed well in identifying stoma procedures (sensitivity 87.4%, specificity 97.5%). A total of 664 stoma procedures were identified by CPT codes between 2005 and 2011. The CPT codes were adequate in identifying stoma formation (sensitivity 97.7%, specificity 72.4%) and stoma reversal (sensitivity 74.1%, specificity 98.7%), but they were inadequate in identifying ileostomy (sensitivity 35.0%, specificity 88.1%) and colostomy (75
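
    The sensitivity/specificity figures quoted above come from comparing a classifier (CPT codes or NLP) against a reference standard such as manual chart review. The sketch below shows that calculation on invented labels; it is not the study's data or the Automated Retrieval Console.

        # Sketch of sensitivity/specificity for one class (here "ileostomy") against
        # a manual reference standard. Labels are invented placeholders.
        from collections import Counter

        truth     = ["ileostomy", "colostomy", "ileostomy", "colostomy", "ileostomy", "colostomy"]
        predicted = ["ileostomy", "colostomy", "colostomy", "colostomy", "ileostomy", "ileostomy"]

        def sens_spec(truth, predicted, positive):
            counts = Counter((t == positive, p == positive) for t, p in zip(truth, predicted))
            tp, fn = counts[(True, True)], counts[(True, False)]
            tn, fp = counts[(False, False)], counts[(False, True)]
            return tp / (tp + fn), tn / (tn + fp)

        sens, spec = sens_spec(truth, predicted, positive="ileostomy")
        print(f"ileostomy: sensitivity = {sens:.2f}, specificity = {spec:.2f}")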

  3. H.R. 2372: A Bill to provide jurisdiction and procedures for claims for compassionate payments for injuries due to exposure to radiation from nuclear testing. Introduced in the House of Representatives, One Hundredth First Congress, First Session, May 16, 1989

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    H.R. 2372 is a bill to provide jurisdiction and procedures for claims for compassionate payments for injuries due to exposure to radiation from nuclear testing. The Act proposes the use of a Trust Fund Board of Directors to disburse funds under prescribed conditions

  4. Diagnostic flexible pharyngo-laryngoscopy: development of a procedure specific assessment tool using a Delphi methodology.

    Science.gov (United States)

    Melchiors, Jacob; Henriksen, Mikael Johannes Vuokko; Dikkers, Frederik G; Gavilán, Javier; Noordzij, J Pieter; Fried, Marvin P; Novakovic, Daniel; Fagan, Johannes; Charabi, Birgitte W; Konge, Lars; von Buchwald, Christian

    2018-05-01

    Proper training and assessment of skill in flexible pharyngo-laryngoscopy are central in the education of otorhinolaryngologists. To facilitate an evidence-based approach to curriculum development in this field, a structured analysis of what constitutes flexible pharyngo-laryngoscopy is necessary. Our aim was to develop an assessment tool based on this analysis. We conducted an international Delphi study involving experts from twelve countries in five continents. Utilizing reiterative assessment, the panel defined the procedure and reached consensus (defined as 80% agreement) on the phrasing of an assessment tool. Fifty panelists completed the Delphi process. The median age of the panelists was 44 years (range 33-64 years). Median experience in otorhinolaryngology was 15 years (range 6-35 years). Twenty-five were specialized in laryngology, 16 were head and neck surgeons, and nine were general otorhinolaryngologists. An assessment tool was created consisting of twelve distinct items. Conclusion: The gathering of validity evidence for assessment of core procedural skills within otorhinolaryngology is central to the development of a competence-based education. The use of an international Delphi panel allows for the creation of an assessment tool which is widely applicable and valid. This work allows for an informed approach to technical skills training for flexible pharyngo-laryngoscopy and, as further validity evidence is gathered, allows for a valid assessment of clinical performance within this important skillset.

  5. Emergency procedures

    International Nuclear Information System (INIS)

    Abd Nasir Ibrahim; Azali Muhammad; Ab Razak Hamzah; Abd Aziz Mohamed; Mohammad Pauzi Ismail

    2004-01-01

    The following subjects are discussed - Emergency Procedures: emergency equipment, emergency procedures; emergency procedure involving X-Ray equipment; emergency procedure involving radioactive sources

  6. 76 FR 213 - National Environmental Policy Act Implementing Procedures

    Science.gov (United States)

    2011-01-03

    ... due to, for example, a threatened violation of applicable environmental, safety, and health... legally enforceable rights, benefits, or responsibilities, substantive or procedural, not otherwise... failed in indoor tests. Whether the explosives or propellants were tested indoors or outdoors, the...

  7. Uncertainty in soil-structure interaction analysis of a nuclear power plant due to different analytical techniques

    International Nuclear Information System (INIS)

    Chen, J.C.; Chun, R.C.; Goudreau, G.L.; Maslenikov, O.R.; Johnson, J.J.

    1984-01-01

    This paper summarizes the results of the dynamic response analysis of the Zion reactor containment building using three different soil-structure interaction (SSI) analytical procedures: the substructure method, CLASSI; the equivalent linear finite element approach, ALUSH; and the nonlinear finite element procedure, DYNA3D. Uncertainties in analyzing a soil-structure system due to SSI analysis procedures were investigated. Responses at selected locations in the structure were compared: peak accelerations and response spectra

  8. Face, Content, and Construct Validations of Endoscopic Needle Injection Simulator for Transurethral Bulking Agent in Treatment of Stress Urinary Incontinence.

    Science.gov (United States)

    Farhan, Bilal; Soltani, Tandis; Do, Rebecca; Perez, Claudia; Choi, Hanul; Ghoniem, Gamal

    2018-05-02

    Endoscopic injection of urethral bulking agents is an office procedure that is used to treat stress urinary incontinence secondary to internal sphincteric deficiency. Validation studies are an important part of simulator evaluation and are considered an important step in establishing the effectiveness of simulation-based training. The endoscopic needle injection (ENI) simulator has not been formally validated, although it has been used widely at the University of California, Irvine. We aimed to assess the face, content, and construct validity of the UC Irvine ENI simulator. Dissected female porcine bladders were mounted in a modified Hysteroscopy Diagnostic Trainer. Using routine endoscopic equipment for this procedure with video monitoring, 6 urologists (expert group) and 6 urology trainees (novice group) completed urethral bulking agent injections on a total of 12 bladders using the ENI simulator. Face and content validity were assessed using a structured quantitative survey rating the realism. Construct validity was assessed by comparing the performance, time of the procedure, and the occlusive (anatomical and functional) evaluations between the experts and novices. Trainees also completed a postprocedure feedback survey. Effective injections were evaluated by measuring the retrograde urethral opening pressure, visual cystoscopic coaptation, and postprocedure gross anatomic examination. All 12 participants felt the simulator was a good training tool and should be used as an essential part of urology training (face validity). The ENI simulator showed good face and content validity, with average scores of 3.9/5 and 3.8/5 from the experts and the novices, respectively. Content validity evaluation showed that most aspects of the simulator were adequately realistic (mean Likert scores 3.8-3.9/5). However, the bladder does not bleed and is sometimes thin. Experts significantly outperformed novices. The ENI simulator shows face, content and construct validity, although few

  9. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to high-stakes in the clinical setting, it is critical to calculate the effect of these assumptions in the CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
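
    The central idea above, "parsing out" non-model error sources before attributing the remaining CFD-vs-PIV discrepancy to the solver's modeling assumptions, can be sketched numerically. The velocity profiles and uncertainty magnitudes below are invented, and the quadrature combination of the other error sources is an assumption of this sketch, not necessarily the paper's exact procedure.

        # Rough sketch of isolating a "model error" from a total CFD-vs-experiment
        # discrepancy by subtracting other known error sources.
        # All numbers are placeholders; the decomposition is an assumption.
        import numpy as np

        v_cfd = np.array([0.42, 0.55, 0.61, 0.58, 0.47])   # m/s along validation line (placeholder)
        v_piv = np.array([0.40, 0.52, 0.64, 0.60, 0.44])   # m/s, PIV measurement (placeholder)

        u_num = 0.01   # numerical (discretization) uncertainty, m/s (assumed)
        u_inp = 0.01   # geometry/boundary-condition input uncertainty, m/s (assumed)
        u_piv = 0.02   # PIV measurement uncertainty, m/s (assumed)

        total_discrepancy = np.abs(v_cfd - v_piv)
        other_sources = np.sqrt(u_num**2 + u_inp**2 + u_piv**2)

        # Attribute to the model only what the other error sources cannot explain
        model_error = np.clip(total_discrepancy - other_sources, 0.0, None)
        print("model error per point (m/s):", np.round(model_error, 3))
        print(f"mean relative model error: {np.mean(model_error / np.abs(v_piv)) * 100:.1f}%")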

  10. Numerical treatment of experimental data in calibration procedures

    International Nuclear Information System (INIS)

    Moreno, C.

    1993-06-01

    A discussion of a numerical procedure to find the proportionality factor between two measured quantities is given in the framework of the least-squares method. Variable, as well as constant, amounts of experimental uncertainties are considered for each variable along their measured range. The variance of the proportionality factor is explicitly given as a closed analytical expression valid for the general case. Limits of the results obtained here have been studied allowing comparisons with those obtained using classical least-squares expressions. Analytical and numerical examples are also discussed. (author). 11 refs, 1 fig., 1 tab
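
    For the simpler case where only one variable carries uncertainty, the textbook weighted least-squares estimate of a proportionality factor k in y = k·x has a closed form, k = Σ(x·y/σ²)/Σ(x²/σ²) with Var(k) = 1/Σ(x²/σ²). The sketch below illustrates that standard result on invented data; the paper's more general expression, which also allows uncertainties in both variables, is not reproduced here.

        # Sketch of the standard weighted least-squares proportionality factor for
        # y = k*x with per-point uncertainties sigma_i. Data are placeholders.
        import numpy as np

        x     = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y     = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
        sigma = np.array([0.1, 0.1, 0.2, 0.2, 0.3])   # uncertainties may vary along the range

        w = 1.0 / sigma**2
        k = np.sum(w * x * y) / np.sum(w * x**2)
        var_k = 1.0 / np.sum(w * x**2)

        print(f"k = {k:.4f} +/- {np.sqrt(var_k):.4f}")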

  11. Development and validation of Incontinence - Activity Participation Scale for spinal cord injury.

    Science.gov (United States)

    Walia, Priya; Kaur, Jaskirat

    2017-01-01

    We aimed to develop and validate an Incontinence - Activity Participation Scale (I-APS) for the measurement of activity limitation and participation restriction due to bladder problems in spinal cord injury (SCI). The development process was initiated by formulating open-ended questions after a thorough review of the literature, which were then administered to SCI participants, caretakers, and professionals working with SCI. Items were generated based on their responses and an initial draft of the scale was formulated. This initial draft, containing 77 items, was then administered to 56 SCI participants for item reduction using factor analysis, and a prefinal version of the scale containing thirty items was obtained. Content validity and face validity were then established. The I-APS is both a health professional- and self-administered questionnaire including two domains, activities of daily living and occupation, with 16 items and a content validity of 0.84. The overall internal consistency reliability was 0.86. The I-APS is a valid, comprehensive instrument that measures activity limitation and participation restrictions due to bladder problems in SCI.

  12. Intrasubject Predictions of Vocational Preference: Convergent Validation via the Decision Theoretic Paradigm.

    Science.gov (United States)

    Monahan, Carlyn J.; Muchinsky, Paul M.

    1985-01-01

    The degree of convergent validity among four methods of identifying vocational preferences is assessed via the decision theoretic paradigm. Vocational preferences identified by Holland's Vocational Preference Inventory (VPI), a rating procedure, and ranking were compared with preferences identified from a policy-capturing model developed from an…

  13. Validation of a spectrophotometric methodology for the quantification of polysaccharides from roots of Operculina macrocarpa (jalapa

    Directory of Open Access Journals (Sweden)

    Marcos A.M. Galvão

    The roots of Operculina macrocarpa (L.) Urb., Convolvulaceae, are widely used in Brazilian traditional medicine as a laxative and purgative. The biological properties of this drug material have been attributed to its polysaccharide content. Thus, the aim of this study was to evaluate the polysaccharide content in drug material from O. macrocarpa by spectrophotometric quantitative analysis. The root was used as plant material and the botanical identification was performed by macro- and microscopic analysis. The plant material was used to validate the spectrophotometric procedure at 490 nm for the quantification of the reaction product of the drug polysaccharides with phenol-sulfuric acid solution. The analytical procedure was evaluated in order to comply with the necessary legal requirements by determination of the following parameters: specificity, linearity, selectivity, precision, accuracy and robustness. This study provides a simple and valid analytical procedure (linear, precise, accurate and reproducible), which can be satisfactorily used for quality control and standardization of the herbal drug from O. macrocarpa.

  14. Validation of the dynamic model for a pressurized water reactor

    International Nuclear Information System (INIS)

    Zwingelstein, Gilles.

    1979-01-01

    Dynamic model validation is a necessary procedure to assure that the developed empirical or physical models satisfactorily represent the dynamic behavior of the actual plant during normal or abnormal transients. For small transients, physical models that represent the isolated core, the isolated steam generator and the overall pressurized water reactor are described. Using data collected during the step power changes that occurred during the startup procedures, comparisons of computed and actual transients are given at 30% and 100% of full power. The agreement between the transients derived from the model and those recorded on the plant indicates that the developed models are well suited for functional or control studies

  15. Verification and software validation for nuclear instrumentation; Verificacion y validacion de software para instrumentacion nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Gaytan G, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Salgado G, J. R. [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan No. 779, Col. Narvarte, 03020 Mexico D. F. (Mexico); De Andrade O, E. [Universidad Federal de Rio de Janeiro, Caixa Postal 68509, 21945-970 Rio de Janeiro (Brazil); Ramirez G, A., E-mail: elvira.gaytan@inin.gob.mx [Comision Federal de Electricidad, Gerencia de Centrales Nucleoelectricas, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    This work presents a software verification and validation methodology to be applied to nuclear instruments with associated software. The methodology was developed under the auspices of the IAEA, through the regional projects RLA4022 (ARCAL XCIX) and RLA1011 (RLA CXXIII), led by Mexico. In the first project, three plans and three procedures were elaborated taking IEEE standards into consideration, and in the second project these documents were updated considering ISO and IEC standards. The developed methodology has been distributed to the Latin American countries participating in the ARCAL projects, and two related courses have been given with the participation of several countries and of Mexican institutions such as the Instituto Nacional de Investigaciones Nucleares (ININ), the Comision Federal de Electricidad (CFE) and the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). At the ININ, due to the need to work with software quality assurance for systems of the CFE nuclear power plant, a Software Quality Assurance Plan and five procedures were developed in 2004, qualifying the ININ for software development for the CFE nuclear power plant. These first documents were developed taking IEEE standards and NRC regulatory guides as references, and were the first step in the development of the methodology. (Author)

  16. 15 CFR Supplement No. 9 to Part 748 - End-User Review Committee Procedures

    Science.gov (United States)

    2010-01-01

    ... potential validated end-user is not included in calculating the 30 calendar day deadline for the ERC's... matter to the Advisory Committee on Export Policy (ACEP). The procedures and time frame for escalating... Supplement No. 7 to this part operates as clearance by all member agencies to publish the amendment in the...

  17. Reduced infant response to a routine care procedure after sucrose analgesia.

    Science.gov (United States)

    Taddio, Anna; Shah, Vibhuti; Katz, Joel

    2009-03-01

    Sucrose has analgesic and calming effects in newborns. To date, it is not known whether the beneficial effects extend to caregiving procedures that are performed after painful procedures. Our objective was to determine the effect of sucrose analgesia for procedural pain on infant pain responses during a subsequent caregiving procedure. We conducted a double-blind, randomized, controlled trial. Healthy neonates within 2 strata (normal infants and infants of diabetic mothers) were randomly assigned to a sucrose or placebo water group before all needle procedures after birth. Pain response during a diaper change performed after venipuncture for the newborn screening test was determined by using a validated multidimensional measure, the Premature Infant Pain Profile. The study was conducted between September 15, 2003, and July 27, 2004. Altogether, 412 parents were approached; 263 consented. Twenty-three infants were not assigned, leaving 240 for participation (n = 120 per group), with an equal number in each infant strata. Of those, 186 (78%) completed the study. There were no significant differences in birth characteristics between groups. During diaper change, sucrose-treated infants had lower pain scores than placebo-treated infants. The relative risk of having pain, defined as a Premature Infant Pain Profile score of ≥6, was 0.64 with sucrose compared with placebo. This study demonstrates that when used to manage pain, sucrose reduces the pain response to a subsequent routine caregiving procedure. Therefore, the benefits of sucrose analgesia extend beyond the painful event to other aversive and potentially painful procedures.

  18. EFAM GTP 02 - the GKSS test procedure for determining the fracture behaviour of materials

    International Nuclear Information System (INIS)

    Schwalbe, K.H.; Heerens, J.; Zerbst, U.; Kocak, M.

    2002-01-01

    This document describes a unified fracture mechanics test method in procedural form for quasi-static testing of materials. It is based on the ESIS Procedures P1 and P2 and introduces additional features, such as middle cracked tension specimens, shallow cracks, the δ5 crack tip opening displacement, the crack tip opening angle, the rate of dissipated energy, testing of weldments, and guidance for statistical treatment of scatter. Special validity criteria are given for tests on specimens with low constraint. This document represents an updated version of EFAM GTP 94. (orig.) [de

  19. [Cervical lymphoadenopathy due to Pseudomonas aeruginosa following mesotherapy].

    Science.gov (United States)

    Shaladi, Ali Muftah; Crestani, Francesco; Bocchi, Anna; Saltari, Maria Rita; Piva, Bruno; Tartari, Stefano

    2009-09-01

    Mesotherapy is a treatment method devised for controlling several diseases by means of subcutaneous microinjections given at or around the affected areas at short time intervals. It is used to treat a variety of medical conditions, among them orthopaedic diseases and rheumatic pain. Mesotherapy is especially indicated for neck pain. The mechanism of action is twofold: a pharmacological effect due to the drug administered, and a reflexogenic effect, the skin containing many nerve endings that are sensitive to the mechanical action of the needle. Although this therapy is safe, like any other medical intervention it cannot be considered free of complications, such as allergies, haematomas, bruising, wheals, granulomas and telangiectasias. Infective complications are also possible, due to pathogenic bacteria inoculated through contamination of the products or the materials used for the procedure, or even from germs on the skin. We present the case of a patient who had cervical lymphadenopathy due to Pseudomonas aeruginosa after mesotherapy treatment for neck pain.

  20. [Mood induction procedures: a critical review].

    Science.gov (United States)

    Gilet, A-L

    2008-06-01

    mood induction procedures in the study of cognitive processes in depression [Clin Psychol Rev 25 (2005) 487-510], borderline personality disorder [J Behav Ther Exp Psychiatry 36 (2005) 226-239] or associated with brain imaging [Am J Psychiatry 161 (2004) 2245-2256]. Then the problems inherent in the use of experimental mood induction procedures are reconsidered. Doubts have indeed arisen about the effectiveness and validity of the mood induction procedures usually used in research. Some authors have questioned whether a sufficient intensity of mood is produced, or raised the possibility that the observed effects are due mainly to demand effects [Br J Psychol 85 (1994) 55-78, Clin Psychol Rev 10 (1990) 669-697, Eur J Soc Psychol 26 (1996) 557-580]. In fact, the various mood induction procedures are not equal with regard to the demand effects observed. The question of demand characteristics with respect to mood induction procedures is still under debate, even if demand effects are thought to be most likely to occur with self-statement techniques (especially the Velten mood induction procedure) or when subjects are explicitly instructed to try to enter a specific mood state [Eur J Soc Psychol 26 (1996) 557-580]. Another question relates to the effectiveness of these various induction procedures and the duration of the induced moods. Generally, the various techniques used produce true changes of mood in most if not all subjects, although certain procedures seem more effective at inducing particular moods [Br J Psychol 85 (1994) 55-78, Clin Psychol Rev 10 (1990) 669-697, Eur J Soc Psychol 26 (1996) 557-580]. The duration of induced moods depends on both the procedure used and the mood induced. Nevertheless, mood induction remains fundamental in the study of the effects of mood on cognitive activities, insofar as it makes it possible to study the effects of negative as well as positive moods.

  1. Dynamical modeling procedure of a Li-ion battery pack suitable for real-time applications

    International Nuclear Information System (INIS)

    Castano, S.; Gauchia, L.; Voncila, E.; Sanz, J.

    2015-01-01

    Highlights: • Dynamical modeling of a 50 A h battery pack composed of 56 cells. • Detailed analysis of SOC tests at realistic performance range imposed by BMS. • We propose an electrical circuit that improves how the battery capacity is modeled. • The model is validated in the SOC range using a real-time experimental setup. - Abstract: This paper presents the modeling of a 50 A h battery pack composed of 56 cells, taking into account real battery performance conditions imposed by the BMS control. The modeling procedure starts with a detailed analysis of experimental charge and discharge SOC tests. Results from these tests are used to obtain the battery model parameters at a realistic performance range (20–80% SOC). The model topology aims to better describe the finite charge contained in a battery pack. The model has been validated at three different SOC values in order to verify the model response at real battery pack operation conditions. The validation tests show that the battery pack model is able to simulate the real battery response with excellent accuracy in the range tested. The proposed modeling procedure is fully applicable to any Li-ion battery pack, regardless of the number of series or parallel cells or its rated capacity
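
    A first-order equivalent-circuit model with coulomb-counting SOC is the usual starting point for real-time pack models of this kind; the sketch below illustrates that generic structure. The parameter values and the single-RC topology are assumptions of the sketch, and the paper's specific circuit, which modifies how the battery capacity is represented, is not reproduced here.

        # Minimal sketch of a first-order (single-RC) equivalent-circuit battery model
        # with coulomb counting for SOC. All parameters are assumed placeholders.
        Q_nom = 50.0 * 3600                 # rated capacity: 50 Ah in coulombs
        R0, R1, C1 = 0.02, 0.015, 2000.0    # ohmic and polarization parameters (assumed)

        def ocv(soc):
            """Very rough open-circuit-voltage curve for a 56-cell series pack (assumed)."""
            return 56 * (3.2 + 0.9 * soc)

        dt = 1.0                    # time step, s
        soc, v_rc = 0.8, 0.0        # start at 80% SOC, relaxed polarization
        current = 25.0              # A, constant discharge (placeholder profile)

        for t in range(600):        # simulate 10 minutes
            soc -= current * dt / Q_nom                       # coulomb counting
            v_rc += dt * (current / C1 - v_rc / (R1 * C1))    # RC polarization voltage
            v_term = ocv(soc) - current * R0 - v_rc           # terminal voltage

        print(f"after 10 min: SOC = {soc:.3f}, terminal voltage = {v_term:.1f} V")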

  2. Due diligence during the integration of physician groups.

    Science.gov (United States)

    Ealey, Tom

    2011-12-01

    While contemplating physician group integration, providers should perform due diligence and ask questions in several key areas to ensure successful integrations: Financial--Is the group producing the revenue expected, and is the revenue cycle managed effectively? Statistical--What are the numbers of encounters, procedures, surgeries, and ancillaries? Compliance--Has the group developed and operated a sound compliance program, and has compliance been a priority? Succession--How many physicians are within three to five years of retirement?

  3. Development and validation of a computational model to study the effect of foot constraint on ankle injury due to external rotation.

    Science.gov (United States)

    Wei, Feng; Hunley, Stanley C; Powell, John W; Haut, Roger C

    2011-02-01

    Recent studies, using two different manners of foot constraint, potted and taped, document altered failure characteristics in the human cadaver ankle under controlled external rotation of the foot. The posterior talofibular ligament (PTaFL) was commonly injured when the foot was constrained in potting material, while the frequency of deltoid ligament injury was higher for the taped foot. In this study an existing multibody computational modeling approach was validated to include the influence of foot constraint, determine the kinematics of the joint under external foot rotation, and consequently obtain strains in various ligaments. It was hypothesized that the location of ankle injury due to excessive levels of external foot rotation is a function of foot constraint. The results from this model simulation supported this hypothesis and helped to explain the mechanisms of injury in the cadaver experiments. An excessive external foot rotation might generate a PTaFL injury for a rigid foot constraint, and an anterior deltoid ligament injury for a pliant foot constraint. The computational models may be further developed and modified to simulate the human response for different shoe designs, as well as on various athletic shoe-surface interfaces, so as to provide a computational basis for optimizing athletic performance with minimal injury risk.

  4. Parametric roll due to hull instantaneous volumetric changes and speed variations

    DEFF Research Database (Denmark)

    Vidic-Perunovic, Jelena; Jensen, Jørgen Juncher

    2009-01-01

    Parametric roll of a containership in head sea condition has been studied in the paper. A time domain routine for GZ righting arm calculation based on exact underwater hull geometry has been implemented into a two-degree-of-freedom procedure for roll response calculation. The speed variation due...
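
    The record is truncated, but the basic mechanism behind parametric roll is often illustrated with a one-degree-of-freedom roll equation whose restoring term varies at the wave encounter frequency (a Mathieu-type equation). The sketch below, with hypothetical coefficients, shows only this textbook simplification; it is not the paper's two-degree-of-freedom procedure based on exact underwater hull geometry.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical coefficients for a Mathieu-type roll equation (illustrative only).
omega_n = 0.3          # natural roll frequency [rad/s]
zeta = 0.05            # linear damping ratio
h = 0.4                # relative variation of the restoring term in waves
omega_e = 2 * omega_n  # encounter frequency near the principal parametric resonance

def roll(t, y):
    phi, dphi = y
    restoring = omega_n**2 * (1.0 + h * np.cos(omega_e * t)) * phi
    return [dphi, -2 * zeta * omega_n * dphi - restoring]

# Small initial heel; in the unstable region the linear model grows without bound.
sol = solve_ivp(roll, (0, 300), [0.01, 0.0], max_step=0.1)
print("max roll angle [rad]:", np.abs(sol.y[0]).max())
```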

  5. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. Given the benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
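
    As a rough illustration of the kind of automation described (launch tests, filter the logs, emit a summary report), the Python sketch below strings those steps together. All names here — the test directory, the runner command and the log keywords — are hypothetical placeholders, not the SMAP testbed's actual interfaces.

```python
import subprocess
from pathlib import Path

# Hypothetical paths and keywords; the real testbed commands and log formats differ.
TEST_DIR = Path("tests")
KEYWORDS = ("ERROR", "FAIL", "REQ-")   # lines worth keeping from the raw logs

def run_test(script: Path) -> str:
    """Run one test script and return its combined output."""
    result = subprocess.run(["python", str(script)], capture_output=True, text=True)
    return result.stdout + result.stderr

def distill(log: str) -> list[str]:
    """Keep only the lines that matter for the validation report."""
    return [line for line in log.splitlines() if any(k in line for k in KEYWORDS)]

def main() -> None:
    with open("validation_report.txt", "w") as report:
        for script in sorted(TEST_DIR.glob("test_*.py")):
            report.write(f"== {script.name} ==\n")
            report.write("\n".join(distill(run_test(script))) + "\n")

if __name__ == "__main__":
    main()
```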

  6. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Science.gov (United States)

    2010-10-01

    ... standards. (f) The reviewer shall analyze all Fault Tree Analyses (FTA), Failure Mode and Effects... for each product vulnerability cited by the reviewer; (4) Identification of any documentation or... not properly followed; (6) Identification of the software verification and validation procedures, as...

  7. Feasibility study and uncertainties in the validation of an existing safety-related control circuit with the ISO 13849-1:2006 design standard

    International Nuclear Information System (INIS)

    Jocelyn, Sabrina; Baudoin, James; Chinniah, Yuvin; Charpentier, Philippe

    2014-01-01

    In industry, machine users and people who modify or integrate equipment often have to evaluate the safety level of a safety-related control circuit that they have not necessarily designed. Such modifications or integrations may involve making an existing machine safe when it does not comply with normative or regulatory specifications. However, how can a circuit performing a safety function be validated a posteriori? Is the validation exercise feasible? What are the difficulties and limitations of such a procedure? The aim of this article is to answer these questions by presenting a validation study of a safety function of an existing machine. A plastic injection molding machine is used for this study, as well as standard ISO 13849-1:2006. Validation consists of performing an a posteriori (post-design) estimation of the performance level of the safety function. The procedure is studied for two contexts of use of the machine: in industry and in the laboratory. The calculations required by the ISO standard were done using Excel, followed by SIStema software. It is shown that, depending on the context of use, the estimated performance level was different for the same safety-related circuit. The variability in the results is explained by the assumptions made by the person undertaking the validation without the involvement of the machine designer. - Highlights: • Validation of the performance level of a safety function is undertaken. • An injection molding machine and ISO 13849-1:2006 standard are used for the procedure. • The procedure is undertaken for two contexts of use of the machine. • In this study, the performance level depends on the context of use. • The assumptions made throughout the study partially explain this difference
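
    One reason the context of use matters is that ISO 13849-1 estimates the mean time to dangerous failure (MTTFd) of each channel from assumed operating conditions. The sketch below uses the commonly cited Annex C style estimation (MTTFd = B10d / (0.1 × nop)) to show how two different usage assumptions shift the MTTFd band for the same component; the B10d value and operating profiles are hypothetical, and this is not a substitute for the standard's full performance-level determination.

```python
# Illustrative MTTFd estimation in the style of ISO 13849-1 Annex C.
# B10d and the operating profiles below are hypothetical placeholders.

B10D_CYCLES = 2_000_000  # cycles until 10% of components fail dangerously (assumed)

def mttfd_years(days_per_year, hours_per_day, seconds_per_cycle):
    """MTTFd = B10d / (0.1 * nop), with nop the mean annual number of operations."""
    n_op = days_per_year * hours_per_day * 3600.0 / seconds_per_cycle
    return B10D_CYCLES / (0.1 * n_op)

def band(mttfd):
    """MTTFd bands commonly used in ISO 13849-1 (low / medium / high)."""
    if mttfd < 3:
        return "not acceptable"
    if mttfd < 10:
        return "low"
    if mttfd < 30:
        return "medium"
    return "high" if mttfd <= 100 else "high (capped at 100 years)"

# Two assumed contexts of use, e.g. an industrial line versus occasional laboratory runs.
for label, profile in {"industry": (240, 16, 10), "laboratory": (100, 4, 60)}.items():
    m = mttfd_years(*profile)
    print(f"{label:10s}: MTTFd = {m:7.1f} years -> {band(m)}")
```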

  8. A validated methodology for evaluating burnup credit in spent fuel casks

    International Nuclear Information System (INIS)

    Brady, M.C.; Sanders, T.L.

    1991-01-01

    The concept of allowing reactivity credit for the transmuted state of spent fuel offers both economic and risk incentives. This paper presents a general overview of the technical work being performed in support of the US Department of Energy (DOE) program to resolve issues related to the implementation of burnup credit. An analysis methodology is presented along with information representing the validation of the method against available experimental data. The experimental data that are applicable to burnup credit include chemical assay data for the validation of the isotopic prediction models, fresh fuel critical experiments for the validation of criticality calculations for various cask geometries, and reactor restart critical data to validate criticality calculations with spent fuel. The methodology has been specifically developed to be simple and generally applicable, therefore giving rise to uncertainties or sensitivities which are identified and quantified in terms of a percent bias in k-eff. Implementation issues affecting licensing requirements and operational procedures are discussed briefly

  11. Application of Sensitivity and Uncertainty Analysis Methods to a Validation Study for Weapons-Grade Mixed-Oxide Fuel

    International Nuclear Information System (INIS)

    Dunn, M.E.

    2001-01-01

    At the Oak Ridge National Laboratory (ORNL), sensitivity and uncertainty (S/U) analysis methods and a Generalized Linear Least-Squares Methodology (GLLSM) have been developed to quantitatively determine the similarity, or lack thereof, between critical benchmark experiments and an application of interest. The S/U and GLLSM methods provide a mathematical approach, less judgment-based than traditional validation procedures, to assess system similarity and estimate the calculational bias and uncertainty for an application of interest. The objective of this paper is to gain experience with the S/U and GLLSM methods by revisiting a criticality safety evaluation and its associated traditional validation for the shipment of weapons-grade (WG) MOX fuel in the MO-1 transportation package. In the original validation, critical experiments were selected based on a qualitative assessment of the MO-1 and MOX contents relative to the available experiments. Subsequently, traditional trending analyses were used to estimate the Δk bias and associated uncertainty. In this paper, the S/U and GLLSM procedures are used to re-evaluate the suite of critical experiments associated with the original MO-1 evaluation. Using the S/U procedures developed at ORNL, critical experiments that are similar to the undamaged and damaged MO-1 package are identified based on sensitivity and uncertainty analyses of the critical experiments and the MO-1 package configurations. Based on the trending analyses developed for the S/U and GLLSM procedures, the Δk bias and uncertainty for the most reactive MO-1 package configurations are estimated and used to calculate an upper subcritical limit (USL) for the MO-1 evaluation. The calculated bias and uncertainty from the S/U and GLLSM analyses lead to a calculational USL that supports the original validation study for the MO-1
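
    The abstract does not reproduce the trending calculation, but the general bookkeeping behind turning a validation suite into an upper subcritical limit can be sketched as follows: estimate the calculational bias from benchmark k-eff results, take no credit for a positive bias, then subtract the bias uncertainty and an administrative margin. The Python sketch below uses made-up benchmark values and a simplified (unweighted, untrended) bias treatment; it illustrates only the idea, not the ORNL S/U and GLLSM methodology itself.

```python
import statistics

# Made-up calculated k-eff values for a suite of critical benchmarks (k_exp = 1.0).
k_calc = [0.9968, 0.9981, 0.9975, 1.0002, 0.9959, 0.9988, 0.9971, 0.9993]

bias = statistics.mean(k_calc) - 1.0   # negative bias: the code underpredicts k-eff
bias = min(bias, 0.0)                  # take no credit for a positive bias
sigma = statistics.stdev(k_calc)       # spread of the benchmark results
admin_margin = 0.05                    # typical administrative subcritical margin

# Simplified upper subcritical limit: applications must show k-eff <= USL.
usl = 1.0 + bias - 2.0 * sigma - admin_margin
print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, USL = {usl:.4f}")
```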

  12. Validation of neutron flux redistribution factors in JSI TRIGA reactor due to control rod movements

    International Nuclear Information System (INIS)

    Kaiba, Tanja; Žerovnik, Gašper; Jazbec, Anže; Štancar, Žiga; Barbot, Loïc; Fourmentel, Damien; Snoj, Luka

    2015-01-01

    For efficient utilization of research reactors, such as the TRIGA Mark II reactor in Ljubljana, it is important to know the neutron flux distribution in the reactor as accurately as possible. The focus of this study is on the neutron flux redistributions due to control rod movements. To analyze these redistributions, Monte Carlo calculations of fission rate distributions with the JSI TRIGA reactor model have been performed at different control rod configurations. The sensitivity of the detector response to control rod movement has been studied, and optimal radial and axial positions of the detector have been determined. Measurements of the axial neutron flux distribution using CEA-manufactured fission chambers have been performed. The experiments at different control rod positions were conducted and compared with the MCNP calculations for a fixed detector axial position. In the future, simultaneous on-line measurements with multiple fission chambers will be performed inside the reactor core for a more accurate on-line power monitoring system. - Highlights: • Neutron flux redistribution due to control rod movement in JSI TRIGA has been studied. • Detector response sensitivity to the control rod position has been minimized. • Optimal radial and axial detector positions have been determined
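
    The optimization described — choosing a detector position whose response is least sensitive to rod movement — can be illustrated with a toy calculation: given normalized axial fission-rate profiles for several rod configurations, pick the axial node where the relative spread across configurations is smallest. The profiles below are invented for illustration; the paper's values come from MCNP calculations of the JSI TRIGA model.

```python
import numpy as np

# Rows: control rod configurations, columns: axial nodes (hypothetical, normalized).
profiles = np.array([
    [0.62, 0.81, 0.97, 1.00, 0.95, 0.78, 0.55],   # rods mostly withdrawn
    [0.55, 0.74, 0.93, 1.00, 0.99, 0.84, 0.60],   # rods half inserted
    [0.48, 0.68, 0.88, 0.98, 1.00, 0.90, 0.66],   # rods mostly inserted
])

# Relative spread of the detector response at each axial node across configurations.
spread = (profiles.max(axis=0) - profiles.min(axis=0)) / profiles.mean(axis=0)

best_node = int(np.argmin(spread))
print("relative spread per node:", np.round(spread, 3))
print("least sensitive axial node index:", best_node)
```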

  13. Recommendations for elaboration, transcultural adaptation and validation process of tests in Speech, Hearing and Language Pathology.

    Science.gov (United States)

    Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de

    2017-06-08

    To present a guide with recommendations for the translation, adaptation, elaboration and validation of tests in Speech and Language Pathology. The recommendations were based on international guidelines with a focus on the elaboration, translation, cross-cultural adaptation and validation of tests. The recommendations were grouped into two charts, one with procedures for translation and transcultural adaptation and the other with procedures for obtaining evidence of validity, reliability and measures of accuracy of the tests. The result is a guide with norms for the organization and systematization of the process of elaborating, translating, cross-culturally adapting and validating tests in Speech and Language Pathology.

  14. Effectiveness of internet-based affect induction procedures: A systematic review and meta-analysis.

    Science.gov (United States)

    Ferrer, Rebecca A; Grenen, Emily G; Taber, Jennifer M

    2015-12-01

    Procedures used to induce affect in a laboratory are effective and well-validated. Given recent methodological and technological advances in Internet research, it is important to determine whether affect can be effectively induced using Internet methodology. We conducted a meta-analysis and systematic review of prior research that has used Internet-based affect induction procedures, and examined potential moderators of the effectiveness of affect induction procedures. Twenty-six studies were included in final analyses, with 89 independent effect sizes. Affect induction procedures effectively induced general positive affect, general negative affect, fear, disgust, anger, sadness, and guilt, but did not significantly induce happiness. Contamination of other nontarget affect did not appear to be a major concern. Video inductions resulted in greater effect sizes. Overall, results indicate that affect can be effectively induced in Internet studies, suggesting an important venue for the acceleration of affective science. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
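
    As a rough illustration of how such a meta-analysis pools effect sizes across independent studies, the sketch below computes a DerSimonian-Laird random-effects estimate from a handful of invented study effect sizes and sampling variances; these data and this simple model are illustrative only, not the dataset or the exact model used in the paper.

```python
import numpy as np

# Invented per-study standardized effect sizes and their sampling variances.
effects = np.array([0.45, 0.62, 0.30, 0.55, 0.20])
variances = np.array([0.020, 0.035, 0.015, 0.050, 0.025])

# DerSimonian-Laird random-effects pooling.
w_fixed = 1.0 / variances
mean_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
q = np.sum(w_fixed * (effects - mean_fixed) ** 2)              # heterogeneity statistic
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)                  # between-study variance

w_random = 1.0 / (variances + tau2)
pooled = np.sum(w_random * effects) / np.sum(w_random)
se = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled effect = {pooled:.3f} (95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```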

  15. Peroral endoscopic myotomy: An emerging minimally invasive procedure for achalasia

    Science.gov (United States)

    Vigneswaran, Yalini; Ujiki, Michael B

    2015-01-01

    Peroral endoscopic myotomy (POEM) is an emerging minimally invasive procedure for the treatment of achalasia. Owing to improvements in endoscopic technology and techniques, this procedure uses submucosal tunneling to safely create an endoscopic myotomy across the hypertensive lower esophageal sphincter. In the hands of skilled operators and experienced centers, the most common complications of this procedure are related to insufflation and accumulation of gas in the chest and abdominal cavities, with relatively low risks of devastating complications such as perforation or delayed bleeding. Several centers worldwide have demonstrated the feasibility of this procedure not only in early achalasia but also for other indications such as redo myotomy, sigmoid esophagus and spastic esophagus. Short-term outcomes have shown great clinical efficacy comparable to laparoscopic Heller myotomy (LHM). Concerns related to postoperative gastroesophageal reflux remain; however, several groups have demonstrated clinical and objective measures of reflux comparable to those after LHM. Although long-term outcomes are necessary to better understand the durability of the procedure, POEM appears to be a promising new procedure. PMID:26468336

  16. Validation of the MODIS Collection 6 MCD64 Global Burned Area Product

    Science.gov (United States)

    Boschetti, L.; Roy, D. P.; Giglio, L.; Stehman, S. V.; Humber, M. L.; Sathyachandran, S. K.; Zubkova, M.; Melchiorre, A.; Huang, H.; Huo, L. Z.

    2017-12-01

    The research, policy and management applications of satellite products place a high priority on rigorously assessing their accuracy. A number of NASA, ESA and EU funded global and continental burned area products have been developed using coarse spatial resolution satellite data, and have the potential to become part of a long-term fire Essential Climate Variable. These products have usually been validated by comparison with reference burned area maps derived by visual interpretation of Landsat or similar spatial resolution data selected on an ad hoc basis. More optimally, a design-based validation method should be adopted, characterized by the selection of reference data via probability sampling. Design-based techniques have been used for annual land cover and land cover change product validation, but have not been widely used for burned area products, or for other products that are highly variable in time and space (e.g. snow, floods, other non-permanent phenomena). This has been due to the challenge of designing an appropriate sampling strategy, and to the cost and limited availability of independent reference data. This paper describes the validation procedure adopted for the latest Collection 6 version of the MODIS Global Burned Area product (MCD64, Giglio et al, 2009). We used a tri-dimensional sampling grid that allows for probability sampling of Landsat data in time and in space (Boschetti et al, 2016). To sample the globe in the spatial domain with non-overlapping sampling units, the Thiessen Scene Area (TSA) tessellation of the Landsat WRS path/rows is used. The TSA grid is then combined with the 16-day Landsat acquisition calendar to provide tri-dimensional elements (voxels). This allows the implementation of a sampling design where not only the location but also the time interval of the reference data is explicitly drawn through stratified random sampling. The novel sampling approach was used for the selection of a reference dataset consisting of 700
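
    The core of the design — drawing both the location and the 16-day time interval of each reference unit by stratified random sampling over space-time voxels — can be sketched as follows. The strata, counts and identifiers below are hypothetical; the actual design uses the Thiessen Scene Area tessellation of the Landsat WRS grid and the Landsat acquisition calendar described in Boschetti et al. (2016).

```python
import random

random.seed(42)  # reproducible draw

# Hypothetical population of space-time voxels: (spatial cell id, 16-day period index),
# each assigned to a stratum (e.g. by region and expected fire activity).
voxels = [(f"cell_{c:03d}", t) for c in range(200) for t in range(23)]
strata = {v: random.choice(["high_fire", "low_fire", "no_fire"]) for v in voxels}

# Stratified random sample: a fixed number of voxels drawn from each stratum.
sample_size = {"high_fire": 12, "low_fire": 8, "no_fire": 4}
sample = []
for stratum, n in sample_size.items():
    members = [v for v, s in strata.items() if s == stratum]
    sample.extend(random.sample(members, n))

for cell, period in sorted(sample)[:5]:
    print(f"collect reference burned-area map for {cell}, 16-day period {period}")
```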

  17. Is there inter-procedural transfer of skills in intraocular surgery?

    DEFF Research Database (Denmark)

    Thomsen, Ann Sofia Skou; Kiilgaard, Jens Folke; la Cour, Morten

    2017-01-01

    surgery on a virtual-reality simulator until passing a test with predefined validity evidence (cataract trainees) or to (2) no cataract surgery training (novices). Possible skill transfer was assessed using a test consisting of all 11 vitreoretinal modules on the EyeSi virtual-reality simulator. All ... participants repeated the test of vitreoretinal surgical skills until their performance curve plateaued. Three experienced vitreoretinal surgeons also performed the test to establish validity evidence. Analysis with independent samples t-tests was performed. RESULTS: The vitreoretinal test on the EyeSi ..., p = 0.265), or maximum scores (785 ± 162 points versus 805 ± 73 points, p = 0.791). CONCLUSION: Pretraining in cataract surgery did not demonstrate any measurable effect on vitreoretinal procedural performance. The results of this study indicate that we should not anticipate extensive transfer...

  18. Towards criterion validity in classroom language analysis: methodological constraints of metadiscourse and inter-rater agreement

    Directory of Open Access Journals (Sweden)

    Douglas Altamiro Consolo

    2001-02-01

    Full Text Available

    This paper reports on a process to validate a revised version of a system for coding classroom discourse in foreign language lessons, a context in which the dual role of language (as content and as means of communication) and the speakers' specific pedagogical aims lead to a certain degree of ambiguity in language analysis. The language used by teachers and students has been extensively studied, and a framework of concepts concerning classroom discourse is well established. Models for coding classroom language need, however, to be revised when they are applied to specific research contexts. The application and revision of an initial framework can lead to the further development of earlier models and to the re-definition of previously established categories of analysis, which then have to be validated. The procedures followed to validate a coding system are reported here as guidelines for conducting research under similar circumstances. The advantages of using instruments that incorporate two types of data, that is, quantitative measures and qualitative information from raters' metadiscourse, are discussed, and it is suggested that such a procedure can contribute to the validation process itself, towards attaining reliable research results, as well as indicating some constraints of the adopted research methodology.
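
    Since the validation relies on agreement between raters applying the coding categories, a standard quantitative measure for that step is Cohen's kappa, which corrects raw agreement for agreement expected by chance. The sketch below computes kappa for two raters over invented category codes; it is only an illustration of the kind of inter-rater statistic such a procedure might use, not the measure reported in the paper.

```python
from collections import Counter

# Invented category codes assigned by two raters to the same classroom utterances.
rater_a = ["question", "feedback", "explain", "question", "manage", "explain", "feedback", "question"]
rater_b = ["question", "feedback", "explain", "feedback", "manage", "explain", "feedback", "question"]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement from each rater's marginal category frequencies.
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n**2

kappa = (observed - expected) / (1 - expected)
print(f"observed = {observed:.2f}, expected = {expected:.2f}, kappa = {kappa:.2f}")
```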

  19. Procedural Reform and the Reduction of Discretion: The Case of the Juvenile Court.

    Science.gov (United States)

    Sosin, Michael

    The issue of controlling discretion in large public institutions is a crucial one in modern society, and procedural legal reforms are often viewed as one tactic of control. Using due process guarantees in juvenile courts as the substantive issue, this paper tests the utility of procedural reform in reducing discretion. Results indicate that…

  20. The development of a self-administered dementia checklist: the examination of concurrent validity and discriminant validity.

    Science.gov (United States)

    Miyamae, Fumiko; Ura, Chiaki; Sakuma, Naoko; Niikawa, Hirotoshi; Inagaki, Hiroki; Ijuin, Mutsuo; Okamura, Tsuyoshi; Sugiyama, Mika; Awata, Shuichi

    2016-01-01

    The present study aims to develop a self-administered dementia checklist to enable community-residing older adults to recognize their own declining functions and start using necessary services. A previous study confirmed the factorial validity and internal reliability of the checklist. The present study examined its concurrent validity and discriminant validity. The authors conducted a 3-step study (a self-administered survey including the checklist, interviews by nurses, and interviews by doctors and psychologists) of 7,682 community-residing individuals who were over 65 years of age. The authors calculated Spearman's correlation coefficients between the checklist scores and the results of psychological tests to examine concurrent validity. They also compared the average total scores of the checklist between groups with different Clinical Dementia Rating (CDR) scores to examine discriminant validity, and conducted a receiver operating characteristic analysis to examine the discriminative power for dementia. The authors analyzed the data of 131 respondents who completed all 3 steps. The checklist scores were significantly correlated with the respondents' Mini-Mental State Examination and Frontal Assessment Battery scores. The checklist also significantly discriminated the patients with dementia (CDR = 1+) from those without dementia (CDR = 0 or 0.5). The optimal cut-off point for the two groups was 17/18 (sensitivity, 72.0%; specificity, 69.2%; positive predictive value, 69.2%; negative predictive value, 72.0%). This study confirmed the concurrent validity and discriminant validity of the self-administered dementia checklist. However, due to its insufficient discriminative power as a screening tool for older people with declining cognitive functions, the checklist is only recommended as an educational and public awareness tool.
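
    The reported cut-off of 17/18 with its sensitivity, specificity, PPV and NPV comes from a receiver operating characteristic analysis. The sketch below shows, on invented checklist scores and CDR-based groups, how such a cut-off can be chosen by maximizing the Youden index and how the four accuracy measures are then computed; the data and resulting threshold here are illustrative only, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented total checklist scores (higher = more reported difficulty) for two groups.
scores_dementia = rng.normal(22, 5, 50)   # respondents with CDR >= 1
scores_control = rng.normal(14, 5, 80)    # respondents with CDR 0 or 0.5

def sens_spec(cutoff):
    """Sensitivity and specificity when scores >= cutoff are classified as dementia."""
    return np.mean(scores_dementia >= cutoff), np.mean(scores_control < cutoff)

# Choose the cut-off that maximizes the Youden index (sensitivity + specificity - 1).
cutoffs = np.arange(5, 35)
best = max(cutoffs, key=lambda c: sum(sens_spec(c)) - 1)
sens, spec = sens_spec(best)

tp = np.sum(scores_dementia >= best)   # true positives
fp = np.sum(scores_control >= best)    # false positives
tn = np.sum(scores_control < best)     # true negatives
fn = np.sum(scores_dementia < best)    # false negatives
ppv, npv = tp / (tp + fp), tn / (tn + fn)
print(f"cut-off {best}: sensitivity {sens:.2f}, specificity {spec:.2f}, PPV {ppv:.2f}, NPV {npv:.2f}")
```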