WorldWideScience

Sample records for validation experiment cove

  1. SIZE, STRUCTURE AND FUNCTIONALITY IN SHALLOW COVE COMMUNITIES IN RI

    Science.gov (United States)

    We are using an ecosystem approach to examine the ecological integrity and important habitats in small estuarine coves. We sampled the small undeveloped Coggeshall Cove during the summer of 1999. The cove was sampled at high tide at every 15 cm of substrate elevation along trans...

  2. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H Oh; Eung S Kim

    2011-09-01

    Idaho National Laboratory carried out air ingress experiments as part of validating computational fluid dynamics (CFD) calculations. An isothermal test loop was designed and set up to study the stratified-flow phenomenon, which governs the initial inflow of air into the lower plenum of the very high temperature gas-cooled reactor (VHTR) when a large-break loss-of-coolant accident occurs. The work focused on the flow characteristics unique to the VHTR air-ingress accident, in particular flow visualization of the stratified flow in the inlet pipe to the vessel lower plenum of General Atomics' Gas Turbine-Modular Helium Reactor (GT-MHR). Brine and sucrose solutions were used as the heavy fluids, and water was used as the light fluid, mimicking the countercurrent flow driven by the density difference between the simulant fluids. The density ratios were varied between 0.87 and 0.98. The experiment clearly showed that a stratified flow between the simulant fluids was established even for very small density differences. The CFD calculations were compared with experimental data, and a grid sensitivity study of the CFD models was performed using Richardson extrapolation and the grid convergence index method to assess the numerical accuracy of the CFD calculations. The calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for predicting density-gradient stratified-flow phenomena in the air-ingress accident.
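The Richardson extrapolation and grid convergence index (GCI) procedure mentioned in this abstract can be sketched in a few lines. This is a minimal illustration of Roache's GCI, not the INL study's actual implementation; the three current-speed values, the refinement ratio, and the safety factor are hypothetical.

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy from solutions on fine (f1), medium (f2),
    and coarse (f3) grids with a constant refinement ratio r."""
    return math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)

def grid_convergence_index(f_coarse, f_fine, r, p, Fs=1.25):
    """Roache's GCI: a conservative relative-error band on the fine-grid
    solution, with safety factor Fs."""
    eps = abs((f_coarse - f_fine) / f_fine)
    return Fs * eps / (r**p - 1)

# Hypothetical stratified-flow current speeds (m/s) on three grids, r = 2
f1, f2, f3 = 0.1050, 0.1042, 0.1010
p = observed_order(f1, f2, f3, r=2.0)            # observed order of accuracy
gci = grid_convergence_index(f2, f1, r=2.0, p=p)  # fractional uncertainty band
```

With these made-up values the observed order comes out to 2, i.e. the solutions sit in the asymptotic convergence range that Richardson extrapolation assumes.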

  3. COVE 2A Benchmarking calculations using NORIA

    International Nuclear Information System (INIS)

    Carrigan, C.R.; Bixler, N.E.; Hopkins, P.L.; Eaton, R.R.

    1991-10-01

    Six steady-state and six transient benchmarking calculations have been performed, using the finite element code NORIA, to simulate one-dimensional infiltration into Yucca Mountain. These calculations were made to support the code verification (COVE 2A) activity for the Yucca Mountain Site Characterization Project. COVE 2A evaluates the usefulness of numerical codes for analyzing the hydrology of the potential Yucca Mountain site. Numerical solutions for all cases were found to be stable. As expected, the difficulties and computer-time requirements associated with obtaining solutions increased with infiltration rate. 10 refs., 128 figs., 5 tabs

  4. CFD validation experiments for hypersonic flows

    Science.gov (United States)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments could provide new validation data.

  5. Validity - a matter of resonant experience

    DEFF Research Database (Denmark)

    Revsbæk, Line

    This paper is about doing interview analysis drawing on the researcher's own lived experience concerning the question of inquiry. The paper exemplifies analyzing case study participants' experience from the resonant experience of the researcher's own life evoked while listening to recorded interview… across the researcher's past experience from the case study and her own life. The autobiographic way of analyzing conventional interview material is exemplified with a case of a junior researcher researching the newcomer innovation of others, drawing on her own experience of being a newcomer in work community… entry processes. The validity of doing interview analysis drawing on the resonant experience of the researcher is argued from a pragmatist perspective…

  6. Advancing the discussion about systematic classroom behavioral observation, a product review of Tenny, J. (2010). eCOVE observation software. Pacific City, OR: eCOVE Software, LLC.

    Science.gov (United States)

    Froiland, John Mark; Smith, Liana

    2014-05-01

    Applied child psychologists and behavioral consultants often use systematic behavioral observations to inform the psychological assessment and intervention development process for children referred for attention and hyperactivity problems. This article provides a review of the 2010 version of the eCOVE classroom observation software in terms of its utility in tracking the progress of children with attention and hyperactive behaviors and its use in evaluating teacher behaviors that may impede or promote children's attention and positive behavior. The eCOVE shows promise as an efficient tool for psychologists and behavioral consultants who want to evaluate the effects of interventions for children with symptoms of ADHD, ODD, mood disorders and learning disorders; however, some research-based improvements for future models are suggested. The reviewers also share their firsthand experience in using eCOVE to evaluate teacher and student behavior exhibited on a television show about teaching urban high school students and during a movie about an eccentric new kindergarten teacher. Rich examples are provided of using strategic behavioral observations to reveal how to improve the classroom environment so as to facilitate attention, motivation and positive behavior among youth. Broader implications for enhancing the use of systematic behavioral observations in the assessment of children and adolescents with attention disorders and related behavioral problems are discussed. Key issues are examined such as the use of behavioral observations during psychological consultation to prevent the previously found gender bias in referrals for ADHD. Using behavioral observations to enhance differential diagnosis is also discussed.

  7. Half Moon Cove Tidal Project. Feasibility report

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    The proposed Half Moon Cove Tidal Power Project would be located in a small cove in the northern part of Cobscook Bay in the vicinity of Eastport, Maine. The project would be the first tidal electric power generating plant in the United States of America. The basin impounded by the barrier, when full, will approximate 1.2 square miles. The average tidal range at Eastport is 18.2 feet; the maximum spring tidal range will be 26.2 feet and the neap tidal range 12.8 feet. The project will be of the single-pool, single-effect type, in which generation takes place on the ebb tide only. Utilizing an average mean tidal range of 18.2 feet, this mode of operation enables generation for approximately ten and one-half (10-1/2) hours per day, or slightly in excess of five (5) hours per tide. The installed capacity will be 12 MW, utilizing two 6 MW units. An axial-flow, or bulb, type of turbine was selected for this study.
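The generation figures quoted in this abstract imply a simple capacity bound that can be checked directly. This is a back-of-envelope sketch from the numbers above only (12 MW installed, ~10.5 generating hours per day on the ebb tide); the annual figure is an upper bound at full output, not a production estimate from the feasibility report.

```python
installed_mw = 12.0           # two 6 MW units, per the report
gen_hours_per_day = 10.5      # ebb-tide-only generation, ~5.25 h per tide

# Fraction of the day the plant can generate at all
capacity_factor_ceiling = gen_hours_per_day / 24.0

# Annual energy ceiling if every generating hour ran at full output (MWh)
annual_mwh_ceiling = installed_mw * gen_hours_per_day * 365
```

Even this ceiling is under 44% of a continuously running 12 MW plant, which is the basic economic trade-off of single-effect ebb-tide generation.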

  8. Arena Cove, California Tsunami Forecast Grids for MOST Model

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Arena Cove, California Forecast Grids provides bathymetric data strictly for tsunami inundation modeling with the Method of Splitting Tsunami (MOST) model. MOST...

  9. Selected Hydrologic Data for Sand Cove Wash, Washington County, Utah

    National Research Council Canada - National Science Library

    Norton, Aaron; Susong, David D

    2004-01-01

    .... Hydrologic data collected in this study are described and listed in this report. Six boreholes were drilled in Sand Cove Wash to determine the vertical and spatial distribution of the alluvial deposits and their hydrologic...

  10. Elfin Cove, Alaska Tsunami Forecast Grids for MOST Model

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Elfin Cove, Alaska Forecast Grids provides bathymetric data strictly for tsunami inundation modeling with the Method of Splitting Tsunami (MOST) model. MOST is a...

  11. The difference between traditional experiments and CFD validation benchmark experiments

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L., E-mail: barton.smith@usu.edu

    2017-02-15

    Computational Fluid Dynamics provides attractive features for design, and perhaps licensing, of nuclear power plants. The most important of these features is low cost compared to experiments. However, uncertainty of CFD calculations must accompany these calculations in order for the results to be useful for important decision making. In order to properly assess the uncertainty of a CFD calculation, it must be “validated” against experimental data. Unfortunately, traditional “discovery” experiments are normally ill-suited to provide all of the information necessary for the validation exercise. Traditionally, experiments are performed to discover new physics, determine model parameters, or to test designs. This article will describe a new type of experiment; one that is designed and carried out with the specific purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. We will demonstrate that the goals of traditional experiments and validation experiments are often in conflict, making use of traditional experimental results problematic and leading directly to larger predictive uncertainty of the CFD model.

  12. The difference between traditional experiments and CFD validation benchmark experiments

    International Nuclear Information System (INIS)

    Smith, Barton L.

    2017-01-01

    Computational Fluid Dynamics provides attractive features for design, and perhaps licensing, of nuclear power plants. The most important of these features is low cost compared to experiments. However, uncertainty of CFD calculations must accompany these calculations in order for the results to be useful for important decision making. In order to properly assess the uncertainty of a CFD calculation, it must be “validated” against experimental data. Unfortunately, traditional “discovery” experiments are normally ill-suited to provide all of the information necessary for the validation exercise. Traditionally, experiments are performed to discover new physics, determine model parameters, or to test designs. This article will describe a new type of experiment; one that is designed and carried out with the specific purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. We will demonstrate that the goals of traditional experiments and validation experiments are often in conflict, making use of traditional experimental results problematic and leading directly to larger predictive uncertainty of the CFD model.

  13. 76 FR 35886 - Orange Cove Irrigation District, and Friant Power Authority; Notice of Availability of...

    Science.gov (United States)

    2011-06-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 11068-014--California] Orange Cove Irrigation District, and Friant Power Authority; Notice of Availability of Environmental... has prepared an Environmental Assessment (EA) regarding Orange Cove Irrigation District's and Friant...

  14. ATHLET validation using accident management experiments

    Energy Technology Data Exchange (ETDEWEB)

    Teschendorff, V.; Glaeser, H.; Steinhoff, F. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching (Germany)

    1995-09-01

    The computer code ATHLET is being developed as an advanced best-estimate code for the simulation of leaks and transients in PWRs and BWRs including beyond design basis accidents. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialisation by a steady-state calculation, full-range drift-flux model, and dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The systematic validation of ATHLET is based on a well balanced set of integral and separate effect tests derived from the CSNI proposal emphasising, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities. PKL-III test B 2.1 simulates a cool-down procedure during an emergency power case with three steam generators isolated. Natural circulation under these conditions was investigated in detail in a pressure range of 4 to 2 MPa. The transient was calculated over 22000 s with complicated boundary conditions including manual control actions. The calculations demonstrated the capability to model the following processes successfully: (1) variation of the natural circulation caused by steam generator isolation, (2) vapour formation in the U-tubes of the isolated steam generators, (3) break-down of circulation in the loop containing the isolated steam generator following controlled cool-down of the secondary side, (4) accumulation of vapour in the pressure vessel dome. One conclusion with respect to the suitability of experiments simulating AM procedures for code validation purposes is that complete documentation of control actions during the experiment must be available. Special attention should be given to the documentation of operator actions in the course of the experiment.

  15. Benchmarking NNWSI flow and transport codes: COVE 1 results

    International Nuclear Information System (INIS)

    Hayden, N.K.

    1985-06-01

    The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of the codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs

  16. The geology of Burnsville Cove, Bath and Highland Counties, Virginia

    Science.gov (United States)

    Swezey, Christopher; Haynes, John T.; Lambert, Richard A.; White, William B.; Lucas, Philip C.; Garrity, Christopher P.

    2015-01-01

    Burnsville Cove is a karst region in Bath and Highland Counties of Virginia. A new geologic map of the area reveals various units of limestone, sandstone, and siliciclastic mudstone (shale) of Silurian through Devonian age, as well as structural features such as northeast-trending anticlines and synclines, minor thrust faults, and prominent joints. Quaternary features include erosional (strath) terraces and accumulations of mud, sand, and gravel. The caves of Burnsville Cove are located within predominantly carbonate strata above the Silurian Williamsport Sandstone and below the Devonian Oriskany Sandstone. Most of the caves are located within the Silurian Tonoloway Limestone, rather than the Silurian-Devonian Keyser Limestone as reported previously.

  17. Autonomous Slat-Cove-Filler Device for Reduction of Aeroacoustic Noise Associated with Aircraft Systems

    Science.gov (United States)

    Turner, Travis L. (Inventor); Kidd, Reggie T. (Inventor); Lockard, David P. (Inventor); Khorrami, Mehdi R. (Inventor); Streett, Craig L. (Inventor); Weber, Douglas Leo (Inventor)

    2016-01-01

    A slat cove filler is utilized to reduce airframe noise resulting from deployment of a leading edge slat of an aircraft wing. The slat cove filler is preferably made of a super elastic shape memory alloy, and the slat cove filler shifts between stowed and deployed shapes as the slat is deployed. The slat cove filler may be configured such that a separate powered actuator is not required to change the shape of the slat cove filler from its deployed shape to its stowed shape and vice-versa. The outer contour of the slat cove filler preferably follows a profile designed to maintain accelerating flow in the gap between the slat cove filler and wing leading edge to provide for noise reduction.

  18. Patient Experiences with the Preoperative Assessment Clinic (PEPAC): validation of an instrument to measure patient experiences

    NARCIS (Netherlands)

    Edward, G. M.; Lemaire, L. C.; Preckel, B.; Oort, F. J.; Bucx, M. J. L.; Hollmann, M. W.; de Haes, J. C. J. M.

    2007-01-01

    Background. Presently, no comprehensive and validated questionnaire to measure patient experiences of the preoperative assessment clinic (PAC) is available. We developed and validated the Patient Experiences with the Preoperative Assessment Clinic (PEPAC) questionnaire, which can be used for

  19. Modern sedimentation patterns in Potter Cove, King George Island, Antarctica

    Science.gov (United States)

    Hass, H. Christian; Kuhn, Gerhard; Wölfl, Anne-Cathrin; Wittenberg, Nina; Betzler, Christian

    2013-04-01

    IMCOAST, among a number of other initiatives, investigates the modern and late Holocene environmental development of southern King George Island, with a strong emphasis on Maxwell Bay and its tributary fjord Potter Cove (maximum water depth about 200 m). In this part of the project we aim at reconstructing the modern sediment distribution in the inner part of Potter Cove using an acoustic ground discrimination system (RoxAnn) and more than 136 ground-truth samples. Over the past 20 years the air temperatures in the immediate working area increased by more than 0.6 K (Schloss et al. 2012), which is less than in other parts of the West Antarctic Peninsula (WAP) but still in the range of the recovery of temperatures from the Little Ice Age maximum to the beginning of the 20th century. Potter Cove is a small fjord characterized by a series of moraine ridges produced by a tidewater glacier (Fourcade Glacier). Presumably, the farthest moraine is not much older than about 500 years (LIA maximum); hence the sediment cover is rather thin, as evidenced by high-resolution seismic data. Within the last few years at least the greater part of the tidewater glacier has retreated onto the island's mainland. It is suggested that such a fundamental change in the fjord's physiography has also changed sedimentation patterns in the area. Potter Cove is characterized by silty-clayey sediments in the deeper inner parts of the cove. Sediments are coarser (fine to coarse sands and boulders) in the shallower areas; they also coarsen from the innermost basin to the mouth of the fjord. Textural structures follow the seabed morphology, i.e. small v-shaped passages through the moraine ridges. The glacier still produces large amounts of turbid melt waters that enter the cove at various places. We presume that very fine-grained sediments fall out from the meltwater plumes and are distributed by mid-depth or even bottom currents, thus suggesting an anti-estuarine circulation pattern.
Older sediments that are

  20. Excavations at Cook's Cove, Tolaga Bay, New Zealand

    International Nuclear Information System (INIS)

    Walter, R.; Jacomb, C.; Brooks, E.

    2011-01-01

    The Cook's Cove site (Z17/311) on the East Coast of the North Island of New Zealand is an unusual example of an archaeological site spanning close to the full duration of the New Zealand prehistoric sequence. In addition to a record of Polynesian activities, the site is also well known as the type site for the North Island Holocene stratigraphy. Recent excavations at Cook's Cove have resulted in a reinterpretation of the nature of Polynesian occupation and adaptation in this part of the North Island. The application of an 'event phase' interpretative approach provides the means for reconstructing a detailed history of environmental processes and their relationships to cultural activities over a period of 700 years. (author). 61 refs., 17 figs., 13 tabs.

  1. Validation of wind loading codes by experiments

    NARCIS (Netherlands)

    Geurts, C.P.W.

    1998-01-01

    Between 1994 and 1997, full scale measurements of the wind and wind induced pressures were carried out on the main building of Eindhoven University of Technology. Simultaneously, a comparative wind tunnel experiment was performed in an atmospheric boundary layer wind tunnel. In this paper, the

  2. [Validity of psychoprophylaxis in obstetrics. Authors' experience].

    Science.gov (United States)

    D'Alfonso, A; Zaurito, V; Facchini, D; Di Stefano, L; Patacchiola, F; Cappa, F

    1990-12-01

    The authors report results based on 20 years of practice of obstetric psychoprophylaxis (PPO). Data on course attendance, frequency, the primipara/pluripara ratio, labour, and the timing and mode of delivery are assembled. Moreover, neonatal status at birth and at the 10th day of life is investigated. The data obtained were compared with a control group of women who received no treatment before delivery. The acquired experience confirms the utility of PPO in ordinary clinical practice.

  3. CFD validation experiments at the Lockheed-Georgia Company

    Science.gov (United States)

    Malone, John B.; Thomas, Andrew S. W.

    1987-01-01

    Information is given in viewgraph form on computational fluid dynamics (CFD) validation experiments at the Lockheed-Georgia Company. Topics covered include validation experiments on a generic fighter configuration, a transport configuration, and a generic hypersonic vehicle configuration; computational procedures; surface and pressure measurements on wings; laser velocimeter measurements of a multi-element airfoil system; the flowfield around a stiffened airfoil; laser velocimeter surveys of a circulation control wing; circulation control for high lift; and high angle of attack aerodynamic evaluations.

  4. 77 FR 33446 - Jordan Cove Energy Project, L.P.; Application for Long-Term Authorization to Export Liquefied...

    Science.gov (United States)

    2012-06-06

    ....\\2\\ \\2\\ Jordan Cove states that under the LTA business model, the decision whether to utilize... that presumption ``by making an affirmative showing of inconsistency with the public interest.'' \\6\\ \\5... Cove is highlighted in Jordan Cove's application. Based on the reasoning provided in the Application...

  5. COVE: a visual environment for ocean observatory design

    International Nuclear Information System (INIS)

    Grochow, K; Lazowska, E; Stoermer, M; Kelley, D; Delaney, J

    2008-01-01

    Physical, chemical, and biological ocean processes play a crucial role in determining Earth's environment. Unfortunately, our knowledge of these processes is limited because oceanography is carried out today largely the way it was a century ago: as expeditionary science, going to sea in ships and measuring a relatively small number of parameters (e.g., temperature, salinity, and pressure) as time and budget allow. The NSF Ocean Observatories Initiative is a US$330 million project that will help transform oceanography from a data-poor to a data-rich science. A cornerstone of this project is the deep water Regional Scale Nodes (RSN) that will be installed off the coasts of Washington and Oregon. The RSN will include 1500 km of fiber optic cable providing power and bandwidth to the seafloor and throughout the water column. Thousands of sensors will be deployed to stream data and imagery to shore, where they will be available in real time for ocean scientists and the public at large. The design of the RSN is a complex undertaking, requiring a combination of many different interactive tools and areas of visualization: geographic visualization to see the available seafloor bathymetry, scientific visualization to examine existing geospatially located datasets, layout tools to place the sensors, and collaborative tools to communicate across the team during the design. COVE, the Common Observatory Visualization Environment, is a visualization environment designed to meet all these needs. COVE has been built by computer scientists working closely with the engineering and scientific teams who will build and use the RSN. This paper discusses the data and activities of cabled observatory design, the design of COVE, and results from its use across the team.

  6. Unsteady characteristics of a slat-cove flow field

    Science.gov (United States)

    Pascioni, Kyle A.; Cattafesta, Louis N.

    2018-03-01

    The leading-edge slat of a multielement wing is a significant contributor to the acoustic signature of an aircraft during the approach phase of the flight path. An experimental study of the two-dimensional 30P30N geometry is undertaken to further understand the flow physics and specific noise source mechanisms. The mean statistics from particle image velocimetry (PIV) show the differences in the flow field with angle of attack, including the interaction between the cove and trailing-edge flow. Phase-locked PIV successfully links narrow-band peaks found in the surface pressure spectrum to shear layer instabilities and also reveals that a bulk cove oscillation exists at a slat-chord-based Strouhal number of 0.15, indicative of shear layer flapping. Unsteady surface pressure measurements are documented and used to estimate spanwise coherence length scales. A narrow-band frequency prediction scheme is also tested and found to agree well with the data. Furthermore, higher-order spectral analysis suggests that nonlinear effects cause additional peaks to arise in the power spectrum, particularly at low angles of attack.
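A chord-based Strouhal number like the 0.15 reported above converts to a dimensional frequency via f = St · U / c. The sketch below is purely illustrative; the freestream velocity and slat chord are hypothetical values, not conditions from this experiment.

```python
def shedding_frequency(strouhal, velocity, chord):
    """Dimensional frequency (Hz) from a chord-based Strouhal number,
    f = St * U / c, with U in m/s and c in m."""
    return strouhal * velocity / chord

# Hypothetical approach-phase conditions: U = 58 m/s, slat chord c = 0.09 m
f = shedding_frequency(0.15, 58.0, 0.09)
```

This kind of conversion is why a bulk oscillation at fixed Strouhal number appears at different absolute frequencies as flight speed or model scale changes.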

  7. Reconceptualising the external validity of discrete choice experiments.

    Science.gov (United States)

    Lancsar, Emily; Swait, Joffre

    2014-10-01

    External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.

  8. SAS validation and analysis of in-pile TUCOP experiments

    International Nuclear Information System (INIS)

    Morman, J.A.; Tentner, A.M.; Dever, D.J.

    1985-01-01

    The validation of the SAS4A accident analysis code centers on its capability to calculate the wide range of tests performed in the TREAT (Transient Reactor Test Facility) in-pile experiments program. This paper presents the SAS4A analysis of a simulated TUCOP (Transient-Under-Cooled-Over-Power) experiment using seven full-length PFR mixed oxide fuel pins in a flowing sodium loop. Calculations agree well with measured thermal-hydraulic conditions, pin failure times, and post-failure fuel motion data. The extent of the agreement confirms the validity of the models used in the SAS4A code to describe TUCOP accidents

  9. Construction and Initial Validation of the Multiracial Experiences Measure (MEM)

    Science.gov (United States)

    Yoo, Hyung Chol; Jackson, Kelly; Guevarra, Rudy P.; Miller, Matthew J.; Harrington, Blair

    2015-01-01

    This article describes the development and validation of the Multiracial Experiences Measure (MEM): a new measure that assesses uniquely racialized risks and resiliencies experienced by individuals of mixed racial heritage. Across two studies, there was evidence for the validation of the 25-item MEM with 5 subscales including Shifting Expressions, Perceived Racial Ambiguity, Creating Third Space, Multicultural Engagement, and Multiracial Discrimination. The 5-subscale structure of the MEM was supported by a combination of exploratory and confirmatory factor analyses. Evidence of criterion-related validity was partially supported with MEM subscales correlating with measures of racial diversity in one’s social network, color-blind racial attitude, psychological distress, and identity conflict. Evidence of discriminant validity was supported with MEM subscales not correlating with impression management. Implications for future research and suggestions for utilization of the MEM in clinical practice with multiracial adults are discussed. PMID:26460977

  10. The Mistra experiment for field containment code validation first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation, and the test program are presented. Then the first experimental results, from tests of helium injection into the containment, and the corresponding calculations are detailed. (author)

  11. Validation of KENO V.a: Comparison with critical experiments

    International Nuclear Information System (INIS)

    Jordan, W.C.; Landers, N.F.; Petrie, L.M.

    1986-12-01

    Section 1 of this report documents the validation of KENO V.a against 258 critical experiments. Experiments considered were primarily high or low enriched uranium systems. The results indicate that the KENO V.a Monte Carlo Criticality Program accurately calculates a broad range of critical experiments. A substantial number of the calculations showed a positive or negative bias in excess of 1 1/2% in k-effective (keff). Classes of criticals which show a bias include 3% enriched green blocks, highly enriched uranyl fluoride slab arrays, and highly enriched uranyl nitrate arrays. If these biases are properly taken into account, the KENO V.a code can be used with confidence for the design and criticality safety analysis of uranium-containing systems. Section 2 of this report documents the results of an investigation into the cause of the bias observed in Sect. 1. The results of this study indicate that the bias seen in Sect. 1 is caused by code bias, cross-section bias, reporting bias, and modeling bias. There is evidence that many of the experiments used in this validation and in previous validations are not adequately documented. The uncertainty in the experimental parameters overshadows bias caused by the code and cross sections and prohibits code validation to better than about 1% in keff. 48 refs., 19 figs., 19 tabs
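The bias bookkeeping described in this abstract, comparing calculated k-effective values against the expected value of 1.0 for critical configurations, can be sketched with elementary statistics. The seven k-effective values below are invented for illustration; they are not results from the KENO V.a validation.

```python
import statistics

# Hypothetical calculated k-effective values for critical benchmarks
# (a critical experiment has a true k-effective of 1.0)
k_eff = [0.998, 1.004, 0.988, 1.012, 0.991, 1.006, 0.996]

bias = statistics.mean(k_eff) - 1.0      # mean code bias
sigma = statistics.stdev(k_eff)          # scatter across benchmarks

# Flag cases exceeding the 1.5% deviation band noted in the report
outliers = [k for k in k_eff if abs(k - 1.0) > 0.015]
```

In practice a criticality safety analysis subtracts such a bias (and an uncertainty margin) from the acceptance limit rather than crediting a favorable bias.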

  12. A Validation Study of the Adolescent Dissociative Experiences Scale

    Science.gov (United States)

Keck Seeley, Susan M.; Perosa, Sandra L.; Perosa, Linda M.

    2004-01-01

    Objective: The purpose of this study was to further the validation process of the Adolescent Dissociative Experiences Scale (A-DES). In this study, a 6-item Likert response format with descriptors was used when responding to the A-DES rather than the 11-item response format used in the original A-DES. Method: The internal reliability and construct…

  13. A pilot project: Antioch Delta Cove, Antioch, California

    International Nuclear Information System (INIS)

    Minder, M.

    1994-01-01

The project involves the restoration of the Hickmott cannery site, comprising approximately 15 acres (three five-acre parcels) located on the Delta in inner-city Antioch. Hickmott Foods, Inc., operated a fruit and vegetable cannery between 1905 and the early 1970s, during which time tomato skins, peach and apricot pits, and asparagus butts were discharged on the site. The decaying fruit pits have caused cyanide contamination. Additionally, the site contains some petroleum hydrocarbon contamination as well as gypsum board contamination, apparently from nearby manufacturing operations. The Antioch Delta Cove Pilot shows how interested parties can work together to clean up contaminated sites and use the cleanup process to stimulate technology transfer. The Antioch project is a blueprint that can be replicated at other sites across California

  14. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison

  15. Effects of flow separation and cove leakage on pressure and heat-transfer distributions along a wing-cove-elevon configuration at Mach 6.9. [Langley 8-ft high temperature tunnel test

    Science.gov (United States)

    Deveikis, W. D.

    1983-01-01

    External and internal pressure and cold-wall heating-rate distributions were obtained in hypersonic flow on a full-scale heat-sink representation of the space shuttle orbiter wing-elevon-cove configuration in an effort to define effects of flow separation on cove aerothermal environment as a function of cove seal leak area, ramp angle, and free-stream unit Reynolds number. Average free-stream Mach number from all tests was 6.9; average total temperature from all tests was 3360 R; free-stream dynamic pressure ranged from about 2 to 9 psi; and wing angle of attack was 5 deg (flow compression). For transitional and turbulent flow separation, increasing cove leakage progressively increased heating rates in the cove. When ingested mass flow was sufficient to force large reductions in extent of separation, increasing cove leakage reduced heating rates in the cove to those for laminar attached flow. Cove heating-rate distributions calculated with a method that assumed laminar developing channel flow agreed with experimentally obtained distributions within root-mean-square differences that varied between 11 and 36 percent where cove walls were parallel for leak areas of 50 and 100 percent.

  16. Disruption Tolerant Networking Flight Validation Experiment on NASA's EPOXI Mission

    Science.gov (United States)

    Wyatt, Jay; Burleigh, Scott; Jones, Ross; Torgerson, Leigh; Wissler, Steve

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions. This activity was part of a larger NASA space DTN development program to mature DTN to flight readiness for a wide variety of mission types by the end of 2011. This paper describes the DTN protocols, the flight demo implementation, validation metrics which were created for the experiment, and validation results.

  17. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, part 1: performed experiments, results and evaluation

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

This report is the second of the two reports describing the tracer migration experiment, in which water and tracer flows have been monitored in a drift at the 385 m level in the Stripa experimental mine. The tracer migration experiment is one of a large number of experiments performed within the Site Characterization and Validation (SCV) project. The upper part of the 50 m long validation drift was covered with approximately 150 plastic sheets, in which the emerging water was collected. The water emerging into the lower part of the drift was collected in short boreholes, sumpholes. Six different tracer mixtures were injected at distances between 10 and 25 m from the drift. The flowrate and tracer monitoring continued for ten months. Tracer breakthrough curves and flowrate distributions were used to study flow paths, velocities, hydraulic conductivities, dispersivities, interaction with the rock matrix and channelling effects within the rock. The present report describes the structure of the observations, the flowrate measurements and estimated hydraulic conductivities. The main part of this report addresses the interpretation of the tracer movement in fractured rock. The tracer movement as measured by the more than 150 individual tracer curves has been analysed with the traditional advection-dispersion model, and a subset of the curves with the advection-dispersion-diffusion model. The tracer experiments have permitted the flow porosity, dispersion and interaction with the rock matrix to be studied. (57 refs.)
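
The advection-dispersion model used for breakthrough-curve analysis is, in its standard one-dimensional form (a generic statement of the model class, not the report's specific parameterization):

```latex
\frac{\partial C}{\partial t}
  = D_L \frac{\partial^2 C}{\partial x^2}
  - v \frac{\partial C}{\partial x}
```

where \(C\) is tracer concentration, \(v\) the advective velocity, and \(D_L\) the longitudinal dispersion coefficient; the advection-dispersion-diffusion variant adds a sink term representing diffusive exchange with the rock matrix.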

  18. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

Training nursing staff in communication skills can impact the quality of care for residents with dementia and contribute to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  19. Gamma streaming experiments for validation of Monte Carlo code

    International Nuclear Information System (INIS)

    Thilagam, L.; Mohapatra, D.K.; Subbaiah, K.V.; Iliyas Lone, M.; Balasubramaniyan, V.

    2012-01-01

Inhomogeneities in shield structures lead to a considerable amount of leakage radiation (streaming), increasing the radiation levels in accessible areas. Development work on experimental as well as computational methods for quantifying this streaming radiation is still continuing. The Monte Carlo based radiation transport code MCNP is a widely used tool for modeling and analyzing such problems involving complex geometries. In order to validate this computational method for streaming analysis, it is necessary to carry out experimental measurements simulating inhomogeneities such as the ducts and voids present in bulk shields for typical cases. The data thus generated are analysed by simulating the experimental setup with the MCNP code, and optimized input parameters for the code are formulated for solving similar radiation streaming problems. Comparison with experimental data obtained from radiation streaming experiments through ducts yields a set of thumb rules and analytical fits for total radiation dose rates within and outside the duct. The present study highlights the validation of the MCNP code through gamma streaming experiments carried out with ducts of various shapes and dimensions. Overall, the present study throws light on the suitability of the MCNP code for the analysis of gamma radiation streaming problems for all duct configurations considered. In the present study, only dose rate comparisons have been made; studies on spectral comparison of the streaming radiation are in progress. It is also planned to repeat the experiments with various shield materials. Since penetrations and ducts through bulk shields are unavoidable in an operating nuclear facility, the results of this kind of radiation streaming simulation and experiment will be very useful in shield structure optimization without compromising radiation safety

  20. Peninsula Effects on Birds in a Coastal Landscape: Are Coves More Species Rich than Lobes?

    Directory of Open Access Journals (Sweden)

    Sam Riffell

    2012-10-01

Full Text Available Peninsula effects - decreasing richness with increasing distance along peninsula lobes - have been identified for many taxa on large peninsulas. Peninsula effects are caused by differences in colonization and extinction predicted by island biogeography or by environmental gradients along the peninsula. We compared species-area regressions for cove patches (i.e., mainland) to regressions for lobe patches (i.e., on peninsula tips) for wet meadow birds along a highly interdigitated shoreline (northern Lake Huron, USA). We conducted the analysis both with and without accounting for variation in habitat and landscape characteristics (i.e., environmental gradients) of wet meadows. Species-area regressions for coves did not differ from lobes, nor did these results differ when we accounted for gradients. Similarly, few species were more abundant in coves. Peninsula effects may have been lacking because lobe patches were located ≈ 800 m on average from the mainland, and birds are highly mobile and can easily sample patches over these distances. One important caveat was that wet meadow patches > 5 ha were located in coves, so coves would still be important considerations in conservation plans because of the contribution of large patches to reproductive success, dispersal and population dynamics.
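
Species-area regressions of the kind compared above conventionally fit the power law S = c·A^z on log-log axes. A minimal sketch, with invented patch data standing in for the wet-meadow observations:

```python
# Sketch of a species-area regression, S = c * A**z, fitted on log-log axes.
# Patch areas (ha) and species counts are invented for illustration; they are
# not the wet-meadow data from the study.
import math

def fit_species_area(areas, richness):
    """Least-squares fit of log10(S) = log10(c) + z*log10(A); returns (c, z)."""
    xs = [math.log10(a) for a in areas]
    ys = [math.log10(s) for s in richness]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    z = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = 10 ** (my - z * mx)
    return c, z

cove_areas = [0.5, 1.2, 2.5, 5.0, 8.0]   # ha, invented
cove_species = [4, 6, 9, 12, 15]         # invented
c, z = fit_species_area(cove_areas, cove_species)
```

Testing for a peninsula effect then amounts to comparing the fitted intercepts and slopes (c, z) between cove and lobe patch sets.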

  1. 77 FR 59601 - Dominion Cove Point LNG, LP; Notice of Intent To Prepare an Environmental Assessment for the...

    Science.gov (United States)

    2012-09-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PF12-16-000] Dominion Cove Point LNG, LP; Notice of Intent To Prepare an Environmental Assessment for the Planned Cove Point Liquefaction Project, Request for Comments on Environmental Issues, Notice of On- Site Environmental Review, and Notice of Public Scoping Meetings The...

  2. Getting the Price Right: Costing and Charging Commercial Provision in Centres of Vocational Excellence (CoVEs). Research Report

    Science.gov (United States)

    Aitken, Liz; Chadwick, Arthur; Hughes, Maria

    2006-01-01

    Centres of Vocational Excellence (CoVEs) were established in 2001, intended to be a key driver in enhancing the contribution of the further education (FE) sector to meeting skills needs. Current government policy expects employers and individuals to pay a greater share of the costs of training, particularly at Level 3, which is the CoVE priority…

  3. 33 CFR 165.502 - Safety and Security Zone; Cove Point Liquefied Natural Gas Terminal, Chesapeake Bay, Maryland.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Safety and Security Zone; Cove... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) PORTS AND WATERWAYS SAFETY... Areas Fifth Coast Guard District § 165.502 Safety and Security Zone; Cove Point Liquefied Natural Gas...

  4. TRIMS: Validating T2 Molecular Effects for Neutrino Mass Experiments

    Science.gov (United States)

    Lin, Ying-Ting; Trims Collaboration

    2017-09-01

    The Tritium Recoil-Ion Mass Spectrometer (TRIMS) experiment examines the branching ratio of the molecular tritium (T2) beta decay to the bound state (3HeT+). Measuring this branching ratio helps to validate the current molecular final-state theory applied in neutrino mass experiments such as KATRIN and Project 8. TRIMS consists of a magnet-guided time-of-flight mass spectrometer with a detector located on each end. By measuring the kinetic energy and time-of-flight difference of the ions and beta particles reaching the detectors, we will be able to distinguish molecular ions from atomic ones and hence derive the ratio in question. We will give an update on the apparatus, simulation software, and analysis tools, including efforts to improve the resolution of our detectors and to characterize the stability and uniformity of our field sources. We will also share our commissioning results and prospects for physics data. The TRIMS experiment is supported by U.S. Department of Energy Office of Science, Office of Nuclear Physics, Award Number DE-FG02-97ER41020.
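
The mass separation underlying the measurement can be illustrated with a back-of-the-envelope time-of-flight estimate. The drift length and accelerating potential below are invented and do not describe the actual TRIMS apparatus:

```python
# Illustration of how time-of-flight separates molecular from atomic ions in
# a spectrometer. Geometry and potential are hypothetical, not TRIMS values.
import math

AMU = 1.66053906660e-27      # kg per atomic mass unit
E_CHARGE = 1.602176634e-19   # C, elementary charge

def drift_time(mass_amu, charge_e, potential_v, length_m):
    """TOF of an ion accelerated through potential_v, then drifting length_m."""
    speed = math.sqrt(2 * charge_e * E_CHARGE * potential_v / (mass_amu * AMU))
    return length_m / speed

t_atomic = drift_time(3.016, 1, 1000.0, 0.5)     # e.g. T+ or 3He+ (mass ~3)
t_molecular = drift_time(6.032, 1, 1000.0, 0.5)  # bound-state 3HeT+ (mass ~6)
# At twice the mass, the molecular ion arrives sqrt(2) times later, which is
# what lets a TOF measurement distinguish the two populations.
```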

  5. Explicating Experience: Development of a Valid Scale of Past Hazard Experience for Tornadoes.

    Science.gov (United States)

    Demuth, Julie L

    2018-03-23

    People's past experiences with a hazard theoretically influence how they approach future risks. Yet, past hazard experience has been conceptualized and measured in wide-ranging, often simplistic, ways, resulting in mixed findings about its relationship with risk perception. This study develops a scale of past hazard experiences, in the context of tornadoes, that is content and construct valid. A conceptual definition was developed, a set of items were created to measure one's most memorable and multiple tornado experiences, and the measures were evaluated through two surveys of the public who reside in tornado-prone areas. Four dimensions emerged of people's most memorable experience, reflecting their awareness of the tornado risk that day, their personalization of the risk, the intrusive impacts on them personally, and impacts experienced vicariously through others. Two dimensions emerged of people's multiple experiences, reflecting common types of communication received and negative emotional responses. These six dimensions are novel in that they capture people's experience across the timeline of a hazard as well as intangible experiences that are both direct and indirect. The six tornado experience dimensions were correlated with tornado risk perceptions measured as cognitive-affective and as perceived probability of consequences. The varied experience-risk perception results suggest that it is important to understand the nuances of these concepts and their relationships. This study provides a foundation for future work to continue explicating past hazard experience, across different risk contexts, and for understanding its effect on risk assessment and responses. © 2018 Society for Risk Analysis.

  6. Benchmark validation by means of pulsed sphere experiment at OKTAVIAN

    Energy Technology Data Exchange (ETDEWEB)

    Ichihara, Chihiro [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Hayashi, Shu A.; Kimura, Itsuro; Yamamoto, Junji; Takahashi, Akito

    1998-03-01

In order to perform benchmark validation of the existing evaluated nuclear data for fusion-related materials, neutron leakage spectra from spherical piles were measured with a time-of-flight technique using the intense 14 MeV neutron source OKTAVIAN, in the energy range from 0.1 to 15 MeV. The neutron energy spectra were obtained as absolute values normalized per source neutron. The measured spectra were compared with those from theoretical calculations using a Monte Carlo neutron transport code, MCNP, with several libraries processed from the evaluated nuclear data files. Comparison has been made with the spectrum shape, the C/E values of neutron numbers integrated in 4 energy regions, and the calculated spectra unfolded by the number of collisions, especially those after a single collision. The new libraries predicted the experiment fairly well for Li, Cr, Mn, Cu and Mo. For Al, Si, Zr, Nb and W, the new data files gave fair predictions; however, C/E differed by more than 20% in several regions. For LiF, CF{sub 2}, Ti and Co, no calculation could predict the experiment. A detailed discussion is given for the Cr, Mn and Cu samples. The EFF-2 calculation overestimated the Cr experiment by 24% in the 1 to 5 MeV neutron energy region, presumably because of overestimation of the inelastic and {sup 52}Cr(n,2n) cross sections and problems in the energy and angular distributions of secondary neutrons in EFF-2. For Cu, ENDF/B-VI and EFF-2 overestimated the experiment by about 20 to 30% in the energy range between 5 and 12 MeV, presumably from problems in the inelastic scattering cross section. (author)
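
The C/E (calculated-over-experimental) comparison used in such benchmarks is straightforward to sketch. The integrated neutron counts per energy region below are invented, not taken from the OKTAVIAN measurements:

```python
# Illustration of a C/E comparison over energy regions; all numbers invented.
def c_over_e(calculated, experimental):
    """C/E ratio per energy region, calculated / experimental."""
    return {region: calculated[region] / experimental[region]
            for region in experimental}

# Neutron leakage integrated over four energy regions (arbitrary units):
experiment = {"0.1-1 MeV": 4.1, "1-5 MeV": 2.6, "5-10 MeV": 0.9, "10-15 MeV": 1.4}
mcnp_calc  = {"0.1-1 MeV": 4.3, "1-5 MeV": 3.2, "5-10 MeV": 0.95, "10-15 MeV": 1.38}
ratios = c_over_e(mcnp_calc, experiment)
# Regions where |C/E - 1| exceeds 0.2 (a 20% criterion, as discussed above):
poor = [region for region, v in ratios.items() if abs(v - 1.0) > 0.2]
```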

  7. Low noise wing slat system with rigid cove-filled slat

    Science.gov (United States)

    Shmilovich, Arvin (Inventor); Yadlin, Yoram (Inventor)

    2013-01-01

    Concepts and technologies described herein provide for a low noise aircraft wing slat system. According to one aspect of the disclosure provided herein, a cove-filled wing slat is used in conjunction with a moveable panel rotatably attached to the wing slat to provide a high lift system. The moveable panel rotates upward against the rear surface of the slat during deployment of the slat, and rotates downward to bridge a gap width between the stowed slat and the lower wing surface, completing the continuous outer mold line shape of the wing, when the cove-filled slat is retracted to the stowed position.

  8. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

Groundwater flow and transport through three-dimensional networks of discrete fractures was modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones over 31 meters from the drift axis. Fractures outside fracture zones were not modelled beyond 31 meters from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)

  9. Benchmark validation by means of pulsed sphere experiment at OKTAVIAN

    Energy Technology Data Exchange (ETDEWEB)

    Ichihara, Chihiro [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Hayashi, S.A.; Kimura, Itsuro; Yamamoto, Junji; Takahashi, Akito

    1997-03-01

The new version of the Japanese nuclear data library, JENDL-3.2, has recently been released. The JENDL Fusion File, which adopted DDX representations for secondary neutrons, was also improved with the new evaluation method. In parallel, the FENDL nuclear data project, compiling a nuclear data library for fusion-related research, has been conducted partly under the auspices of the International Atomic Energy Agency (IAEA). The first version, FENDL-1, consists of JENDL-3.1, ENDF/B-VI, BROND-2 and EFF-1 and was released in 1995. Work on the second version, FENDL-2, is now ongoing. Benchmark validation of the nuclear data libraries has been performed to help select candidates for FENDL-2. The benchmark experiments were conducted at OKTAVIAN of Osaka University. The sample spheres were constructed by filling spherical shells with sample material. The leakage neutron spectra from the sphere piles were measured with a time-of-flight method. The measured spectra were compared with theoretical calculations using MCNP 4A and the libraries processed from JENDL-3.1, JENDL-3.2, JENDL Fusion File, and FENDL-1. The JENDL Fusion File and JENDL-3.2 gave almost the same predictions for the experiment. Both predictions are fairly satisfactory for Li, Cr, Mn, Cu, Zr, Nb and Mo, whereas for Al, LiF, CF2, Si, Ti, Co and W there is some discrepancy. However, they gave better predictions than the calculations using the library from FENDL-1, except for W. (author)

  10. EPIC Calibration/Validation Experiment Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Steven E [National Severe Storm Laboratory/NOAA; Chilson, Phillip [University of Oklahoma; Argrow, Brian [University of Colorado

    2017-03-15

A field exercise involving several different kinds of Unmanned Aerial Systems (UAS) and supporting instrumentation systems provided by DOE/ARM and NOAA/NSSL was conducted at the ARM SGP site in Lamont, Oklahoma on 29-30 October 2016. This campaign was part of a larger National Oceanic and Atmospheric Administration (NOAA) UAS Program Office program awarded to the National Severe Storms Laboratory (NSSL), named Environmental Profiling and Initiation of Convection (EPIC). The EPIC Field Campaign (Test and Calibration/Validation) proposed to ARM was a test or “dry run” for a follow-up campaign to be requested for spring/summer 2017. The EPIC project addresses NOAA's objective to “evaluate options for UAS profiling of the lower atmosphere with applications for severe weather.” The project goal is to demonstrate that fixed-wing and rotary-wing small UAS have the combined potential to provide a unique observing system capable of providing detailed profiles of temperature, moisture, and winds within the atmospheric boundary layer (ABL) to help determine the potential for severe weather development. Specific project objectives are: 1) to develop small UAS capable of acquiring needed wind and thermodynamic profiles and transects of the ABL using one fixed-wing UAS operating in tandem with two different rotary-wing UAS pairs; 2) to adapt and test miniaturized, high-precision, and fast-response atmospheric sensors with high accuracy in strong winds characteristic of the pre-convective ABL in Oklahoma; 3) to conduct targeted short-duration experiments at the ARM Southern Great Plains site in northern Oklahoma concurrently with a second site to be chosen in “real time” from the Oklahoma Mesonet in coordination with the National Weather Service (NWS) Norman Forecast Office; and 4) to gain valuable experience in pursuit of NOAA's goals for determining the value of airborne, mobile observing systems for monitoring rapidly evolving high-impact severe weather

  11. The Childbirth Experience Questionnaire (CEQ) - validation of its use in a Danish population

    DEFF Research Database (Denmark)

    Boie, Sidsel; Glavind, Julie; Uldbjerg, Niels

Title: The Childbirth Experience Questionnaire (CEQ) - validation of its use in a Danish population. Introduction: Childbirth experience is arguably as important to measure as birth outcomes such as mode of delivery or perinatal morbidity, yet a robust, validated Danish tool for evaluating childbirth experience is lacking. The Childbirth Experience Questionnaire (CEQ) was developed in Sweden in 2010 and validated in Swedish women, but it has never been validated in a Danish setting and population. The purpose of our study was to validate the CEQ as a reliable tool for measuring the childbirth experience in Danish women. ... index of agreement between the two scores. Results: Face validity: all respondents stated that it was easy to understand and complete the questionnaire. Construct validity: statistically significant higher CEQ scores were...

  12. Monitoring Building Deformation with InSAR: Experiments and Validation

    Science.gov (United States)

    Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng

    2016-01-01

Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments using two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples to compare InSAR and leveling approaches for building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. These extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated. PMID:27999403
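
The InSAR-versus-leveling comparison described above combines an OLS regression with an RMSE index. A minimal sketch, with invented deformation values standing in for the Tianjin measurements:

```python
# Sketch of comparing InSAR-derived deformation against leveling by OLS
# regression and RMSE. The deformation values (mm) are invented.
import math

def ols(x, y):
    """Slope and intercept of y = a*x + b by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def rmse(x, y):
    """Root mean square error between two paired series."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / len(x))

leveling = [-3.2, -1.1, 0.4, 2.0, 3.5]   # mm, invented
insar    = [-3.0, -1.4, 0.2, 2.3, 3.3]   # mm, invented
slope, intercept = ols(leveling, insar)
error = rmse(insar, leveling)
# A slope near 1, an intercept near 0, and an RMSE around 1 mm would support
# a millimeter-level accuracy claim of the kind made in the study.
```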

  13. Monitoring Building Deformation with InSAR: Experiments and Validation

    Directory of Open Access Journals (Sweden)

    Kui Yang

    2016-12-01

Full Text Available Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments using two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples to compare InSAR and leveling approaches for building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. These extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.

  14. Validation of NEPTUNE-CFD on ULPU-V experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jamet, Mathieu, E-mail: mathieu.jamet@edf.fr; Lavieville, Jerome; Atkhen, Kresna; Mechitoua, Namane

    2015-11-15

In-vessel retention (IVR) of molten corium through external cooling of the reactor pressure vessel is one possible means of severe accident mitigation for a class of nuclear power plants. The aim is to successfully terminate the progression of a core melt within the reactor vessel. The probability of success depends on the efficacy of the cooling strategy; hence one of the key aspects of an IVR demonstration relates to the heat removal capability through the vessel wall by convection and boiling in the external water flow. This is only possible if the in-vessel thermal loading is lower than the local critical heat flux expected along the outer wall of the vessel, which is in turn highly dependent on the flow characteristics between the vessel and the insulator. The NEPTUNE-CFD multiphase flow solver is used to obtain a better understanding, at local scale, of the thermal hydraulics involved in this situation. The validation of the NEPTUNE-CFD code on the ULPU-V facility experiments carried out at the University of California Santa Barbara is presented as a first attempt at using CFD codes at EDF to address such an issue. Two types of computation are performed: a steady-state algorithm is used to compute natural circulation flow rates and differential pressures, while a transient computation reveals the oscillatory nature of the pressure data recorded in the ULPU facility. Several dominant frequencies are highlighted. In both cases, the CFD simulations reproduce the experimental data reasonably well.

  15. Physical and chemical limnology of Ides Cove near Rochester, New York, 1970-1982

    Science.gov (United States)

    Bubeck, R.C.; Staubitz, W.W.; Weidemann, A.D.; Spittal, L.P.

    1995-01-01

Ides Cove is a small embayment on the western shore of Irondequoit Bay near Rochester, N.Y. In 1982, alum was applied to the cove to seal the bottom sediments and thereby decrease nutrient fluxes in an effort to assess the applicability of this technique to Irondequoit Bay. Published data were used to develop a baseline analysis of the chemical and physical limnology of Ides Cove prior to the alum treatment and to provide a basis for comparison and evaluation of post-treatment data. The baseline analysis also enables evaluation of trends in the nutrient status and mixing patterns in Ides Cove since the decrease of sewage inflows and use of road salt in the Irondequoit Bay and Ides Cove drainage basins during 1970-82. Data from 1970-72 and 1979-82 were used to construct partial and full-year depth profiles of several physical properties and chemical constituents of water in the cove; comparison of these profiles indicates a significant improvement in water quality between 1970 and 1982. The diversion of sewage out of the Irondequoit Creek drainage basin in the late 1970's resulted in an 80-percent decrease in total phosphate concentration and a 50- to 60-percent decrease in nitrogen (nitrate and ammonia) concentration in the cove. Indications of decreased primary productivity are associated with these lowered nutrient concentrations. Summer Secchi-disk transparency increased from 0.6 m (meters) in 1970-72 to 1.2 m in 1980-82; peak epilimnetic dissolved oxygen levels decreased from a range of 22 to 28 mg/L (milligrams per liter) to a range of 16 to 20 mg/L; and peak epilimnetic pH decreased from greater than 9.4 to between 8.8 and 9.0. The decrease in the use of road salt in the Irondequoit basin beginning in 1974 resulted in a decrease in chloride concentration and gradient (difference between the surface and bottom concentration). The maximum annual chloride concentration in the epilimnion decreased from the 210-to-225-mg/L range in the spring of 1971-72 to the

  16. Validation of the Danish language Injustice Experience Questionnaire

    DEFF Research Database (Denmark)

    la Cour, Peter; Schultz, Rikke; Smith, Anne Agerskov

    2017-01-01

    /somatoform symptoms. These patients also completed questionnaires concerning sociodemographics, anxiety and depression, subjective well-being, and overall physical and mental functioning. Our results showed satisfactory interpretability and face validity, and high internal consistency (Cronbach's alpha = .90...
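The internal-consistency statistic quoted in this record, Cronbach's alpha, is a short computation over the item covariance structure. A minimal sketch follows; the item responses below are invented for illustration and are not the Danish IEQ data:

```python
import numpy as np

# Hypothetical questionnaire responses: rows = respondents, columns = items.
X = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
], dtype=float)

k = X.shape[1]                            # number of items
item_vars = X.var(axis=0, ddof=1)         # sample variance of each item
total_var = X.sum(axis=1).var(ddof=1)     # variance of the summed scale score

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(round(alpha, 3))
```

Values around .90, as reported in the abstract, are conventionally read as high internal consistency.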

  17. Measuring the experience of hospitality : Scale development and validation

    NARCIS (Netherlands)

    Pijls-Hoekstra, Ruth; Groen, Brenda H.; Galetzka, Mirjam; Pruyn, Adriaan T.H.

    2017-01-01

    This paper identifies what customers experience as hospitality and subsequently presents a novel and compact assessment scale for measuring customers’ experience of hospitality at any kind of service organization. The Experience of Hospitality Scale (EH-Scale) takes a broader perspective compared to

  18. Benthic diatoms from Potter Cove, 25 de Mayo (King George) Island, Antarctica: Mucilage and glucan storage as a C-source for limpets

    Science.gov (United States)

    Daglio, Yasmin; Sacristán, Hernán; Ansaldo, Martín; Rodríguez, María C.

    2018-03-01

    Biofilms were allowed to develop on ceramic tiles placed in closed containers on the shore of Potter Cove, 25 de Mayo (King George) Island. Water pumping from the cove inside the containers extended for 25 days. Diatoms were the dominant microalgae in these biofilms, which were removed from a set of tiles to a) characterize the extracellular mucilage, b) carry out floristic determination and c) perform grazing experiments with the limpet Nacella concinna. Biofilms mucilaginous matrix consisted of proteins and carbohydrates. Room temperature aqueous extraction of the freeze-dried material rendered a fraction enriched in the storage glucan chrysolaminarin, its identity confirmed by methylation structural analyses. Hot water extracted products showed greater heterogeneity in monosaccharide composition, including glucose, mannose, galactose, fucose, xylose and rhamnose. Diatom identification revealed that Pseudogomphonema kamtschaticum was the dominant species followed by several Navicula species, Nitzschia pellucida and Synedra kerguelensis. Photographical survey of colonized tiles placed in glass flasks together with a specimen of Nacella concinna exhibited between 5 and 30% removal of the biofilms coverage after 24 h of exposure to the limpet, suggesting that EPS and chrysolaminarin constitute a C-source for the gastropod.

  19. Measuring experience of hospitality : scale development and validation

    NARCIS (Netherlands)

    Pijls-Hoekstra, Ruth; Groen, Brenda H.; Galetzka, Mirjam; Pruyn, Adriaan T.H.

    This paper describes the development of the Experience of Hospitality Scale (EH-Scale) for assessing hospitality in service environments from a guest point of view. In contrast to other scales, which focus specifically on staff behaviour, the present scale focuses on the experience of hospitality

  20. A validation of DRAGON based on lattice experiments

    International Nuclear Information System (INIS)

    Marleau, G.

    1996-01-01

Here we address the validation of DRAGON using the Chalk River Laboratory experimental database, which has already been used for the validation of other codes. Because of the large variety of information for different fuel and moderator types compiled in this database, the most basic modules of DRAGON are thoroughly tested. The general behaviour observed with DRAGON is very good. Its main weakness is seen in the self-shielding calculation, where the correction applied to the inner fuel pin seems to be overestimated with respect to the outer fuel pins. One question left open in this paper concerns the need for inserting end-regions in the DRAGON cells when the heterogeneous B1 leakage model is used. (author)

  1. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

simulations of these explosive events and their effects. These codes are continuously improving, but still require validation against experimental data to...

  2. Blast Load Simulator Experiments for Computational Model Validation Report 3

    Science.gov (United States)

    2017-07-01

the effect of the contact surface on the measurement. For gauge locations where a clearly defined initial peak is not present, Figure 24 for example...these explosive events and their effects. These codes are continuously improving, but still require validation against experimental data to...

  3. Tributyltin in environmental samples from the Former Derecktor Shipyard, Coddington Cove, Newport RI

    Energy Technology Data Exchange (ETDEWEB)

    Wade, Terry L.; Sweet, Stephen T.; Quinn, James G.; Cairns, Robert W.; King, John W

    2004-05-01

Tributyltin (TBT) was detected in all 24 surface sediment (top 2 cm) samples collected from Coddington Cove, Newport, RI. TBT surface sediment concentrations ranged from 32 to 372 ng Sn/g with a mean concentration of 146 ng Sn/g. Analyses of selected core sections detected TBT in at least the top 18 cm at all 7 stations where cores were collected. The lack of consistent TBT concentration trends with depth in these cores suggests that mixing is an important process in the sediment column. In one core (station 28), TBT was found in the 76-86 cm section at a concentration of 141 ng Sn/g; thus sediments are a significant sink for TBT. However, sediment mixing processes can enhance releases of bioavailable TBT. Mussels, clams and fish from Coddington Cove contain TBT at concentrations ranging from 9.2 to 977 ng Sn/g. TBT concentrations in lobsters were below the detection limit (<6 ng Sn/g). Based on available screening criteria, TBT concentrations in Coddington Cove sediment are likely to be having an adverse effect on the biota at some locations. - TBT is likely to continue to be bioavailable for many years.

  4. Tributyltin in environmental samples from the Former Derecktor Shipyard, Coddington Cove, Newport RI

    International Nuclear Information System (INIS)

    Wade, Terry L.; Sweet, Stephen T.; Quinn, James G.; Cairns, Robert W.; King, John W.

    2004-01-01

Tributyltin (TBT) was detected in all 24 surface sediment (top 2 cm) samples collected from Coddington Cove, Newport, RI. TBT surface sediment concentrations ranged from 32 to 372 ng Sn/g with a mean concentration of 146 ng Sn/g. Analyses of selected core sections detected TBT in at least the top 18 cm at all 7 stations where cores were collected. The lack of consistent TBT concentration trends with depth in these cores suggests that mixing is an important process in the sediment column. In one core (station 28), TBT was found in the 76-86 cm section at a concentration of 141 ng Sn/g; thus sediments are a significant sink for TBT. However, sediment mixing processes can enhance releases of bioavailable TBT. Mussels, clams and fish from Coddington Cove contain TBT at concentrations ranging from 9.2 to 977 ng Sn/g. TBT concentrations in lobsters were below the detection limit (<6 ng Sn/g). Based on available screening criteria, TBT concentrations in Coddington Cove sediment are likely to be having an adverse effect on the biota at some locations. - TBT is likely to continue to be bioavailable for many years.

  5. The Food Web of Potter Cove (Antarctica): complexity, structure and function

    Science.gov (United States)

    Marina, Tomás I.; Salinas, Vanesa; Cordone, Georgina; Campana, Gabriela; Moreira, Eugenia; Deregibus, Dolores; Torre, Luciana; Sahade, Ricardo; Tatián, Marcos; Barrera Oro, Esteban; De Troch, Marleen; Doyle, Santiago; Quartino, María Liliana; Saravia, Leonardo A.; Momo, Fernando R.

    2018-01-01

Knowledge of food web structure and complexity is central to a better understanding of ecosystem functioning. A food-web approach includes both species and energy flows among them, providing a natural framework for characterizing species' ecological roles and the mechanisms through which biodiversity influences ecosystem dynamics. Here we present for the first time a high-resolution food web for a marine ecosystem at Potter Cove (northern Antarctic Peninsula). Eleven food web properties were analyzed in order to document network complexity, structure and topology. We found a low linkage density (3.4), connectance (0.04) and omnivory percentage (45), as well as a short path length (1.8) and a low clustering coefficient (0.08). Furthermore, relating the structure of the food web to its dynamics, an exponential degree distribution (in- and out-links) was found. This suggests that the Potter Cove food web may be vulnerable if the most connected species become locally extinct. For two of the three most connected functional groups, competition overlap graphs imply high trophic interaction between demersal fish and niche specialization according to feeding strategies in amphipods. On the other hand, the prey overlap graph also shows that multiple energy pathways of carbon flux exist across benthic and pelagic habitats in the Potter Cove ecosystem. Although alternative food sources might add robustness to the web, network properties (low linkage density, connectance and omnivory) suggest fragility and potential trophic cascade effects.
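The network properties quoted in this record, linkage density (L/S) and connectance (L/S^2), are simple counts over the species-link list. A minimal sketch follows, using a toy predator-to-prey adjacency invented for illustration rather than the published Potter Cove species list:

```python
# Toy directed food web: predator -> list of prey (illustrative names only).
links = {
    "fish":     ["amphipod", "limpet"],
    "amphipod": ["diatom"],
    "limpet":   ["diatom"],
    "seastar":  ["limpet", "amphipod"],
}

# Node set = every species appearing as predator or prey.
species = set(links) | {prey for preys in links.values() for prey in preys}
S = len(species)                          # number of species (nodes)
L = sum(len(p) for p in links.values())   # number of feeding links (edges)

linkage_density = L / S      # mean number of links per species
connectance = L / S**2       # fraction of possible directed links realized

print(S, L, linkage_density, connectance)
```

For real webs the same counts are taken over hundreds of trophic links; low values of both metrics, as reported for Potter Cove, indicate a sparse network.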

  6. Feasibility for development of an aquaculture facility at Hot Spring Cove

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

This report describes the feasibility of obtaining geothermally warmed water for use in aquaculture at Hot Springs Cove, British Columbia, and concludes that while the sources can probably be accessed from two sites in the cove, neither this nor the quantity of water available can be known for certain without field trials. The report also examines the feasibility of culturing various species of sea life at Hot Springs Cove, and concludes that a combination of rearing coho salmon smolts and oysters, with the late addition of tilapia, appears to be the most suitable both for biological and economic reasons. The total capital investment amounts to about $1,033,000. Operating costs would be about $450,000 annually, and additional capital to cover this would be needed in the first years of operation. A business plan is provided which includes cash flow projections for the first nine years of operation, and this shows that a maximum investment of approximately $1.2 million would be needed by the third year of operation. If sufficient warm water is available, and the facility is operated successfully, it should pay off the investment in seven to nine years, provided that interest-free loans are available for capital investments. 20 refs., 1 fig., 8 tabs.

  7. Measures of agreement between computation and experiment:validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
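A confidence-interval-based validation metric of the kind this record describes can be illustrated in a few lines. The sketch below compares a single model prediction against repeated experimental measurements at one condition; the values and the t quantile are invented for the example and are not from the report's three application problems:

```python
import math
from statistics import mean, stdev

# Hypothetical repeated experimental measurements at one operating condition,
# and a single model prediction to be assessed against them.
experiment = [101.2, 99.8, 100.5, 100.9, 99.6]
model = 102.0

n = len(experiment)
e_bar = mean(experiment)
s = stdev(experiment)            # sample standard deviation (ddof = 1)
t_crit = 2.776                   # two-sided 95% t value for n - 1 = 4 dof

# Estimated model error and a 95% confidence interval on it, in the spirit
# of confidence-interval-based validation metrics.
error = model - e_bar
half_width = t_crit * s / math.sqrt(n)
print(f"estimated error = {error:.2f} +/- {half_width:.2f}")
```

The interval half-width makes explicit how experimental measurement uncertainty limits the accuracy statement one can make about the model, which is the central point of such metrics.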

  8. Validation experience with the core calculation program karate

    International Nuclear Information System (INIS)

    Hegyi, Gy.; Hordosy, G.; Kereszturi, A.; Makai, M.; Maraczy, Cs.

    1995-01-01

A relatively fast and easy-to-handle modular code system named KARATE-440 has been elaborated for steady-state operational calculations of VVER-440 type reactors. It is built up from cell, assembly and global calculations. Within the framework of the program, the neutron-physical and thermohydraulic processes of the core at normal startup, in steady state, and during slow transients can be simulated. The verification and validation of the global code have been performed recently. The test cases include mathematical benchmarks and measurements on operating VVER-440 units. A summary of the results, such as startup parameters, boron letdown curves, and radial and axial power distributions for some cycles of the Paks NPP, is presented. (author)

  9. Electrically Driven Thermal Management: Flight Validation, Experiment Development, Future Technologies

    Science.gov (United States)

    Didion, Jeffrey R.

    2018-01-01

    Electrically Driven Thermal Management is an active research and technology development initiative incorporating ISS technology flight demonstrations (STP-H5), development of Microgravity Science Glovebox (MSG) flight experiment, and laboratory-based investigations of electrically based thermal management techniques. The program targets integrated thermal management for future generations of RF electronics and power electronic devices. This presentation reviews four program elements: i.) results from the Electrohydrodynamic (EHD) Long Term Flight Demonstration launched in February 2017 ii.) development of the Electrically Driven Liquid Film Boiling Experiment iii.) two University based research efforts iv.) development of Oscillating Heat Pipe evaluation at Goddard Space Flight Center.

  10. Synthesis of clad motion experiments interpretation: codes and validation

    International Nuclear Information System (INIS)

    Papin, J.; Fortunato, M.; Seiler, J.M.

    1983-04-01

This communication deals with clad melting and relocation phenomena related to LMFBR safety analysis of loss-of-flow accidents. We present the physical models developed at DSN/CEN Cadarache in single-channel and bundle geometry, and the interpretation with these models of experiments performed by the STT (CEN Grenoble). It emerges that a good understanding of the phenomena involved in single-channel geometry has now been obtained. On the other hand, further studies are necessary for a better knowledge of clad motion phenomena in bundle cases with conditions close to reactor ones.

  11. Validation of the ABBN/CONSYST constants system. Part 1: Validation through the critical experiments on compact metallic cores

    International Nuclear Information System (INIS)

    Ivanova, T.T.; Manturov, G.N.; Nikolaev, M.N.; Rozhikhin, E.V.; Semenov, M.Yu.; Tsiboulia, A.M.

    1999-01-01

The worldwide compilation of criticality safety benchmark experiments, evaluated through the activity of the International Criticality Safety Benchmark Evaluation Project (ICSBEP), opens new possibilities for validation of the ABBN-93.1 cross-section library for criticality safety analysis. Results of calculations of small assemblies with metal-fuelled cores are presented in this paper. It is concluded that ABBN-93.1 predicts the criticality of such systems with the required accuracy.

  12. Freshwater and Saline Loads of Dissolved Inorganic Nitrogen to Hood Canal and Lynch Cove, Western Washington

    Science.gov (United States)

    Paulson, Anthony J.; Konrad, Christopher P.; Frans, Lonna M.; Noble, Marlene; Kendall, Carol; Josberger, Edward G.; Huffman, Raegan L.; Olsen, Theresa D.

    2006-01-01

    Hood Canal is a long (110 kilometers), deep (175 meters) and narrow (2 to 4 kilometers wide) fjord of Puget Sound in western Washington. The stratification of a less dense, fresh upper layer of the water column causes the cold, saltier lower layer of the water column to be isolated from the atmosphere in the late summer and autumn, which limits reaeration of the lower layer. In the upper layer of Hood Canal, the production of organic matter that settles and consumes dissolved oxygen in the lower layer appears to be limited by the load of dissolved inorganic nitrogen (DIN): nitrate, nitrite, and ammonia. Freshwater and saline loads of DIN to Hood Canal were estimated from available historical data. The freshwater load of DIN to the upper layer of Hood Canal, which could be taken up by phytoplankton, came mostly from surface and ground water from subbasins, which accounts for 92 percent of total load of DIN to the upper layer of Hood Canal. Although DIN in rain falling on land surfaces amounts to about one-half of the DIN entering Hood Canal from subbasins, rain falling directly on the surface of marine waters contributed only 4 percent of the load to the upper layer. Point-source discharges and subsurface flow from shallow shoreline septic systems contributed less than 4 percent of the DIN load to the upper layer. DIN in saline water flowing over the sill into Hood Canal from Admiralty Inlet was at least 17 times the total load to the upper layer of Hood Canal. In September and October 2004, field data were collected to estimate DIN loads to Lynch Cove - the most inland marine waters of Hood Canal that routinely contain low dissolved-oxygen waters. Based on measured streamflow and DIN concentrations, surface discharge was estimated to have contributed about one-fourth of DIN loads to the upper layer of Lynch Cove. Ground-water flow from subbasins was estimated to have contributed about one-half of total DIN loads to the upper layer. In autumn 2004, the relative

  13. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    Science.gov (United States)

    Ross, James C.

    2016-01-01

Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  14. Validation of ozone measurements from the Atmospheric Chemistry Experiment (ACE

    Directory of Open Access Journals (Sweden)

    E. Dupuy

    2009-01-01

Full Text Available This paper presents extensive bias determination analyses of ozone observations from the Atmospheric Chemistry Experiment (ACE) satellite instruments: the ACE Fourier Transform Spectrometer (ACE-FTS) and the Measurement of Aerosol Extinction in the Stratosphere and Troposphere Retrieved by Occultation (ACE-MAESTRO) instrument. Here we compare the latest ozone data products from ACE-FTS and ACE-MAESTRO with coincident observations from nearly 20 satellite-borne, airborne, balloon-borne and ground-based instruments, by analysing volume mixing ratio profiles and partial column densities. The ACE-FTS version 2.2 Ozone Update product reports more ozone than most correlative measurements from the upper troposphere to the lower mesosphere. At altitude levels from 16 to 44 km, the average values of the mean relative differences are nearly all within +1 to +8%. At higher altitudes (45–60 km), the ACE-FTS ozone amounts are significantly larger than those of the comparison instruments, with mean relative differences of up to +40% (about +20% on average). For the ACE-MAESTRO version 1.2 ozone data product, mean relative differences are within ±10% (average values within ±6%) between 18 and 40 km for both the sunrise and sunset measurements. At higher altitudes (~35–55 km), systematic biases of opposite sign are found between the ACE-MAESTRO sunrise and sunset observations. While ozone amounts derived from the ACE-MAESTRO sunrise occultation data are often smaller than the coincident observations (with mean relative differences down to −10%), the sunset occultation profiles for ACE-MAESTRO show results that are qualitatively similar to ACE-FTS, indicating a large positive bias (mean relative differences within +10 to +30%) in the 45–55 km altitude range. In contrast, there is no significant systematic difference in bias found for the ACE-FTS sunrise and sunset measurements.
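The bias statistic quoted throughout this record, the mean relative difference between coincident profiles, is a one-line computation once the profiles are on a shared grid. A minimal sketch follows; the profile values are illustrative, not ACE data:

```python
import numpy as np

# Hypothetical coincident ozone profiles (ppmv) on a shared altitude grid:
# the instrument under assessment vs. a correlative reference measurement.
ace = np.array([2.0, 4.1, 6.3, 7.9, 6.6])
reference = np.array([1.9, 4.0, 6.0, 7.5, 6.4])

# Relative difference (percent) at each level, then its profile mean:
# positive values mean the assessed instrument reports more ozone.
rel_diff = 100.0 * (ace - reference) / reference
print(np.round(rel_diff, 1), round(rel_diff.mean(), 1))
```

In a real intercomparison this statistic is accumulated over many coincident pairs per altitude level before averaging, but the per-level formula is the same.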

  15. Well-founded cost estimation validated by experience

    International Nuclear Information System (INIS)

    LaGuardia, T.S.

    2005-01-01

to build consistency into its cost estimates. A standardized list of decommissioning activities needs to be adopted internationally so estimates can be prepared on a consistent basis, and to facilitate tracking of actual costs against the estimate. The OECD/NEA Standardized List incorporates the consensus of international experts as to the elements of cost and activities that should be included in the estimate. A significant effort was made several years ago to promote universal adoption of this standard. Using the standardized list of activities as a template, a questionnaire was distributed to gather actual decommissioning costs (and other parameters) from international projects. The results of cost estimate contributions from many countries were analyzed and evaluated as to reactor types, decommissioning strategies, cost drivers, and waste disposal quantities. The results were reported in the literature. A standardized list of activities will only be valuable if the underlying cost elements and methodology are clearly identified in the estimate. While no one would expect perfect correlation of every element of cost in a large project estimate-versus-actual cost comparison, the variants should be visible so the basis for the difference can be examined and evaluated. For the nuclear power industry to grow to meet the increasing demand for electricity, the investors, regulators and the public must understand the total cost of the nuclear fuel cycle. The costs for decommissioning and the funding requirements to provide for safe closure and dismantling of these units are well recognized to represent a significant liability to the owner utilities and governmental agencies. Owners and government regulatory agencies need benchmarked decommissioning costs to test the validity of each proposed cost and funding request. The benchmarking process requires the oversight of decommissioning experts to evaluate contributed cost data in a meaningful manner. An international

  16. COVE-1: a finite difference creep collapse code for oval fuel pin cladding material

    International Nuclear Information System (INIS)

    Mohr, C.L.

    1975-03-01

    COVE-1 is a time-dependent incremental creep collapse code that estimates the change in ovality of a fuel pin cladding tube. It uses a finite difference method of solving the differential equations which describe the deflection of the tube walls as a function of time. The physical problem is nonlinear, both with respect to geometry and material properties, which requires the use of an incremental, analytical, path-dependent solution. The application of this code is intended primarily for tubes manufactured from Zircaloy. Therefore, provision has been made to include some of the effects of anisotropy in the flow equations for inelastic incremental deformations. 10 references. (U.S.)

  17. Diatoms of the marine littoral of Steenberg's cove in St. Helena Bay, Cape province, South Africa

    CSIR Research Space (South Africa)

    Malcolm, HG

    1973-01-01

Full Text Available transapical striae in 10 µm. It is easily overlooked, and though recorded in the samples as infrequent, may be more abundant. Dimensions of the Steenberg's Cove material were 8–10 µm long, 3 µm broad, striae 27 in 10 µm. – 619, 620. A. proteus GREGORY (cf. CLEVE 1895: 103; GIFFEN 1970a, 267, Fig. 19). Always rare in the material. – 619, 620. A. proteus var. contigua CLEVE (cf. GIFFEN 1971a, 3). As stated in a previous paper, the author doubts whether these varieties viz. var. contigua CLEVE...

  18. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    Science.gov (United States)

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  19. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    International Nuclear Information System (INIS)

    Smith, Barton L.

    2016-01-01

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  20. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L. [Utah State Univ., Logan, UT (United States). Dept. of Mechanical and Aerospace Engineering

    2016-08-16

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  1. EXQ: development and validation of a multiple-item scale for assessing customer experience quality

    OpenAIRE

    Klaus, Philipp

    2010-01-01

    Positioned in the deliberations related to service marketing, the conceptualisation of service quality, current service quality measurements, and the importance of the evolving construct of customer experience, this thesis develops and validates a measurement for customer experience quality (EXQ) in the context of repeat purchases of mortgage buyers in the United Kingdom. The thesis explores the relationship between the customer experience quality and the important marketing ou...

  2. Locating inputs of freshwater to Lynch Cove, Hood Canal, Washington, using aerial infrared photography

    Science.gov (United States)

    Sheibley, Rich W.; Josberger, Edward G.; Chickadel, Chris

    2010-01-01

    The input of freshwater and associated nutrients into Lynch Cove and lower Hood Canal (fig. 1) from sources such as groundwater seeps, small streams, and ephemeral creeks may play a major role in the nutrient loading and hydrodynamics of this low dissolved-oxygen (hypoxic) system. These dispersed sources exhibit a high degree of spatial variability. However, few in-situ measurements of groundwater seepage rates and nutrient concentrations are available and thus may not adequately represent the large spatial variability of groundwater discharge in the area. As a result, our understanding of these processes and their effect on hypoxic conditions in Hood Canal is limited. To determine the spatial variability and relative intensity of these sources, the U.S. Geological Survey Washington Water Science Center collaborated with the University of Washington Applied Physics Laboratory to obtain thermal infrared (TIR) images of the nearshore and intertidal regions of Lynch Cove at or near low tide. In the summer, cool freshwater discharges from seeps and streams, flows across the exposed, sun-warmed beach, and out onto the warm surface of the marine water. These temperature differences are readily apparent in aerial thermal infrared imagery that we acquired during the summers of 2008 and 2009. When combined with coincident video camera images, these temperature differences allow identification of the location, the type, and the relative intensity of the sources.

  3. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  4. Hydrogen and acetate cycling in two sulfate-reducing sediments: Buzzards Bay and Town Cove, Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    Novelli, P.C. (SUNY, Stony Brook, NY (USA) Univ. of Colorado, Boulder (USA)); Michelson, A.R.; Scranton, M.I. (SUNY, Stony Brook, NY (USA)); Banta, G.T.; Hobbie, J.E. (Marine Biological Laboratory, Woods Hole, MA (USA)); Howarth, R.W. (Cornell Univ., Ithaca, NY (USA))

    1988-10-01

    Molecular hydrogen and acetate are believed to be key intermediates in the anaerobic remineralization of organic carbon. The authors have made measurements of the cycling of both these compounds in two marine sediments: the bioturbated sediments of Buzzards Bay, Mass., and the much more reducing sediments of Town Cove, Orleans, Mass. Hydrogen concentrations are similar in these environments (from less than 5 to 30 nM), and are within the range previously reported for coastal sediments. However, apparent hydrogen production rates differ by a factor of 60 between these two sediments and at both sites show strong correlation with measured rates of sulfate reduction. Acetate concentrations generally increased with depth in both environments; this increase was greater in Buzzards Bay (22.5 to 71.5 μM) than in Town Cove (26 to 44 μM). Acetate oxidation rates calculated from measured concentrations and ¹⁴C-acetate consumption rate constants suggest that the measured acetate was not all available to sulfate-reducing bacteria. Using the measured sulfate reduction rates, they estimate that between 2% and 100% of the measured acetate pool is biologically available, and that the bioavailable pool decreases with depth. A diagenetic model of the total acetate concentration suggests that consumption may be first order with respect to only a fraction of the total pool.

  5. Validation of MORET 4 perturbation against 'physical' type fission products experiments

    International Nuclear Information System (INIS)

    Anno, Jacques; Jacquet, Olivier; Miss, Joachim

    2003-01-01

    After shortly recalling one among the many pertinent recent features of the French criticality CRISTAL package i.e. the perturbation algorithm (so called MORET 4 'Perturbation' or MP), this paper presents original MP validations. Numerical and experimental validations are made using close fission products (FP) experiments. As results, it is shown that, all being equal, MP can detect FP's absorption cross-section variations in the range 0.3-1.2%. (author)

  6. The inventory for déjà vu experiences assessment. Development, utility, reliability, and validity

    NARCIS (Netherlands)

    Sno, H. N.; Schalken, H. F.; de Jonghe, F.; Koeter, M. W.

    1994-01-01

    In this article the development, utility, reliability, and validity of the Inventory for Déjà vu Experiences Assessment (IDEA) are described. The IDEA is a 23-item self-administered questionnaire consisting of a general section of nine questions and a qualitative section of 14 questions. The latter

  7. An Examination and Validation of an Adapted Youth Experience Scale for University Sport

    Science.gov (United States)

    Rathwell, Scott; Young, Bradley W.

    2016-01-01

    Limited tools assess positive development through university sport. Such a tool was validated in this investigation using two independent samples of Canadian university athletes. In Study 1, 605 athletes completed 99 survey items drawn from the Youth Experience Scale (YES 2.0), and separate a priori measurement models were evaluated (i.e., 99…

  8. Service validity and service reliability of search, experience and credence services. A scenario study

    NARCIS (Netherlands)

    Galetzka, Mirjam; Verhoeven, J.W.M.; Pruyn, Adriaan T.H.

    2006-01-01

    The purpose of this research is to add to our understanding of the antecedents of customer satisfaction by examining the effects of service reliability (Is the service “correctly” produced?) and service validity (Is the “correct” service produced?) of search, experience and credence services.

  9. Validation and Scaling of Soil Moisture in a Semi-Arid Environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    Science.gov (United States)

    Colliander, Andreas; Cosh, Michael H.; Misra, Sidharth; Jackson, Thomas J.; Crow, Wade T.; Chan, Steven; Bindlish, Rajat; Chae, Chun; Holifield Collins, Chandra; Yueh, Simon H.

    2017-01-01

    The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of SMAP soil moisture data products. The main goals of the experiment were to address issues regarding the spatial disaggregation methodologies for improvement of soil moisture products and validation of the in situ measurement upscaling techniques. To support these objectives, high-resolution soil moisture maps were acquired with the airborne PALS (Passive Active L-band Sensor) instrument over an area in southeast Arizona that includes the Walnut Gulch Experimental Watershed (WGEW), and intensive ground sampling was carried out to augment the permanent in situ instrumentation. The objective of the paper was to establish the correspondence and relationship between the highly heterogeneous spatial distribution of soil moisture on the ground and the coarse-resolution radiometer-based soil moisture retrievals of SMAP. The high-resolution mapping conducted with PALS provided the required connection between the in situ measurements and SMAP retrievals. The in situ measurements were used to validate the PALS soil moisture acquired at 1-km resolution. Based on the information from a dense network of rain gauges in the study area, the in situ soil moisture measurements did not capture all the precipitation events accurately. That is, the PALS and SMAP soil moisture estimates responded to precipitation events detected by rain gauges, which were in some cases not detected by the in situ soil moisture sensors. It was also concluded that the spatial distribution of the soil moisture resulting from the relatively small spatial extents of the typical convective storms in this region was not completely captured with the in situ stations. After removing those cases (approximately 10% of the observations), the following metrics were obtained: RMSD (root mean square difference) of 0.016 m³/m³ and correlation of 0.83.
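    The RMSD and correlation metrics quoted in this record are straightforward to reproduce; a minimal sketch with invented sample values (function names and numbers are mine, not SMAPVEX15 data):

```python
import numpy as np

def rmsd(a, b):
    """Root mean square difference between two soil-moisture series."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def pearson_r(a, b):
    """Pearson correlation coefficient between two series."""
    return float(np.corrcoef(a, b)[0, 1])

# Invented sample values in m3/m3 -- illustrative only, not SMAPVEX15 data
in_situ   = [0.12, 0.18, 0.25, 0.09, 0.30]
retrieved = [0.10, 0.20, 0.24, 0.11, 0.28]

print(round(rmsd(in_situ, retrieved), 3), round(pearson_r(in_situ, retrieved), 3))
```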

  10. Persistence, biodegradation and biological impact of Bunker C residues in Black Duck Cove, Nova Scotia

    Energy Technology Data Exchange (ETDEWEB)

    Lee, K.; Wohlgeschaffen, G. D.; Tremblay, G. H. [Dept. of Fisheries and Oceans of Canada, Inst. Maurice-Lamontagne, Mont Joli, PQ (Canada); Vandermeulen, D. C.; Mossman, K. G. [Dept. of Fisheries and Oceans of Canada, Bedford Inst. of Oceanography, Dartmouth, NS (Canada); Doe, K. G.; Jackman, P. M. [Environment Canada, Environmental Science Center, Moncton, NB (Canada); Prince, R. C.; Garrett, R. M.; Haith, C. E. [Exxon Research and Engineering Company, Annandale, NJ (United States)

    1998-12-31

    In 1970, approximately 2,045 cubic metres of Bunker C oil impacted on 300 km of Nova Scotia's coastline following the grounding of the tanker 'Arrow'. Only 10 per cent of the coastline was subjected to cleanup, the remainder was left to degrade naturally. Samples of sediments were collected in 1993 and 1997 in order to assess the attenuation processes on the reduction of toxicity within sediments and interstitial waters at Black Duck Cove, one of the untreated sites where residual oil was clearly evident. Detailed chemical analyses showed that the Bunker C oil at this site has undergone substantial biodegradation. Over the 20-plus years since the oil spill the toxicity of the residual oil has been significantly reduced and there is substantial evidence of habitat recovery.

  11. Persistence, biodegradation and biological impact of Bunker C residues in Black Duck Cove, Nova Scotia

    Energy Technology Data Exchange (ETDEWEB)

    Lee, K.; Wohlgeschaffen, G. D.; Tremblay, G. H. [Dept. of Fisheries and Oceans of Canada, Inst. Maurice-Lamontagne, Mont Joli, PQ (Canada); Vandermeulen, D. C.; Mossman, K. G. [Dept. of Fisheries and Oceans of Canada, Bedford Inst. of Oceanography, Dartmouth, NS (Canada); Doe, K. G.; Jackman, P. M. [Environment Canada, Environmental Science Center, Moncton, NB (Canada); Prince, R. C.; Garrett, R. M.; Haith, C. E. [Exxon Research and Engineering Company, Annandale, NJ (United States)

    1998-07-01

    In 1970, approximately 2,045 cubic metres of Bunker C oil impacted on 300 km of Nova Scotia's coastline following the grounding of the tanker 'Arrow'. Only 10 per cent of the coastline was subjected to cleanup, the remainder was left to degrade naturally. Samples of sediments were collected in 1993 and 1997 in order to assess the attenuation processes on the reduction of toxicity within sediments and interstitial waters at Black Duck Cove, one of the untreated sites where residual oil was clearly evident. Detailed chemical analyses showed that the Bunker C oil at this site has undergone substantial biodegradation. Over the 20 plus years since the oil spill the toxicity of the residual oil has been significantly reduced and there is substantial evidence of habitat recovery.

  12. Differential diagnosis between early repolarization of athlete's heart and coved-type Brugada electrocardiogram.

    Science.gov (United States)

    Zorzi, Alessandro; Leoni, Loira; Di Paolo, Fernando M; Rigato, Ilaria; Migliore, Federico; Bauce, Barbara; Pelliccia, Antonio; Corrado, Domenico

    2015-02-15

    Early repolarization (ER) is typically observed in highly trained athletes as a physiologic consequence of increased vagal tone. The variant of anterior (V1 to V3) ER characterized by "domed" ST-segment elevation and negative T wave raises problems of differential diagnosis with the "coved-type" electrocardiographic pattern seen in Brugada syndrome (BS). This study was designed to identify electrocardiographic criteria for distinguishing athlete's ER from BS. The study compared the electrocardiographic tracings of 61 healthy athletes (80% men, median age 23 ± 8 years), showing "domed" ST-segment elevation and negative T wave in leads V1 to V3, with those of 92 consecutive age- and sex-matched BS patients with a "coved-type" electrocardiographic pattern. The electrocardiographic analysis focused on the ST-segment elevation at J point (STJ) and at 80 milliseconds after J point (ST₈₀). Athletes had a lower maximum amplitude of STJ (1.46 ± 0.7 vs 3.25 ± 0.6 mm, p <0.001), and a downsloping ST-segment configuration (STJ/ST₈₀ >1) was found in most BS patients versus only 2 (3%) athletes (p <0.001). An upsloping ST-segment configuration (STJ/ST₈₀ <1) showed a sensitivity of 97%, a specificity of 100%, and a diagnostic accuracy of 98.7% for the diagnosis of ER. At multivariate analysis, the STJ/ST₈₀ ratio remained the only independent predictor of ER (odds ratio 87, 95% confidence interval 19 to 357, p <0.001). In conclusion, the STJ/ST₈₀ ratio is a highly accurate electrocardiographic parameter for the differential diagnosis between anterior ER of the athlete and BS. Our results may help in reducing the number of athletes who undergo expensive diagnostic workup or are unnecessarily disqualified from competition for changes that fall within the normal range of athlete's heart. Copyright © 2015 Elsevier Inc. All rights reserved.
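    The STJ/ST₈₀ criterion described in this record reduces to a one-line ratio; a hypothetical helper sketching the rule (function name and amplitudes are mine, for illustration only, not a diagnostic tool):

```python
def st_pattern(stj_mm, st80_mm):
    """Classify the ST-segment configuration from the elevation at the J point
    (STJ) and at 80 ms after the J point (ST80), both in mm.
    STJ/ST80 < 1 -> upsloping (favors athlete's early repolarization);
    STJ/ST80 >= 1 -> downsloping (favors the Brugada coved-type pattern)."""
    if st80_mm == 0:
        raise ValueError("ST80 amplitude must be nonzero")
    ratio = stj_mm / st80_mm
    kind = "upsloping" if ratio < 1 else "downsloping"
    return ratio, kind

# Invented amplitudes for illustration -- not patient data
print(st_pattern(1.5, 2.0))
print(st_pattern(3.2, 1.6))
```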

  13. Chemometric and biological validation of a capillary electrophoresis metabolomic experiment of Schistosoma mansoni infection in mice.

    Science.gov (United States)

    Garcia-Perez, Isabel; Angulo, Santiago; Utzinger, Jürg; Holmes, Elaine; Legido-Quigley, Cristina; Barbas, Coral

    2010-07-01

    Metabonomic and metabolomic studies are increasingly utilized for biomarker identification in different fields, including biology of infection. The confluence of improved analytical platforms and the availability of powerful multivariate analysis software have rendered the multiparameter profiles generated by these omics platforms a user-friendly alternative to the established analysis methods where the quality and practice of a procedure is well defined. However, unlike traditional assays, validation methods for these new multivariate profiling tools have yet to be established. We propose a validation for models obtained by CE fingerprinting of urine from mice infected with the blood fluke Schistosoma mansoni. We have analysed urine samples from two sets of mice infected in an inter-laboratory experiment where different infection methods and animal husbandry procedures were employed in order to establish the core biological response to a S. mansoni infection. CE data were analysed using principal component analysis. Validation of the scores consisted of permutation scrambling (100 repetitions) and a manual validation method, using a third of the samples (not included in the model) as a test or prediction set. The validation yielded 100% specificity and 100% sensitivity, demonstrating the robustness of these models with respect to deciphering metabolic perturbations in the mouse due to a S. mansoni infection. A total of 20 metabolites across the two experiments were identified that significantly discriminated between S. mansoni-infected and noninfected control samples. Only one of these metabolites, allantoin, was identified as manifesting different behaviour in the two experiments. This study shows the reproducibility of CE-based metabolic profiling methods for disease characterization and screening and highlights the importance of much needed validation strategies in the emerging field of metabolomics.
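    The permutation-scrambling validation of a principal component model described in this record can be illustrated generically; a toy sketch on synthetic data (all names, group sizes, and values are mine, not the CE fingerprinting models of the study):

```python
import numpy as np

rng = np.random.default_rng(0)

def pc1_scores(X):
    """Scores on the first principal component (PCA via SVD of centered data)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[0]

def separation(scores, labels):
    """Absolute difference of group mean scores along one component."""
    return abs(scores[labels == 1].mean() - scores[labels == 0].mean())

# Synthetic "control" vs "infected" profiles -- illustrative only
X = np.vstack([rng.normal(0.0, 1.0, (15, 20)),
               rng.normal(1.5, 1.0, (15, 20))])
y = np.array([0] * 15 + [1] * 15)

obs = separation(pc1_scores(X), y)
# Permutation scrambling: recompute the statistic under shuffled labels
null = [separation(pc1_scores(X), rng.permutation(y)) for _ in range(100)]
p_value = (1 + sum(n >= obs for n in null)) / (1 + len(null))
print(round(obs, 2), p_value)
```

    A small permutation p-value indicates that the group separation along the component is unlikely to arise from label scrambling alone, which is the intuition behind the 100-repetition scrambling check reported above.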

  14. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).

    Science.gov (United States)

    Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S

    2016-12-01

    The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.

  15. Are screening instruments valid for psychotic-like experiences? A validation study of screening questions for psychotic-like experiences using in-depth clinical interview.

    LENUS (Irish Health Repository)

    Kelleher, Ian

    2011-03-01

    Individuals who report psychotic-like experiences are at increased risk of future clinical psychotic disorder. They constitute a unique "high-risk" group for studying the developmental trajectory to schizophrenia and related illnesses. Previous research has used screening instruments to identify this high-risk group, but the validity of these instruments has not yet been established. We administered a screening questionnaire with 7 items designed to assess psychotic-like experiences to 334 adolescents aged 11-13 years. Detailed clinical interviews were subsequently carried out with a sample of these adolescents. We calculated sensitivity and specificity and positive predictive value (PPV) and negative predictive value (NPV) for each screening question for the specific symptom it enquired about and also in relation to any psychotic-like experience. The predictive power varied substantially between items, with the question on auditory hallucinations ("Have you ever heard voices or sounds that no one else can hear?") providing the best predictive power. For interview-verified auditory hallucinations specifically, this question had a PPV of 71.4% and an NPV of 90.4%. When assessed for its predictive power for any psychotic-like experience (including, but not limited to, auditory hallucinations), it provided a PPV of 100% and an NPV of 88.4%. Two further questions, relating to visual hallucinations and paranoid thoughts, also demonstrated good predictive power for psychotic-like experiences. Our results suggest that it may be possible to screen the general adolescent population for psychotic-like experiences with a high degree of accuracy using a short self-report questionnaire.
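    The sensitivity, specificity, PPV, and NPV reported in this record follow from a standard 2x2 table of screen result against interview-verified symptom; a generic sketch with invented counts (not the study's data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table
    (tp: screen-positive and symptom verified at interview, etc.)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # P(symptom | screen positive)
        "npv": tn / (tn + fn),   # P(no symptom | screen negative)
    }

# Invented counts for illustration -- not the study's data
m = screening_metrics(tp=10, fp=4, fn=2, tn=84)
print({k: round(v, 3) for k, v in m.items()})
```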

  16. Are screening instruments valid for psychotic-like experiences? A validation study of screening questions for psychotic-like experiences using in-depth clinical interview.

    LENUS (Irish Health Repository)

    Kelleher, Ian

    2012-02-01

    Individuals who report psychotic-like experiences are at increased risk of future clinical psychotic disorder. They constitute a unique "high-risk" group for studying the developmental trajectory to schizophrenia and related illnesses. Previous research has used screening instruments to identify this high-risk group, but the validity of these instruments has not yet been established. We administered a screening questionnaire with 7 items designed to assess psychotic-like experiences to 334 adolescents aged 11-13 years. Detailed clinical interviews were subsequently carried out with a sample of these adolescents. We calculated sensitivity and specificity and positive predictive value (PPV) and negative predictive value (NPV) for each screening question for the specific symptom it enquired about and also in relation to any psychotic-like experience. The predictive power varied substantially between items, with the question on auditory hallucinations ("Have you ever heard voices or sounds that no one else can hear?") providing the best predictive power. For interview-verified auditory hallucinations specifically, this question had a PPV of 71.4% and an NPV of 90.4%. When assessed for its predictive power for any psychotic-like experience (including, but not limited to, auditory hallucinations), it provided a PPV of 100% and an NPV of 88.4%. Two further questions, relating to visual hallucinations and paranoid thoughts, also demonstrated good predictive power for psychotic-like experiences. Our results suggest that it may be possible to screen the general adolescent population for psychotic-like experiences with a high degree of accuracy using a short self-report questionnaire.

  17. Characterization of a CLYC detector and validation of the Monte Carlo Simulation by measurement experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Suk; Ye, Sung Joon [Seoul National University, Seoul (Korea, Republic of); Smith, Martin B.; Koslowsky, Martin R. [Bubble Technology Industries Inc., Chalk River (Canada); Kwak, Sung Woo [Korea Institute of Nuclear Nonproliferation And Control (KINAC), Daejeon (Korea, Republic of); Kim, Gee Hyun [Sejong University, Seoul (Korea, Republic of)

    2017-03-15

    Simultaneous detection of neutrons and gamma rays has become much more practicable by taking advantage of the good gamma-ray discrimination properties of the pulse shape discrimination (PSD) technique. Recently, we introduced a commercial CLYC system in Korea and performed initial characterization and simulation studies for the CLYC detector system to provide references for the future implementation of the dual-mode scintillator system in various studies and applications. We evaluated a CLYC detector with 95% ⁶Li enrichment using various gamma-ray sources and a ²⁵²Cf neutron source, with validation of our Monte Carlo simulation results via measurement experiments. Absolute full-energy peak efficiency values were calculated for the gamma-ray sources and the neutron source using MCNP6 and compared with measurement experiments with the calibration sources. In addition, the behavioral characteristics of neutrons were validated by comparing simulations and experiments on neutron moderation with various polyethylene (PE) moderator thicknesses. Both results showed good agreement in the overall characteristics of the gamma and neutron detection efficiencies, with a consistent ~20% discrepancy. Furthermore, moderation of neutrons emitted from ²⁵²Cf showed similarities between the simulation and the experiment in terms of their relative ratios depending on the thickness of the PE moderator. The CLYC detector system was characterized for its energy resolution and detection efficiency, and Monte Carlo simulations of the detector system were validated experimentally. Validation of the simulation results against the overall trend of the CLYC detector behavior will provide the fundamental basis and validity of follow-up Monte Carlo simulation studies for the development of our dual-particle imager using a rotational modulation collimator.

  18. The structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ)

    Directory of Open Access Journals (Sweden)

    Pieter Schaap

    2016-09-01

    Full Text Available Orientation: Best practice frameworks suggest that an assessment practitioner’s choice of an assessment tool should be based on scientific evidence that underpins the appropriate and just use of the instrument. This is a context-specific validity study involving a classified psychological instrument against the background of South African regulatory frameworks and contemporary validity theory principles. Research purpose: The aim of the study was to explore the structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ) administered to employees in the automotive assembly plant of a South African automotive manufacturing company. Motivation for the study: Although the WLQ has been used by registered health practitioners and numerous researchers, evidence to support the structural validity is lacking. This study, therefore, addressed the need for context-specific empirical support for the validity of score inferences in respect of employees in a South African automotive manufacturing plant. Research design, approach and method: The research was conducted using a convenience sample (N = 217) taken from the automotive manufacturing company where the instrument was used. Reliability and factor analyses were carried out to explore the structural validity of the WLQ. Main findings: The reliability of the WLQ appeared to be acceptable, and the assumptions made about unidimensionality were mostly confirmed. One of the proposed higher-order structural models of the said questionnaire administered to the sample group was confirmed, whereas the other one was partially confirmed. Practical/managerial implications: The conclusion reached was that preliminary empirical grounds existed for considering the continued use of the WLQ (with some suggested refinements) by the relevant company, provided the process of accumulating a body of validity evidence continued. Contribution/value-add: This study identified some of the difficulties

  19. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS AT IDAHO NATIONAL LABORATORY: DESCRIPTION AND SUMMARY OF DATA

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2010-01-01

    Idaho National Laboratory performed air ingress experiments as part of validating a computational fluid dynamics (CFD) code. An isothermal stratified-flow experiment was designed and set up to understand stratified-flow phenomena in the very high temperature gas cooled reactor (VHTR) and to provide experimental data for validating computer codes. The isothermal experiment focused on three flow characteristics unique to the VHTR air-ingress accident: stratified flow in the horizontal pipe, stratified flow expansion at the pipe and vessel junction, and stratified flow around supporting structures. Brine and sucrose were used as the heavy fluids and water was used as the light fluid. The density ratios were varied between 0.87 and 0.98. This experiment clearly showed that a stratified flow between heavy and light fluids is generated even for very small density differences. The code was validated by conducting blind CFD simulations and comparing the results to the experimental data. A grid sensitivity study was also performed based on the Richardson extrapolation and the grid convergence index method for modeling confidence. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for predicting density-gradient stratified-flow phenomena in the air-ingress accident.
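    The Richardson extrapolation and grid convergence index (GCI) procedure mentioned for the grid sensitivity study can be sketched in a few lines; a minimal illustration following Roache's three-grid formulation, with invented current-speed values (function name and numbers are mine, not the INL data):

```python
import math

def grid_convergence(f1, f2, f3, r, Fs=1.25):
    """Observed order of accuracy, Richardson-extrapolated value, and
    fine-grid grid convergence index (GCI) from three systematically
    refined solutions f1 (fine), f2 (medium), f3 (coarse) with a constant
    grid refinement ratio r (Roache's three-grid formulation)."""
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order
    f_exact = f1 + (f1 - f2) / (r ** p - 1.0)           # Richardson extrapolation
    gci_fine = Fs * abs((f1 - f2) / f1) / (r ** p - 1.0)
    return p, f_exact, gci_fine

# Invented current-speed values on three grids -- not the INL data
p, f_ext, gci = grid_convergence(f1=0.971, f2=0.968, f3=0.961, r=2.0)
print(round(p, 2), round(f_ext, 4), f"{gci:.2%}")
```

    A small fine-grid GCI indicates that the solution changes little under further refinement, which is the numerical-accuracy evidence the abstract refers to.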

  20. Validation of dispersion model of RTARC-DSS based on "Kit" field experiments

    International Nuclear Information System (INIS)

    Duran, J.

    2000-01-01

    The aim of this study is to present the performance of the Gaussian dispersion model RTARC-DSS (Real Time Accident Release Consequences - Decision Support System) on the 'Kit' field experiments. The Model Validation Kit is a collection of three experimental data sets from the Kincaid, Copenhagen and Lillestrom campaigns, with a supplementary Indianapolis campaign, accompanied by software for model evaluation. The validation of the model has been performed on the basis of the maximum arc-wise concentrations, using the bootstrap resampling procedure to assess the variation of the model residuals. Validation was performed for short-range distances (about 1-10 km; up to 50 km from the source for the Kincaid data set). The model evaluation procedure and the degree of relative over- or under-prediction by the model are discussed. (author)
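    The class of model being validated here is Gaussian dispersion, which for a ground-level centerline receptor reduces to a single textbook formula; a generic sketch (not the RTARC-DSS implementation; function name and all numbers are illustrative):

```python
import math

def plume_centerline(Q, u, H, sigma_y, sigma_z):
    """Ground-level centerline concentration of a Gaussian plume.
    Q: emission rate (g/s); u: wind speed (m/s); H: effective release
    height (m); sigma_y, sigma_z: dispersion parameters (m) evaluated at
    the downwind distance of interest."""
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-H ** 2 / (2.0 * sigma_z ** 2)))

# Invented source and dispersion parameters -- illustrative only
c = plume_centerline(Q=10.0, u=4.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
print(f"{c:.2e} g/m3")
```

    Arc-wise maxima of the kind used in the Model Validation Kit correspond to evaluating such a formula along arcs of constant downwind distance and taking the peak.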

  1. Non-destructive measurements of nuclear wastes. Validation and industrial operating experience

    International Nuclear Information System (INIS)

    Saas, A.; Tchemitciieff, E.

    1993-01-01

    After a short survey of the means employed for the non-destructive measurement of specific activities (γ and X-ray) in waste packages and raw waste, the performance of the devices and the ANDRA requirements are presented. The γ and X-ray measurements on packages are validated by determining the same activity, by destructive means, on core samples. The same procedure is used to validate the homogeneity measurements on packages (either homogeneous or heterogeneous). Operating experience is then presented for several kinds of packages and waste. Up to now, about twenty different types of packages have been examined, and more than 200 packages have supported the calibration, validation, and control

  2. Examining students' views about validity of experiments: From introductory to Ph.D. students

    Science.gov (United States)

    Hu, Dehui; Zwickl, Benjamin M.

    2018-06-01

    We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.

  3. [Caregiver's health: adaptation and validation in a Spanish population of the Experience of Caregiving Inventory (ECI)].

    Science.gov (United States)

    Crespo-Maraver, Mariacruz; Doval, Eduardo; Fernández-Castro, Jordi; Giménez-Salinas, Jordi; Prat, Gemma; Bonet, Pere

    2018-04-04

    To adapt and to validate the Experience of Caregiving Inventory (ECI) in a Spanish population, providing empirical evidence of its internal consistency, internal structure and validity. Psychometric validation of the adapted version of the ECI. One hundred and seventy-two caregivers (69.2% women), mean age 57.51 years (range: 21-89) participated. Demographic and clinical data, standardized measures (ECI, suffering scale of SCL-90-R, Zarit burden scale) were used. The two scales of negative evaluation of the ECI most related to serious mental disorders (disruptive behaviours [DB] and negative symptoms [NS]) and the two scales of positive appreciation (positive personal experiences [PPE], and good aspects of the relationship [GAR]) were analyzed. Exploratory structural equation modelling was used to analyze the internal structure. The relationship between the ECI scales and the SCL-90-R and Zarit scores was also studied. The four-factor model presented a good fit. Cronbach's alpha (DB: 0.873; NS: 0.825; PPE: 0.720; GAR: 0.578) showed a higher homogeneity in the negative scales. The SCL-90-R scores correlated with the negative ECI scales, and none of the ECI scales correlated with the Zarit scale. The Spanish version of the ECI can be considered a valid, reliable, understandable and feasible self-report measure for its administration in the health and community context. Copyright © 2018 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.
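    The internal-consistency figures quoted above are Cronbach's alpha values, computed with the standard formula α = k/(k−1)·(1 − Σ var_i / var_total). A minimal generic sketch with toy scores (not ECI data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for item-score columns, where items[i][j] is
    the score of respondent j on item i.  Illustrative sketch of the
    reliability index reported for the ECI scales."""
    k = len(items)
    n = len(items[0])
    # Total score per respondent across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    # Sum of item variances vs. variance of the total score
    item_vars = sum(statistics.variance(item) for item in items)
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

    Perfectly parallel items yield alpha = 1; weakly related items pull the value down, which is why the more heterogeneous positive scales (e.g. GAR: 0.578) score lower than the negative ones.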

  4. Experiment of Laser Pointing Stability on Different Surfaces to validate Micrometric Positioning Sensor

    CERN Document Server

    AUTHOR|(SzGeCERN)721924; Mainaud Durand, Helene; Piedigrossi, Didier; Sandomierski, Jacek; Sosin, Mateusz; Geiger, Alain; Guillaume, Sebastien

    2014-01-01

    CLIC requires 10 μm precision and accuracy over 200 m for the pre-alignment of beam-related components. A solution based on a laser beam as a straight-line reference is being studied at CERN. It involves camera/shutter assemblies as micrometric positioning sensors. To validate the sensors, it is necessary to determine an appropriate material for the shutter in terms of laser pointing stability. Experiments were carried out with paper, metal and ceramic surfaces. This paper presents the standard deviations of the laser spot coordinates obtained on the different surfaces, as well as the measurement error. Our experiments validate the choice of paper and ceramic for the shutter of the micrometric positioning sensor, and they provide an estimate of the achievable precision and accuracy of the determination of the laser spot centre with respect to the shutter coordinate system defined by reference targets.

  5. The Role of Laboratory Experiments in the Validation of Field Data

    DEFF Research Database (Denmark)

    Mouneyrac, Catherine; Lagarde, Fabienne; Chatel, Amelie

    2017-01-01

    The ubiquitous presence and persistency of microplastics (MPs) in aquatic environments are of particular concern, since they constitute a potential threat to marine organisms and ecosystems. However, evaluating this threat and the impacts of MP on aquatic organisms is challenging. MPs form a very...... and to what degree these complexities are addressed in the current literature, to: (1) evaluate how well laboratory studies, investigated so far, represent environmentally relevant processes and scenarios and (2) suggest directions for future research....

  6. Validation of neutronic methods applied to the analysis of fast subcritical systems. The MUSE-2 experiments

    International Nuclear Information System (INIS)

    Soule, R.; Salvatores, M.; Jacqmin, R.; Martini, M.; Lebrat, J.F.; Bertrand, P.; Broccoli, U.; Peluso, V.

    1997-01-01

    In the framework of the French SPIN program devoted to the separation and the transmutation of radioactive wastes, the CEA has launched the ISAAC program to investigate the potential of accelerator-driven systems and to provide an experimental validation of the physics characteristics of these systems. The neutronics of the subcritical core needs experimental validation. This can be done by decoupling the problem of the neutron source from the problem of the subcritical medium. Experiments with a well known external source placed in a subcritical medium have been performed in the MASURCA facility. The results confirm the high accuracy achievable with such experiments and the good quality of the ERANOS code system predictions. (author)

  7. Validation of neutronic methods applied to the analysis of fast subcritical systems. The MUSE-2 experiments

    Energy Technology Data Exchange (ETDEWEB)

    Soule, R; Salvatores, M; Jacqmin, R; Martini, M; Lebrat, J F; Bertrand, P [CEA Centre d'Etudes de Cadarache, Service de Physique des Reacteurs et du Cycle, 13 - Saint-Paul-lez-Durance (France); Broccoli, U; Peluso, V

    1998-12-31

    In the framework of the French SPIN program devoted to the separation and the transmutation of radioactive wastes, the CEA has launched the ISAAC program to investigate the potential of accelerator-driven systems and to provide an experimental validation of the physics characteristics of these systems. The neutronics of the subcritical core needs experimental validation. This can be done by decoupling the problem of the neutron source from the problem of the subcritical medium. Experiments with a well known external source placed in a subcritical medium have been performed in the MASURCA facility. The results confirm the high accuracy achievable with such experiments and the good quality of the ERANOS code system predictions. (author)

  8. Validation of a numerical FSI simulation of an aortic BMHV by in vitro PIV experiments.

    Science.gov (United States)

    Annerel, S; Claessens, T; Degroote, J; Segers, P; Vierendeels, J

    2014-08-01

    In this paper, a validation of a recently developed fluid-structure interaction (FSI) coupling algorithm to simulate numerically the dynamics of an aortic bileaflet mechanical heart valve (BMHV) is performed. This validation is done by comparing the numerical simulation results with in vitro experiments. For the in vitro experiments, the leaflet kinematics and flow fields are obtained via the particle image velocimetry (PIV) technique. Subsequently, the same case is numerically simulated by the coupling algorithm and the resulting leaflet kinematics and flow fields are obtained. Finally, the results are compared, revealing great similarity in leaflet motion and flow fields between the numerical simulation and the experimental test. Therefore, it is concluded that the developed algorithm is able to capture very accurately all the major leaflet kinematics and dynamics and can be used to study and optimize the design of BMHVs. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.

  9. Validation experiment of a numerically processed millimeter-wave interferometer in a laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kogi, Y., E-mail: kogi@fit.ac.jp; Higashi, T.; Matsukawa, S. [Department of Information Electronics, Fukuoka Institute of Technology, Fukuoka 811-0295 (Japan); Mase, A. [Art, Science and Technology Center for Cooperative Research, Kyushu University, Kasuga, Fukuoka 816-0811 (Japan); Kohagura, J.; Yoshikawa, M. [Plasma Research Center, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Nagayama, Y.; Kawahata, K. [National Institute for Fusion Science, Toki, Gifu 509-5202 (Japan); Kuwahara, D. [Tokyo University of Agriculture and Technology, Koganei, Tokyo 184-8588 (Japan)

    2014-11-15

    We propose a new interferometer system for density profile measurements. This system produces multiple measurement chords by a leaky-wave antenna driven by multiple frequency inputs. The proposed system was validated in laboratory evaluation experiments. We confirmed that the interferometer generates a clear image of a Teflon plate as well as the phase shift corresponding to the plate thickness. In another experiment, we confirmed that quasi-optical mirrors can produce multiple measurement chords; however, the finite spot size of the probe beam degrades the sharpness of the resulting image.
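    The phase shift that a dielectric plate introduces into one interferometer chord follows the standard relation Δφ = 2π(n − 1)t/λ. The sketch below uses assumed, illustrative values (PTFE refractive index ≈ 1.43, a 10 mm plate, a 4 mm probing wavelength), not parameters taken from the paper.

```python
import math

def plate_phase_shift(thickness_m, n_plate, wavelength_m):
    """Phase shift (radians) added by a dielectric plate of the given
    thickness and refractive index in a single measurement chord:
    delta_phi = 2*pi*(n - 1)*t / wavelength.  Illustrative sketch."""
    return 2.0 * math.pi * (n_plate - 1.0) * thickness_m / wavelength_m

# Assumed example: 10 mm PTFE plate probed at a 4 mm wavelength
shift = plate_phase_shift(10e-3, 1.43, 4e-3)
```

    Inverting this relation is how a measured phase map is converted back into plate thickness, the check reported in the laboratory validation.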

  10. Validation of the containment code Sirius: interpretation of an explosion experiment on a scale model

    International Nuclear Information System (INIS)

    Blanchet, Y.; Obry, P.; Louvet, J.; Deshayes, M.; Phalip, C.

    1979-01-01

    The explicit 2-D axisymmetric Lagrangian code SIRIUS, developed at the CEA/DRNR, Cadarache, deals with transient compressive flows in deformable primary tanks with more or less complex internal component geometries. This code has been subjected to a two-year intensive validation program on scale-model experiments, and a number of improvements have been incorporated. This paper presents a recent calculation of one of these experiments using the SIRIUS code, and the comparison with experimental results shows the encouraging possibilities of this Lagrangian code.

  11. Development and Validation of an Instrument for Assessing Patient Experience of Chronic Illness Care

    Directory of Open Access Journals (Sweden)

    José Joaquín Mira

    2016-08-01

    Introduction: The experience of chronic patients with the care they receive, fuelled by the focus on patient-centeredness and the increasing evidence on its positive relation with other dimensions of quality, is being acknowledged as a key element in improving the quality of care. There is a dearth of accepted tools and metrics to assess patient experience from the patient's perspective that have been adapted to the new chronic care context: continued, systemic, with multidisciplinary teams and new technologies. Methods: Development and validation of a scale through a literature review, an expert panel, and pilot and field studies with 356 chronic primary care patients, to assess content and face validity and reliability. Results: IEXPAC is an 11+1 item scale with adequate metric properties as measured by Cronbach's alpha and the goodness-of-fit index, and satisfactory convergent validity around three factors named: productive interactions, new relational model and person's self-management. Conclusions: IEXPAC allows measurement of the patient experience of chronic illness care. Together with other indicators, IEXPAC can determine the quality of care provided according to the Triple Aim framework, facilitating health systems' reorientation towards integrated patient-centred care.

  12. Development and Validation of a Scale Assessing Mental Health Clinicians' Experiences of Associative Stigma.

    Science.gov (United States)

    Yanos, Philip T; Vayshenker, Beth; DeLuca, Joseph S; O'Connor, Lauren K

    2017-10-01

    Mental health professionals who work with people with serious mental illnesses are believed to experience associative stigma. Evidence suggests that associative stigma could play an important role in the erosion of empathy among professionals; however, no validated measure of the construct currently exists. This study examined the convergent and discriminant validity and factor structure of a new scale assessing the associative stigma experiences of clinicians working with people with serious mental illnesses. A total of 473 clinicians were recruited from professional associations in the United States and participated in an online study. Participants completed the Clinician Associative Stigma Scale (CASS) and measures of burnout, quality of care, expectations about recovery, and self-efficacy. Associative stigma experiences were commonly endorsed; eight items on the 18-item scale were endorsed as being experienced "sometimes" or "often" by over 50% of the sample. The new measure demonstrated a logical four-factor structure: "negative stereotypes about professional effectiveness," "discomfort with disclosure," "negative stereotypes about people with mental illness," and "stereotypes about professionals' mental health." The measure had good internal consistency. It was significantly related to measures of burnout and quality of care, but it was not related to measures of self-efficacy or expectations about recovery. Findings suggest that the CASS is internally consistent and shows evidence of convergent validity and that associative stigma is commonly experienced by mental health professionals who work with people with serious mental illnesses.

  13. Validation experiments of nuclear characteristics of the fast-thermal system HERBE

    International Nuclear Information System (INIS)

    Pesic, M.; Zavaljevski, N.; Marinkovic, P.; Stefanovis, D.; Nikolic, D.; Avdic, S.

    1992-01-01

    In 1988/90, a coupled fast-thermal system, HERBE, based on similar facilities, was designed and built at the RB reactor. The fast core of HERBE, built of natural uranium fuel, is located at the centre of the RB reactor, surrounded by a neutron filter and a neutron converter housed in an independent Al tank. The fast zone is surrounded by a thermal-neutron core driver. The designed nuclear characteristics of the HERBE core were validated in the experiments described in this paper. HERBE cell parameters were calculated with the in-house computer codes VESNA and DENEB. HERBE system criticality calculations were performed with the four-group 2D RZ codes GALER and TWENTY GRAND, the 1D multi-group AVERY code and the 3D XYZ few-group TRITON code. Experiments to determine the critical level, dρ/dH and the reactivity of the safety rods were carried out to validate the calculated results. A dedicated safety experiment was performed to determine the reactivity of the flooded fast zone in a possible accident. Very good agreement with the calculated results was obtained, and the validation procedures are presented. It is expected that HERBE will offer qualitatively new opportunities for work with fast neutrons at the RB reactor, including nuclear data determination. (author)
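    Rod-worth comparisons of the kind described above rest on the standard static-reactivity definition ρ = (k_eff − 1)/k_eff. A generic helper, with illustrative k values that are not HERBE results:

```python
def reactivity(k_eff):
    """Static reactivity from an effective multiplication factor,
    rho = (k - 1)/k, returned in pcm (1 pcm = 1e-5).  Generic sketch
    for comparing calculated and measured reactivities."""
    return (k_eff - 1.0) / k_eff * 1e5

def rod_worth(k_without, k_with):
    """Worth of a safety rod as the reactivity difference (pcm)
    between the configurations without and with the rod inserted."""
    return reactivity(k_without) - reactivity(k_with)

# Assumed example: rod insertion takes the system from critical
# (k = 1.000) to k = 0.990
worth_pcm = rod_worth(1.000, 0.990)
```

    The same difference formula applies to the flooded-fast-zone experiment: the reactivity change between the nominal and flooded configurations is the quantity compared against calculation.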

  14. Panamanian women's experience of vaginal examination in labour: A questionnaire validation.

    Science.gov (United States)

    Bonilla-Escobar, Francisco J; Ortega-Lenis, Delia; Rojas-Mirquez, Johanna C; Ortega-Loubon, Christian

    2016-05-01

    to validate a tool that allows healthcare providers to obtain accurate information regarding Panamanian women's thoughts and feelings about vaginal examination during labour and that can be used in other Latin-American countries. validation study based on a database from a cross-sectional study carried out in two tertiary care hospitals in Panama City, Panama. Women in the immediate postpartum period who had spontaneous labour onset and uncomplicated deliveries were included in the study from April to August 2008. Researchers used a survey designed by Lewin et al. that included 20 questions related to a patient's experience during a vaginal examination. five constructs (factors) related to a patient's experience of vaginal examination during labour were identified: Approval (Cronbach's alpha 0.72), Perception (0.67), Rejection (0.40), Consent (0.51), and Stress (0.20). the validity of the scale and of its constructs, used to obtain information related to vaginal examination during labour, including patients' experiences with the examination and healthcare staff performance, was demonstrated. utilisation of the scale will allow institutions to identify items that need improvement and to address these areas in order to promote the best care for patients in labour. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik (ed.)

    2016-04-15

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, especially the quality and modeling issues of the freely available experimental data are the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of their experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.

  16. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik

    2016-04-01

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, especially the quality and modeling issues of the freely available experimental data are the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of their experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.

  17. Validation results of satellite mock-up capturing experiment using nets

    Science.gov (United States)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

    The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation was performed through a set of experiments under microgravity conditions in which a net was launched to capture and wrap a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator, together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment was performed over thirty parabolas, each offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launching angles using a dedicated pneumatic mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired were post-processed to determine the initial conditions accurately and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly

  18. Sessile macro-epibiotic community of solitary ascidians, ecosystem engineers in soft substrates of Potter Cove, Antarctica

    OpenAIRE

    Rimondino, Clara; Torre, Luciana; Sahade, Ricardo Jose; Tatian, Marcos

    2016-01-01

    The muddy bottoms of inner Potter Cove, King George Island (Isla 25 de Mayo), South Shetlands, Antarctica, show a high density and richness of macrobenthic species, particularly ascidians. In other areas, ascidians have been reported to play the role of ecosystem engineers, as they support a significant number of epibionts, increasing benthic diversity. In this study, a total of 21 sessile macro-epibiotic taxa present on the ascidian species Corella antarctica Sluiter, 1905, Cnemidocarpa verr...

  19. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction

    Science.gov (United States)

    Davis, David O.

    2015-01-01

    Experimental investigations of specific flow phenomena, e.g., shock-wave/boundary-layer interactions (SWBLI), provide great insight into the flow behavior but often lack the details necessary to be useful as CFD validation experiments. Reasons include undefined boundary conditions, inconsistent results, undocumented 3D effects (centerline-only measurements), and a lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence-model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.

  20. Validation of ASTEC v2.0 corium jet fragmentation model using FARO experiments

    International Nuclear Information System (INIS)

    Hermsmeyer, S.; Pla, P.; Sangiorgi, M.

    2015-01-01

    Highlights: • Model validation base extended to six FARO experiments. • Focus on the calculation of the fragmented particle diameter. • Capability and limits of the ASTEC fragmentation model. • Sensitivity analysis of model outputs. - Abstract: ASTEC is an integral code for the prediction of severe accidents in nuclear power plants. As such, it needs to cover all physical processes that could occur during accident progression, yet keep its models simple enough for the ensemble to stay manageable and produce results within an acceptable time. The present paper is concerned with the validation of the corium jet fragmentation model of ASTEC v2.0 rev3 by means of a selection of six experiments carried out in the FARO facility. The different conditions applied in these six experiments help to analyse the model behaviour in different situations and to expose model limits. In addition to comparing model outputs with experimental measurements, sensitivity analyses are applied to investigate the model. The results of the paper are (i) validation runs, accompanied by an identification of situations where the implemented fragmentation model does not match the experiments well, and a discussion of the results; (ii) special attention to the models calculating the diameter of the fragmented particles, the identification of a fault in one implemented model, and a discussion of simplifications and ad hoc modifications to improve the model fit; and (iii) an investigation of the sensitivity of the predictions towards inputs and parameters. In this way, the paper offers a thorough investigation of the merits and limitations of the fragmentation model used in ASTEC

  1. BACCHUS 2: an in situ backfill hydration experiment for model validation

    International Nuclear Information System (INIS)

    Volckaert, G.; Bernier, F.; Alonso, E.; Gens, A.

    1995-01-01

    The BACCHUS 2 experiment is an in situ backfill hydration test performed in the HADES underground research facility, situated in the plastic Boom clay layer at 220 m depth. The experiment aims at the optimization and demonstration of an installation procedure for a clay-based backfill material. The instrumentation has been optimized in such a way that the results of the experiment can be used for the validation of hydro-mechanical codes such as NOSAT, developed at the Technical University of Catalonia (UPC), Spain. The experimental set-up consists of a bottom flange and a central filter around which the backfill material was applied. The backfill material consists of a mixture of high-density clay pellets and clay powder. The experimental set-up and its instrumentation are described in detail. The results of the hydro-mechanical characterization of the backfill material are summarized. (authors). 8 refs., 16 figs., 1 tab

  2. Strain gauge validation experiments for the Sandia 34-meter VAWT (Vertical Axis Wind Turbine) test bed

    Science.gov (United States)

    Sutherland, Herbert J.

    1988-08-01

    Sandia National Laboratories has erected a research-oriented, 34-meter diameter Darrieus vertical axis wind turbine near Bushland, Texas. This machine, designated the Sandia 34-m VAWT Test Bed, is equipped with a large array of strain gauges that have been placed at critical positions about the blades. This manuscript details a series of four-point bend experiments that were conducted to validate the output of the blade strain gauge circuits. The output of a particular gauge circuit is validated by comparing its output to equivalent gauge circuits (in this stress state) and to theoretical predictions. With only a few exceptions, the difference between measured and predicted strain values for a gauge circuit was found to be of the order of the estimated repeatability of the measurement system.
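    The theoretical predictions referred to above come from beam theory: in the constant-moment span of a four-point bend test, the bending moment is M = (P/2)·a and the surface strain is ε = M·(h/2)/(E·I). The sketch below is generic Euler-Bernoulli theory for a rectangular section with assumed dimensions, not the Test Bed blade geometry.

```python
def fourpoint_strain(load_total, a, width, height, modulus):
    """Theoretical surface strain in the constant-moment span of a
    four-point bend test on a rectangular beam.  Generic sketch:
    M = (P/2)*a, I = b*h^3/12, eps = M*(h/2)/(E*I)."""
    moment = 0.5 * load_total * a            # constant moment, N*m
    inertia = width * height**3 / 12.0       # second moment of area, m^4
    return moment * (height / 2.0) / (modulus * inertia)

# Assumed example: 1 kN total load, 0.5 m support-to-load spacing,
# 50 mm x 100 mm aluminium section (E ~ 70 GPa)
strain = fourpoint_strain(1000.0, 0.5, 0.05, 0.1, 70e9)
```

    Comparing such a prediction with the gauge-circuit output, after correcting for gauge factor and bridge configuration, is the essence of the validation described in the abstract.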

  3. Validation experiments of the chimney model for the operational simulation of hydrogen recombiners

    International Nuclear Information System (INIS)

    Simon, Berno

    2013-01-01

    The calculation program REKO-DIREKT allows the simulation of the operational behavior of a hydrogen recombiner during accidents with hydrogen release. The interest is focused on the interaction between the catalyst insert and the chimney, which significantly influences the natural convection and thus the throughput through the recombiner. For validation, experiments were performed with a small-scale recombiner model in the test facility REKO-4. The results show the correlation between the hydrogen concentration at the recombiner inlet, the temperature of the catalyst sheets and the inlet velocity for different chimney heights. The inlet velocity increases with the height of the installed chimney. The results provide a broad database for the validation of the computer code REKO-DIREKT.

  4. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants are asked to read an interview transcript and identify whether program theory components (Activities and Outcomes) are discussed and to highlight the most relevant passage about that component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
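    The aggregation step described above, deciding whether a program theory component is supported across crowdworker judgments, can be sketched as a simple majority vote. This is an illustrative reduction, not the authors' exact analysis rule, and the component names below are hypothetical.

```python
from collections import Counter

def majority_vote(judgments):
    """Aggregate crowd judgments ('present'/'absent') for each
    program-theory component by simple majority.  Illustrative
    sketch of pattern-matching aggregation."""
    result = {}
    for component, votes in judgments.items():
        # most_common(1) returns the single most frequent label
        result[component] = Counter(votes).most_common(1)[0][0]
    return result

# Hypothetical components and three crowdworkers' judgments each
consensus = majority_vote({
    "Activity: weekly mentoring": ["present", "present", "absent"],
    "Outcome: improved grades": ["absent", "absent", "present"],
})
```

    Components voted "present" match the participant's experience to the program theory; "absent" components flag where the theory is not supported by that interview.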

  5. Design of experiments in medical physics: Application to the AAA beam model validation.

    Science.gov (United States)

    Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D

    2017-09-01

    The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1, and Y2 jaw dimensions, wedge, and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 chamber on a TrueBeam STx, a TrueBeam Tx, a Trilogy, and a 2300IX accelerator, all matched by the vendor. Dose was computed using the AAA algorithm; the same raw data were used for all accelerators during beam modelling. The mean difference between computed and measured doses was 0.1 ± 0.5% for all beams and all accelerators, with a maximum difference of 2.4% (under the 3% tolerance level). For all beams, the measured doses agreed within 0.6% across accelerators. Energy was found to be an influencing parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Designs of experiments can help define the optimal measurement set to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the 4 accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
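    The pass/fail logic of such a validation — percent dose differences per test point, summarized by mean and spread and checked against a tolerance — can be sketched as follows. The dose values are illustrative placeholders, not the study's measurements:

```python
import statistics

def validate_beam_model(computed, measured, tolerance_pct=3.0):
    """Percent differences between computed and measured doses, with a
    mean/spread summary and a worst-case check against a tolerance."""
    diffs = [100.0 * (c - m) / m for c, m in zip(computed, measured)]
    mean = statistics.mean(diffs)
    spread = statistics.stdev(diffs) if len(diffs) > 1 else 0.0
    worst = max(abs(d) for d in diffs)
    return mean, spread, worst, worst <= tolerance_pct

# Illustrative doses (Gy) for a handful of the 72 Taguchi test points
computed = [1.002, 0.998, 1.010, 0.995, 1.024]
measured = [1.000, 1.000, 1.000, 1.000, 1.000]
mean, spread, worst, passed = validate_beam_model(computed, measured)
```

    Here the worst point deviates by 2.4%, so the set passes a 3% tolerance, mirroring the kind of summary reported above.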

  6. [Questionnaire on dissociative symptoms. German adaptation, reliability and validity of the American Dissociative Experience Scale (DES)].

    Science.gov (United States)

    Freyberger, H J; Spitzer, C; Stieglitz, R D; Kuhn, G; Magdeburg, N; Bernstein-Carlson, E

    1998-06-01

    The "Fragebogen zu dissoziativen Symptomen (FDS)" represents the authorised German translation and adaptation of the "Dissociative Experience Scale" (DES; Bernstein and Putnam 1986). The original scale comprises 28 items covering dissociative experiences with regard to memory, identity, awareness and cognition according to DSM-III-R and DSM-IV. For the German version, 16 items were added to cover dissociative phenomena according to ICD-10, mainly pseudoneurological conversion symptoms. Reliability and validity of the German version were studied in a total sample of 813 persons and were compared to the results of the original version. Test-retest reliability of the FDS was r_tt = 0.88 and Cronbach's consistency coefficient was alpha = 0.93, which is comparable to the results of the DES. The instrument differentiates between different samples (healthy control subjects, students, unselected neurological and psychiatric inpatients, neurological and psychiatric patients with a dissociative disorder and schizophrenics). The FDS is an easily applicable, reliable and valid measure to quantify dissociative experiences.
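    Cronbach's alpha, the internal-consistency coefficient reported above, can be computed directly from a respondents-by-items score matrix. A minimal, dependency-free sketch; the scores are made up, not FDS data:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of respondents' item scores.

    scores: one row per respondent, each a list of item scores.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(scores[0])
    items = list(zip(*scores))            # transpose to per-item columns
    totals = [sum(row) for row in scores]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Perfectly correlated items yield the maximum alpha of 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

    Values around 0.9, as reported for the FDS and DES, indicate high internal consistency of the item set.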

  7. Validation of a CFD Analysis Model for Predicting CANDU-6 Moderator Temperature Against SPEL Experiments

    International Nuclear Information System (INIS)

    Churl Yoon; Bo Wook Rhee; Byung-Joo Min

    2002-01-01

    A validation of a 3D CFD model for predicting local subcooling of the moderator in the vicinity of calandria tubes in a CANDU-6 reactor is performed. The small-scale moderator experiments performed at Sheridan Park Experimental Laboratory (SPEL) in Ontario, Canada [1] are used for the validation. A comparison is also made between previous CFD analyses based on 2DMOTH and PHOENICS and the current analysis for the same SPEL experiment. For the current model, a set of grid structures for the same geometry as the experimental test section is generated, and the momentum, heat, and continuity equations are solved by CFX-4.3, a CFD code developed by AEA Technology. The matrix of calandria tubes is simplified by the porous media approach. The standard k-ε turbulence model with logarithmic wall treatment and the SIMPLEC algorithm on a body-fitted grid are used. Buoyancy effects are accounted for by the Boussinesq approximation. For the test conditions simulated in this study, the identified flow pattern is buoyancy-dominated, generated by the interaction between the dominant buoyancy force from heating and the inertial momentum forces of the inlet jets. As a result, the current CFD moderator analysis model predicts the moderator temperature reasonably well, and the maximum error against the experimental data is less than 2.0 deg. C over the whole domain. The simulated velocity field matches the flow visualization of the SPEL experiments quite well. (authors)
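    The Boussinesq approximation mentioned above treats the density as constant everywhere except in the gravity term, where a buoyancy source proportional to the local temperature excess is added to the momentum equation. In standard form (symbols follow common CFD convention, not notation from the paper):

```latex
% Density linearized about a reference state:
\rho \approx \rho_0 \left[ 1 - \beta \,(T - T_{\mathrm{ref}}) \right]
% Resulting buoyancy source term in the momentum equation:
\vec{F}_b = -\rho_0 \, \beta \,(T - T_{\mathrm{ref}})\, \vec{g}
```

    Here \(\beta\) is the thermal expansion coefficient; the approximation is valid for the small temperature differences typical of moderator subcooling studies.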

  8. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    Energy Technology Data Exchange (ETDEWEB)

    Hilmy, N. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)], E-mail: nazly@batan.go.id; Febrida, A.; Basril, A. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)

    2007-11-15

    The main problem in applying International Standard ISO 11137 to the validation of the radiation sterilization dose (RSD) for tissue allografts is the limited, small number of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for the verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD: method A1, a modification of method 1 of ISO 11137:1995; method B (ISO 13409:1996); and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.

  9. Experiments of Laser Pointing Stability in Air and in Vacuum to Validate Micrometric Positioning Sensor

    CERN Document Server

    Stern, G; Piedigrossi, D; Sandomierski, J; Sosin, M; Geiger, A; Guillaume, S

    2014-01-01

    Aligning accelerator components over 200 m with 10 μm accuracy is a challenging task within the Compact Linear Collider (CLIC) study. A solution based on a laser beam in vacuum as a straight-line reference is proposed. The positions of the accelerator's components are measured with respect to the laser beam by sensors made of camera/shutter assemblies. To validate these sensors, laser pointing stability has to be studied over 200 m. We perform experiments in air and in vacuum in order to determine how laser pointing stability varies with the propagation distance and with the environment. The experiments show that the standard deviations of the laser spot coordinates increase with the propagation distance. They also show that the standard deviations are much smaller in vacuum (8 μm at 35 m) than in air (2000 μm at 200 m). Our experiment validates the concept of a laser beam in vacuum with camera/shutter assemblies for micrometric positioning over 35 m. It also gives an estimation of the achievable precision.

  10. Ensemble of cell survival experiments after ion irradiation for validation of RBE models

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, Thomas; Scholz, Uwe; Scholz, Michael [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Durante, Marco [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Institut fuer Festkoerperphysik, TU Darmstadt, Darmstadt (Germany)

    2012-07-01

    There is persistent interest in understanding the systematics of the relative biological effectiveness (RBE). Models such as the Local Effect Model (LEM) or the Microdosimetric Kinetic Model aim to predict the RBE. For the validation of these models, a collection of many in-vitro cell survival experiments is most appropriate. The set-up of an ensemble of in-vitro cell survival data comprising about 850 survival experiments after both ion and photon irradiation is reported. The survival curves have been taken from publications. The experiments encompass survival curves obtained in different labs, using different ion species from protons to uranium, varying irradiation modalities (shaped or monoenergetic beam), various energies and linear energy transfers, and a whole variety of cell types (human or rodent; normal, mutagenic or tumor; radioresistant or -sensitive). Each cell survival curve has been parameterized by the linear-quadratic model. The photon parameters have been added to the database to allow calculation of the experimental RBE at any survival level. We report on experimental trends found within the data ensemble. The data will serve as a testing ground for RBE models such as the LEM. Finally, a roadmap for further validation and first model results using the database in combination with the LEM are presented.
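    Given linear-quadratic (LQ) parameters for a photon reference curve and an ion curve, the RBE at any survival level follows from inverting S = exp(-(αD + βD²)) for the dose. A sketch with made-up LQ parameters, not values from the ensemble:

```python
import math

def lq_dose(alpha, beta, survival):
    """Dose giving the requested survival under S = exp(-(a*D + b*D^2))."""
    effect = -math.log(survival)
    if beta == 0.0:
        return effect / alpha
    # Positive root of beta*D^2 + alpha*D - effect = 0
    return (-alpha + math.sqrt(alpha**2 + 4.0 * beta * effect)) / (2.0 * beta)

def rbe(photon_ab, ion_ab, survival=0.1):
    """RBE = photon dose / ion dose at the same survival level."""
    return lq_dose(*photon_ab, survival) / lq_dose(*ion_ab, survival)

# Hypothetical LQ fits: photons (alpha=0.2/Gy, beta=0.05/Gy^2),
# ions with enhanced alpha (0.6/Gy, same beta)
r = rbe((0.2, 0.05), (0.6, 0.05), survival=0.1)
```

    Because the ion curve here has a larger α, the same survival level is reached at a lower dose and the resulting RBE exceeds 1, the qualitative trend such model comparisons probe.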

  11. International integral experiments databases in support of nuclear data and code validation

    International Nuclear Information System (INIS)

    Briggs, J. Blair; Gado, Janos; Hunter, Hamilton; Kodeli, Ivan; Salvatores, Massimo; Sartori, Enrico

    2002-01-01

    The OECD/NEA Nuclear Science Committee (NSC) has identified the need to establish international databases containing all the important experiments that are available for sharing among specialists, and has set up or sponsored specific activities to achieve this. The aim is to preserve the experiments in an agreed standard format in computer-accessible form, to use them for international activities involving validation of current and new calculational schemes, including computer codes and nuclear data libraries, for assessing uncertainties, confidence bounds and safety margins, and to record measurement methods and techniques. The databases established or in preparation so far for nuclear data validation cover the following areas: SINBAD - a radiation shielding experiments database encompassing reactor shielding, fusion blanket neutronics, and accelerator shielding; ICSBEP - the International Criticality Safety Benchmark Experiments Project Handbook, with more than 2500 critical configurations with different combinations of materials and spectral indices; IRPhEP - the International Reactor Physics Experimental Benchmarks Evaluation Project. The different projects are described below, including results achieved, work in progress, and plans. (author)

  12. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP). Most of the experiments were performed as series and share parts of the experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems inevitable if the experimental data in a validation procedure are limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data, as well as their consideration in a validation procedure, is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff's, their uncertainties, and the corresponding covariance matrices due to the interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches and varying distribution functions of parameters, and compares and discusses results from the applied Monte Carlo sampling method with available data on correlations. Our findings indicate that the reliable determination of integral experimental covariance matrices or correlation coefficients requires a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application
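    The Monte Carlo sampling approach can be illustrated with a toy model: two benchmark cases share one uncertain parameter (e.g., an enrichment perturbation) while each also has an independent one (e.g., a geometry tolerance); sampling both and correlating the resulting k_eff responses yields the off-diagonal correlation coefficient. The linear response functions and uncertainty magnitudes below are invented for illustration only:

```python
import random

def pearson(a, b):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def sample_keffs(n=20000, seed=1):
    """Toy k_eff responses for two experiments sharing one parameter."""
    rng = random.Random(seed)
    k1, k2 = [], []
    for _ in range(n):
        shared = rng.gauss(0.0, 1.0)   # shared perturbation (e.g. enrichment)
        k1.append(1.0 + 0.004 * shared + 0.002 * rng.gauss(0.0, 1.0))
        k2.append(1.0 + 0.004 * shared + 0.002 * rng.gauss(0.0, 1.0))
    return k1, k2

k1, k2 = sample_keffs()
corr = pearson(k1, k2)
```

    With these toy uncertainties the analytic correlation is 0.8; the sampled estimate converges to it, showing how shared setup parameters alone induce strongly correlated benchmark results.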

  13. Generation of integral experiment covariance data and their impact on criticality safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-01

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP). Most of the experiments were performed as series and share parts of the experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems inevitable if the experimental data in a validation procedure are limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data, as well as their consideration in a validation procedure, is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff's, their uncertainties, and the corresponding covariance matrices due to the interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches and varying distribution functions of parameters, and compares and discusses results from the applied Monte Carlo sampling method with available data on correlations. Our findings indicate that the reliable determination of integral experimental covariance matrices or correlation coefficients requires a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application

  14. Pressure-Velocity Correlations in the Cove of a Leading Edge Slat

    Science.gov (United States)

    Wilkins, Stephen; Richard, Patrick; Hall, Joseph

    2015-11-01

    One of the major sources of aircraft airframe noise is the deployment of high-lift devices, such as leading-edge slats, particularly when the aircraft is preparing to land. As the engines are throttled back, the noise produced by the airframe itself is of great concern, as the aircraft is low enough for the noise to impact civilian populations. In order to reduce the aeroacoustic noise sources associated with these high-lift devices for the next generation of aircraft, an experimental investigation is conducted of the correlation between multi-point surface-mounted fluctuating pressures, measured via flush-mounted microphones, and the simultaneously measured two-component velocity field, measured via Particle Image Velocimetry (PIV). The development of the resulting shear layer within the slat cove is studied for Re = 80,000, based on the wing chord. For low Mach number flows in air, the major acoustic source is a dipole tied to fluctuating surface pressures on solid boundaries, such as the underside of the slat itself. Regions of high correlation between the pressure and the near-surface velocity field likely indicate a strong acoustic dipole source. In order to study the underlying physical mechanisms and understand their role in the development of aeroacoustic noise, Proper Orthogonal Decomposition (POD) by the method of snapshots is employed on the velocity field. The correlation between low-order reconstructions and the surface-pressure measurements is also studied.
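    POD by the method of snapshots reduces, in practice, to a singular value decomposition of the mean-subtracted snapshot matrix: the left singular vectors are the spatial modes, the singular values rank their energy, and projecting the data onto the leading modes gives the low-order reconstruction. A generic numpy sketch on synthetic data, not the PIV fields of the study:

```python
import numpy as np

def pod(snapshots, n_modes):
    """POD of a (n_points, n_snapshots) data matrix via SVD.

    Returns spatial modes, modal energies, and temporal coefficients.
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    fluct = snapshots - mean
    U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
    modes = U[:, :n_modes]                 # orthonormal spatial structures
    energy = s[:n_modes] ** 2              # energy captured by each mode
    coeffs = modes.T @ fluct               # temporal coefficients
    return modes, energy, coeffs

# Synthetic "velocity" field: two separable space-time structures
x = np.linspace(0, 2 * np.pi, 64)[:, None]
t = np.linspace(0, 10, 200)[None, :]
data = np.sin(x) * np.cos(3 * t) + 0.3 * np.cos(2 * x) * np.sin(5 * t)
modes, energy, coeffs = pod(data, n_modes=2)
low_order = modes @ coeffs + data.mean(axis=1, keepdims=True)
```

    Because the synthetic field is rank two, two modes reconstruct it essentially exactly; for turbulent PIV data one instead retains the few modes carrying most of the energy and correlates that low-order reconstruction with the surface pressures.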

  15. First experience from in-core sensor validation based on correlation and neuro-fuzzy techniques

    International Nuclear Information System (INIS)

    Figedy, S.

    2011-01-01

    In this work, new types of nuclear reactor in-core sensor validation methods are outlined. The first is based on a combination of correlation coefficients and mutual information indices, which reflect the correlation of signals in linear and nonlinear regions. The method may be supplemented by wavelet-transform-based signal feature extraction, pattern recognition by artificial neural networks, and fuzzy-logic-based decision making. The second is based on neuro-fuzzy modeling of the residuals between experimental values and their theoretical counterparts obtained from reactor core simulator calculations. The first experience with this approach is described, and further improvements to enhance the reliability of the outcome are proposed (Author)
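    The two indices combined in the first method — a Pearson coefficient for linear dependence and a mutual information estimate for nonlinear dependence — can be sketched with a histogram-based MI estimator. The bin count and the synthetic signals are arbitrary choices for illustration, not the in-core data:

```python
import numpy as np

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

def mutual_information(x, y, bins=16):
    """Histogram estimate of MI in nats: sum p(x,y) log[p(x,y)/(p(x)p(y))]."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 5000)
linear = 2.0 * t + 0.1 * rng.normal(size=5000)
quadratic = t ** 2 + 0.1 * rng.normal(size=5000)  # nonlinear: Pearson ~ 0

r_lin, mi_lin = pearson(t, linear), mutual_information(t, linear)
r_quad, mi_quad = pearson(t, quadratic), mutual_information(t, quadratic)
```

    The quadratic pair illustrates why both indices are needed: its Pearson coefficient is near zero, yet the mutual information remains clearly positive, so a nonlinear sensor dependence is still detected.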

  16. Analysis of Fresh Fuel Critical Experiments Appropriate for Burnup Credit Validation

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-01-01

    The ANSI/ANS-8.1 standard requires that calculational methods used in determining criticality safety limits for applications outside reactors be validated by comparison with appropriate critical experiments. This report provides a detailed description of 34 fresh fuel critical experiments and their analyses using the SCALE-4.2 code system and the 27-group ENDF/B-IV cross-section library. The 34 critical experiments were selected based on geometry, material, and neutron interaction characteristics that are applicable to a transportation cask loaded with pressurized-water-reactor spent fuel. These 34 experiments are a representative subset of a much larger database of low-enriched uranium and mixed-oxide critical experiments. A statistical approach is described and used to obtain an estimate of the bias and uncertainty in the calculational methods and to predict a confidence limit for a calculated neutron multiplication factor. The SCALE-4.2 results for a superset of approximately 100 criticals are included in the uncertainty analyses, but descriptions of the individual criticals are not included
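    The statistical treatment described — a bias and its uncertainty derived from calculated multiplication factors for known-critical benchmarks — can be sketched in simplified form. Real validation work uses trending analysis and one-sided tolerance factors; the k_eff values and the coverage factor here are illustrative placeholders:

```python
import statistics

def bias_and_limit(k_calc, k_exp=None, coverage=2.0):
    """Simplified bias/uncertainty estimate from benchmark results.

    bias = mean(k_calc - k_exp); a conservative acceptance limit is then
    set below 1 + bias by a coverage factor times the observed spread.
    """
    if k_exp is None:
        k_exp = [1.0] * len(k_calc)       # benchmarks are critical: k = 1
    deltas = [c - e for c, e in zip(k_calc, k_exp)]
    bias = statistics.mean(deltas)
    spread = statistics.stdev(deltas)
    limit = 1.0 + bias - coverage * spread
    return bias, spread, limit

# Illustrative calculated k_eff values for a set of critical benchmarks
k_calc = [0.9978, 1.0012, 0.9991, 0.9969, 1.0005, 0.9984]
bias, spread, limit = bias_and_limit(k_calc)
```

    A negative bias (systematic underprediction) pushes the acceptance limit further below unity, which is the conservative direction for criticality safety.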

  17. Development and validation of the Consumer Quality index instrument to measure the experience and priority of chronic dialysis patients

    NARCIS (Netherlands)

    van der Veer, Sabine N.; Jager, Kitty J.; Visserman, Ella; Beekman, Robert J.; Boeschoten, Els W.; de Keizer, Nicolette F.; Heuveling, Lara; Stronks, Karien; Arah, Onyebuchi A.

    2012-01-01

    Patient experience is an established indicator of quality of care. Validated tools that measure both experiences and priorities are lacking for chronic dialysis care, hampering identification of negative experiences that patients actually rate important. We developed two Consumer Quality (CQ) index

  18. Integral large scale experiments on hydrogen combustion for severe accident code validation-HYCOM

    International Nuclear Information System (INIS)

    Breitung, W.; Dorofeev, S.; Kotchourko, A.; Redlinger, R.; Scholtyssek, W.; Bentaib, A.; L'Heriteau, J.-P.; Pailhories, P.; Eyink, J.; Movahed, M.; Petzold, K.-G.; Heitsch, M.; Alekseev, V.; Denkevits, A.; Kuznetsov, M.; Efimenko, A.; Okun, M.V.; Huld, T.; Baraldi, D.

    2005-01-01

    A joint research project was carried out in the EU Fifth Framework Programme concerning hydrogen risk in a nuclear power plant. The goals were: firstly, to create a new database of results on hydrogen combustion experiments in the slow to turbulent combustion regimes; secondly, to validate the partners' CFD and lumped-parameter codes on the experimental data and to evaluate suitable parameter sets for application calculations; thirdly, to conduct a benchmark exercise by applying the codes to the full-scale analysis of a postulated hydrogen combustion scenario in a light water reactor containment after a core melt accident. The paper describes the work programme of the project and the partners' activities. Significant progress has been made in the experimental area, where test series in medium- and large-scale facilities have been carried out with a focus on specific effects of scale, multi-compartment geometry, heat losses, and venting. The data were used for the validation of the partners' CFD and lumped-parameter codes, which included blind predictive calculations and pre- and post-test intercomparison exercises. Finally, a benchmark exercise was conducted by applying the codes to the full-scale analysis of a hydrogen combustion scenario. The comparison and assessment of the results of the validation phase and of the challenging containment calculation exercise allow a deep insight into the quality, capabilities, and limits of the CFD and lumped-parameter tools currently in use at various research laboratories

  19. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, C., E-mail: hansec@uw.edu [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Columbia University, New York, New York 10027 (United States); Victor, B.; Morgan, K.; Hossack, A.; Sutherland, D. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); Jarboe, T.; Nelson, B. A. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Marklin, G. [PSI-Center, University of Washington, Seattle, Washington 98195 (United States)

    2015-05-15

    We present the application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI) experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high-performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.

  20. STORMVEX: The Storm Peak Lab Cloud Property Validation Experiment Science and Operations Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mace, J; Matrosov, S; Shupe, M; Lawson, P; Hallar, G; McCubbin, I; Marchand, R; Orr, B; Coulter, R; Sedlacek, A; Avallone, L; Long, C

    2010-09-29

    During the Storm Peak Lab Cloud Property Validation Experiment (STORMVEX), a substantial correlative data set of remote sensing observations and direct in situ measurements from fixed and airborne platforms will be created in a winter season, mountainous environment. This will be accomplished by combining mountaintop observations at Storm Peak Laboratory and the airborne National Science Foundation-supported Colorado Airborne Multi-Phase Cloud Study campaign with collocated measurements from the second ARM Mobile Facility (AMF2). We describe in this document the operational plans and motivating science for this experiment, which includes deployment of AMF2 to Steamboat Springs, Colorado. The intensive STORMVEX field phase will begin nominally on 1 November 2010 and extend to approximately early April 2011.

  1. Analysis and evaluation of critical experiments for validation of neutron transport calculations

    International Nuclear Information System (INIS)

    Bazzana, S.; Blaumann, H.; Marquez Damian, J.I.

    2009-01-01

    The calculation schemes, computational codes, and nuclear data used in neutronic design require validation to obtain reliable results. In the nuclear criticality safety field, this reliability also translates into a higher level of safety in procedures involving fissile material. The International Criticality Safety Benchmark Evaluation Project is an OECD/NEA activity led by the United States, in which participants from over 20 countries evaluate and publish criticality safety benchmarks. The product of this project is a set of benchmark experiment evaluations that are published annually in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. With the recent participation of Argentina, this information is now available for use by the neutron calculation and criticality safety groups in Argentina. This work presents the methodology used for the evaluation of experimental data, some results obtained by the application of these methods, and some examples of the data available in the Handbook.

  2. Optimal Design and Model Validation for Combustion Experiments in a Shock Tube

    KAUST Repository

    Long, Quan

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments being carried out at the KAUST Clean Combustion Center. The unknown parameters are the pre-exponential factors and the activation energies in the reaction rate functions. The control parameters are the initial hydrogen concentration and the temperature. First, we build a polynomial-based surrogate model for the observable related to the reactions in the shock tube. Second, we use a novel MAP-based approach to estimate the expected information gain in the proposed experiments and select the best experimental set-ups corresponding to the optimal expected information gains. Third, we use synthetic data to carry out a virtual validation of our methodology.
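    For a linear-Gaussian toy version of this design problem, the expected information gain has a closed form, EIG(d) = ½ ln(1 + σ₀² g(d)² / σₙ²), so selecting the best control setting reduces to maximizing the model sensitivity g(d). The sensitivity function, temperatures, and noise levels below are invented placeholders, not the study's surrogate or observables:

```python
import math

def expected_information_gain(sensitivity, prior_var, noise_var):
    """Closed-form EIG for y = g*theta + noise, theta ~ N(0, prior_var),
    noise ~ N(0, noise_var): half the log ratio of prior to posterior variance."""
    return 0.5 * math.log(1.0 + prior_var * sensitivity**2 / noise_var)

def best_design(designs, sensitivity_fn, prior_var=1.0, noise_var=0.05):
    """Pick the control setting with the largest expected information gain."""
    gains = {d: expected_information_gain(sensitivity_fn(d), prior_var, noise_var)
             for d in designs}
    return max(gains, key=gains.get), gains

# Hypothetical sensitivity of the observable to a rate parameter,
# peaking near 1100 K
temps = [900, 1000, 1100, 1200, 1300]
sens = lambda T: math.exp(-((T - 1100) / 150.0) ** 2)
choice, gains = best_design(temps, sens)
```

    In the nonlinear shock tube setting this closed form is unavailable, which is why surrogate models and MAP-based estimates of the information gain are used instead; the toy version only illustrates the selection criterion.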

  3. Decay heat experiment and validation of calculation code systems for fusion reactor

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Ikeda, Yujiro; Wada, Masayuki

    1999-10-01

    Although accurate estimation of decay heat values is essential for safety analyses of fusion reactors against loss-of-coolant accidents and the like, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in thirty-two sample materials irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide range of cooling times from 1 min to 400 days. The data obtained are the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, the activation cross-section libraries FENDL/A-2.0 and JENDL Activation File, and the associated decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that decay heat values calculated for most of the samples were in good agreement with the experimental data. Especially for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)

  4. Decay heat experiment and validation of calculation code systems for fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Wada, Masayuki

    1999-10-01

    Although accurate estimation of decay heat values is essential for safety analyses of fusion reactors against loss-of-coolant accidents and the like, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in thirty-two sample materials irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide range of cooling times from 1 min to 400 days. The data obtained are the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, the activation cross-section libraries FENDL/A-2.0 and JENDL Activation File, and the associated decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that decay heat values calculated for most of the samples were in good agreement with the experimental data. Especially for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)

  5. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagrams) modeling, and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stability. The developed description includes composition ranges typical of coating alloys and hence allows prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools required to meet the increasing demand for strong, ductile, and environmentally protective coatings. Specifically, a suitable thermodynamic description of the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.

  6. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    Science.gov (United States)

    Rest, J.

    1989-12-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism.

  7. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    International Nuclear Information System (INIS)

    Rest, J.

    1989-01-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism. (orig.)

  8. Validation of two-phase flow code THYC on VATICAN experiment

    International Nuclear Information System (INIS)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B.

    1997-01-01

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project, VATICAN, has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple space grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about friction loss closure laws for oblique flow over tubes. From VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of axial relative velocity in these regions. A fitting of the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction loss closure laws for oblique flows and relative velocity downstream of a mixing grid. (author)

  9. Validation of two-phase flow code THYC on VATICAN experiment

    Energy Technology Data Exchange (ETDEWEB)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B. [EDF/DER, Dept. TTA, 78 - Chatou (France)

    1997-12-31

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project, VATICAN, has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple space grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about friction loss closure laws for oblique flow over tubes. From VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of axial relative velocity in these regions. A fitting of the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction loss closure laws for oblique flows and relative velocity downstream of a mixing grid. (author)

  10. Design of an intermediate-scale experiment to validate unsaturated- zone transport models

    International Nuclear Information System (INIS)

    Siegel, M.D.; Hopkins, P.L.; Glass, R.J.; Ward, D.B.

    1991-01-01

    An intermediate-scale experiment is being carried out to evaluate instrumentation and models that might be used for transport-model validation for the Yucca Mountain Site Characterization Project. The experimental test bed is a 6-m high x 3-m diameter caisson filled with quartz sand with a sorbing layer at an intermediate depth. The experiment involves the detection and prediction of the migration of fluid and tracers through an unsaturated porous medium. Pre-test design requires estimation of physical properties of the porous medium such as the relative permeability, saturation/pressure relations, porosity, and saturated hydraulic conductivity, as well as geochemical properties such as surface complexation constants and empirical Kd values. The pre-test characterization data will be used as input to several computer codes to predict the fluid flow and tracer migration. These include a coupled chemical-reaction/transport model, a stochastic model, and a deterministic model using retardation factors. The calculations will be completed prior to elution of the tracers, providing a basis for validation by comparing the predictions to observed moisture and tracer behavior.
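
    The retardation-factor approach mentioned above has a standard closed form, R = 1 + (rho_b/theta)*Kd. A minimal sketch, using hypothetical sand properties rather than the caisson's measured values:

```python
def retardation_factor(bulk_density, water_content, kd):
    """R = 1 + (rho_b / theta) * Kd (dimensionless).
    bulk_density in g/cm^3, water_content as a volumetric fraction, Kd in mL/g.
    A sorbing tracer moves R times slower than the water front."""
    return 1.0 + (bulk_density / water_content) * kd

# Hypothetical values for a quartz sand with a weakly sorbing tracer:
R = retardation_factor(bulk_density=1.6, water_content=0.25, kd=0.5)
print(round(R, 2))  # 4.2 -> tracer front travels ~4x slower than the water
```

A nonsorbing tracer (Kd = 0) gives R = 1, i.e. it travels with the water, which is why conservative tracers are used to characterize the flow field itself.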

  11. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies directly on observed detection occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
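
    The 90/95 criterion can be illustrated with a one-sided binomial argument: if the true POD were only 90%, observing 29 hits in 29 trials would occur with probability below 5%, so such a result demonstrates 90/95 POD. The sketch below is illustrative only; DOEPOD's actual analysis is richer than this.

```python
import math

def binom_tail_ge(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def demonstrates_90_95(hits, trials, pod=0.90, alpha=0.05):
    """True if observing `hits` detections in `trials` rejects POD <= 90%
    at 95% confidence (one-sided exact binomial test)."""
    return binom_tail_ge(trials, hits, pod) < alpha

print(demonstrates_90_95(29, 29))  # the classic "29 of 29" rule -> True
print(demonstrates_90_95(28, 29))  # one miss at n=29 is not enough -> False
```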

  12. The cross-cultural validity of the Caregiving Experiences Questionnaire (CEQ) among Danish mothers with preschool children

    DEFF Research Database (Denmark)

    Røhder, Katrine; George, Carol; Brennan, Jessica

    2018-01-01

    The present study explored the Danish cross-cultural validity of the Caregiving Experiences Questionnaire (CEQ), a new measure of caregiving representations in parent-child relationships. Low-risk Danish mothers (N = 159) with children aged 1.5–5 years completed the CEQ and predictive validity...

  13. Assessing decentering: validation, psychometric properties, and clinical usefulness of the Experiences Questionnaire in a Spanish sample.

    Science.gov (United States)

    Soler, Joaquim; Franquesa, Alba; Feliu-Soler, Albert; Cebolla, Ausias; García-Campayo, Javier; Tejedor, Rosa; Demarzo, Marcelo; Baños, Rosa; Pascual, Juan Carlos; Portella, Maria J

    2014-11-01

    Decentering is defined as the ability to observe one's thoughts and feelings in a detached manner. The Experiences Questionnaire (EQ) is a self-report instrument that originally assessed decentering and rumination. The purpose of this study was to evaluate the psychometric properties of the Spanish version of EQ-Decentering and to explore its clinical usefulness. The 11-item EQ-Decentering subscale was translated into Spanish and its psychometric properties were examined in a sample of 921 adult individuals, 231 with psychiatric disorders and 690 without. The subsample of nonpsychiatric participants was also split according to previous meditative experience (meditative participants, n=341; nonmeditative participants, n=349). Additionally, differences among these three subgroups were explored to determine the clinical validity of the scale. Finally, EQ-Decentering was administered twice in a group of patients with borderline personality disorder, before and after a 10-week mindfulness intervention. Confirmatory factor analysis indicated acceptable model fit (S-B χ2 = 243.8836), convergent validity (r > .46), and divergent validity (r < -.35). The scale detected changes in decentering after a 10-session intervention in mindfulness (t = -4.692, p < .00001). Differences among groups were significant (F = 134.8, p < .000001), with psychiatric participants showing the lowest scores compared to nonpsychiatric meditative and nonmeditative participants. The Spanish version of the EQ-Decentering is a valid and reliable instrument to assess decentering in both clinical and nonclinical samples. In addition, the findings show that EQ-Decentering appears to be an adequate outcome instrument for detecting changes after mindfulness-based interventions. Copyright © 2014. Published by Elsevier Ltd.

  14. SCALE Validation Experience Using an Expanded Isotopic Assay Database for Spent Nuclear Fuel

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Radulescu, Georgeta; Ilas, Germina

    2009-01-01

    The availability of measured isotopic assay data to validate computer code predictions of spent fuel compositions applied in burnup-credit criticality calculations is an essential component for bias and uncertainty determination in safety and licensing analyses. In recent years, as many countries move closer to implementing or expanding the use of burnup credit in criticality safety for licensing, there has been growing interest in acquiring additional high-quality assay data. The well-known open sources of assay data are viewed as potentially limiting for validating depletion calculations for burnup credit due to the relatively small number of isotopes measured (primarily actinides with relatively few fission products), sometimes large measurement uncertainties, incomplete documentation, and the limited burnup and enrichment range of the fuel samples. Oak Ridge National Laboratory (ORNL) recently initiated an extensive isotopic validation study that includes most of the public data archived in the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) electronic database, SFCOMPO, and new datasets obtained through participation in commercial experimental programs. To date, ORNL has analyzed approximately 120 different spent fuel samples from pressurized-water reactors that span a wide enrichment and burnup range and represent a broad class of assembly designs. The validation studies, completed using SCALE 5.1, are being used to support a technical basis for expanded implementation of burnup credit for spent fuel storage facilities, and other spent fuel analyses including radiation source term, dose assessment, decay heat, and waste repository safety analyses. This paper summarizes the isotopic assay data selected for this study, presents validation results obtained with SCALE 5.1, and discusses some of the challenges and experience associated with evaluating the results. Preliminary results obtained using SCALE 6 and ENDF
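
    Bias and uncertainty determination from assay data of this kind typically starts from calculated-to-experimental (C/E) isotopic ratios across the measured samples. A minimal sketch with invented C/E values (not ORNL's data):

```python
import statistics

# Hypothetical C/E ratios for one isotope across several assay samples
# (illustrative numbers only, not results from the SCALE validation study):
ce = [0.98, 1.02, 0.97, 1.01, 0.99, 1.03, 0.96, 1.00]

bias = statistics.mean(ce) - 1.0   # mean deviation from perfect prediction
spread = statistics.stdev(ce)      # sample standard deviation of C/E
print(f"bias = {bias:+.4f}, stdev = {spread:.4f}")
```

In a burnup-credit analysis, bias and spread like these would feed into conservative bounds on the calculated isotopic inventory rather than being used directly.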

  15. Experiment designs offered for discussion preliminary to an LLNL field scale validation experiment in the Yucca Mountain Exploratory Shaft Facility

    International Nuclear Information System (INIS)

    Lowry, B.; Keller, C.

    1988-01-01

    It has been proposed ("Progress Report on Experiment Rationale for Validation of LLNL Models of Ground Water Behavior Near Nuclear Waste Canisters," Keller and Lowry, Dec. 7, 1988) that a heat-generating spent fuel canister emplaced in unsaturated tuff, in a ventilated hole, will cause a net flux of water into the borehole during the heating cycle of the spent fuel. Accompanying this mass flux will be the formation of mineral deposits near the borehole wall as the water evaporates and leaves behind its dissolved solids. The net effect of this process upon the containment of radioactive wastes is a function of (1) where and how much solid material is deposited in the tuff matrix and cracks, and (2) the resultant effect on the medium flow characteristics. Experimental concepts described in this report are designed to quantify the magnitude and relative location of solid mineral deposit formation due to a heated and vented borehole environment. The simplest tests address matrix effects only; after the process is understood in the homogeneous matrix, fracture effects would be investigated. Three experiment concepts have been proposed. Each has unique advantages and allows investigation of specific aspects of the precipitate formation process. All could be done in reasonable time (less than a year) and none of them is extremely expensive (the most expensive is probably the structurally loaded block test). The calculational ability exists to analyze the "real" situation and each of the experiment designs, and to produce a credible series of tests. None of the designs requires the acquisition of material property data beyond current capabilities. The tests could be extended, if our understanding is consistent with the data produced, to analyze fracture effects. 7 figs

  16. Validation analysis of pool fire experiment (Run-F7) using SPHINCS code

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Tajima, Yuji

    1998-04-01

    SPHINCS (Sodium Fire Phenomenology IN multi-Cell System) code has been developed for the safety analysis of sodium fire accidents in a Fast Breeder Reactor. The main features of the SPHINCS code with respect to sodium pool fire phenomena are multi-dimensional modeling of the thermal behavior in the sodium pool and steel liner, modeling of the extension of the sodium pool area based on sodium mass conservation, and an equilibrium model for the chemical reaction of pool fire on the flame sheet at the sodium pool surface. The SPHINCS code is therefore capable of detailed temperature evaluation of the steel liner during small and/or medium scale sodium leakage accidents. In this study, the Run-F7 experiment, in which the sodium leakage rate is 11.8 kg/hour, has been analyzed. In the experiment the diameter of the sodium pool is approximately 60 cm and the maximum steel liner temperature was 616 °C. The analysis agrees excellently with the experiment with respect to the time history and spatial distribution of the liner temperature, the sodium pool extension behavior, and the atmosphere gas temperature. It is concluded that the pool fire modeling of the SPHINCS code has been validated for this experiment. The SPHINCS code is currently applicable to sodium pool fire phenomena and the temperature evaluation of the steel liner. The experiment series is being continued to examine additional parameters, i.e., the sodium leakage rate and the height of sodium leakage. The author will thus analyze the subsequent experiments to check the influence of these parameters and apply SPHINCS to the sodium fire consequence analysis of fast reactors. (author)

  17. Validation of large-angle scattering data via shadow-bar experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, S., E-mail: ohnishi@nmri.go.jp [National Maritime Research Institute, 6-38-1, Shinkawa, Mitaka, Tokyo 181-0004 (Japan); Tamaki, S.; Murata, I. [Osaka University, 1-14-16-1, Yamadaoka, Suita-si, Osaka 565-0871 (Japan)

    2016-11-15

    Highlights: • An experiment to validate the large-angle scattering cross section is conducted. • Pieces of Nb foil are set behind a shadow bar to obtain the {sup 92m}Nb production rates. • The results calculated using ENDF/B-VI library data exhibit a 57% overestimation. • The adjustment of the cross section in the large-angle region makes the C/E close to 1. - Abstract: An experiment emphasizing the influence of large-angle scattering on nuclear data was conducted, in which a Fe shadow bar and a Fe slab target were placed before a deuterium–tritium fusion (DT) neutron source. Two Nb foils were set on both sides of the shadow bar in order to monitor the neutron source intensity and to measure the neutrons scattered from the slab target. The {sup 93}Nb(n,2n){sup 92m}Nb reaction rate of the foil was measured following the DT neutron irradiation and calculated using the MCNP5 Monte Carlo radiation transport code. The {sup 92m}Nb production rates calculated using data from the JEFF-3.1 and JENDL-4.0 libraries agreed with that measured in the experiment, while the result calculated using data from the ENDF/B-VI library exhibited a 57% overestimation. Because the sensitivity of the {sup 92m}Nb production rate to the scattering angular distribution was large in the angular region between scattering direction cosines of −0.9 and −0.4, the scattering angular distribution was adjusted in that region. This adjustment resulted in a calculation-to-experiment ratio close to 1, but had little influence on the existing integral benchmark experiment.

  18. Fluid-structure interaction in non-rigid pipeline systems - large scale validation experiments

    International Nuclear Information System (INIS)

    Heinsbroek, A.G.T.J.; Kruisbrink, A.C.H.

    1993-01-01

    The fluid-structure interaction computer code FLUSTRIN, developed by DELFT HYDRAULICS, enables the user to determine dynamic fluid pressures, structural stresses and displacements in a liquid-filled pipeline system under transient conditions. As such, the code is a useful tool for process and mechanical engineers in the safe design and operation of pipeline systems in nuclear power plants. To validate FLUSTRIN, experiments have been performed in a large scale 3D test facility. The test facility consists of a flexible pipeline system which is suspended by wires, bearings and anchors. Pressure surges, which excite the system, are generated by a fast acting shut-off valve. Dynamic pressures, structural displacements and strains (in total 70 signals) have been measured under well determined initial and boundary conditions. The experiments have been simulated with FLUSTRIN, which solves the acoustic equations using the method of characteristics (fluid) and the finite element method (structure). The agreement between experiments and simulations is shown to be good: frequencies, amplitudes and wave phenomena are well predicted by the numerical simulations. It is demonstrated that an uncoupled water hammer computation would render unreliable and useless results. (author)
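
    For scale, the uncoupled water hammer estimate that the authors warn against is commonly taken from the Joukowsky relation, dp = rho * a * dv. A small sketch with hypothetical pipe parameters (not the test-facility values):

```python
def joukowsky_surge(rho, wave_speed, dv):
    """Joukowsky estimate of the pressure surge (Pa) for an instantaneous
    velocity change dv (m/s) in a rigid pipe: dp = rho * a * dv.
    Fluid-structure interaction in a flexible pipeline alters this estimate,
    which is why coupled codes like FLUSTRIN are needed."""
    return rho * wave_speed * dv

# Hypothetical: water (1000 kg/m^3), acoustic speed 1200 m/s, valve stops 2 m/s flow
dp = joukowsky_surge(1000.0, 1200.0, 2.0)
print(dp / 1e5)  # surge in bar -> 24.0
```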

  19. CFD Validation with a Multi-Block Experiment to Evaluate the Core Bypass Flow in VHTR

    International Nuclear Information System (INIS)

    Yoon, Su Jong; Lee, Jeong Hun; Park, Goon Cherl; Kim, Min Hwan

    2010-01-01

    Core bypass flow of the Very High Temperature Reactor (VHTR) is defined as the ineffective coolant which passes through the bypass gaps between the block columns and the crossflow gaps between the stacked blocks. These flows lead to variation of the flow distribution in the core and affect the core thermal margin and the safety of the VHTR. Therefore, bypass flow should be investigated and quantified. However, this is not straightforward, because the flow path of the VHTR core is very complex. In particular, since the dimensions of the bypass gap and the crossflow gap are of the order of a few millimeters, it is very difficult to measure and to analyze the flow field at those gaps. The Seoul National University (SNU) multi-block experiment was carried out to evaluate the bypass flow distribution and the flow characteristics. The coolant flow rate through the outlet of each block column was measured, but the local flow field could be measured only to a limited extent in the experiment. Instead, CFD analysis was carried out to investigate the local phenomena of the experiment. The commercial CFD code CFX-12 was validated by comparing the simulation results and the experimental data.

  20. Reactivity worth measurements on the CALIBAN reactor: interpretation of integral experiments for the nuclear data validation

    International Nuclear Information System (INIS)

    Richard, B.

    2012-01-01

    Good knowledge of nuclear data, the input parameters for neutron transport calculation codes, is necessary to support the advances of the nuclear industry. The purpose of this work is to provide pertinent information regarding the nuclear data integral validation process. Reactivity worth measurements have been performed on the Caliban reactor; they concern four materials of interest for the nuclear industry: gold, lutetium, plutonium and uranium 238. Experiments which have been conducted in order to improve the characterization of the core are also described and discussed; these are necessary for the good interpretation of reactivity worth measurements. The experimental procedures are described with their associated uncertainties, and measurements are then compared to numerical results. The methods used in numerical calculations are reported, especially the multigroup cross section generation for deterministic codes. The modeling of the experiments is presented along with the associated uncertainties. This comparison led to an interpretation concerning the qualification of nuclear data libraries. Discrepancies are reported, discussed, and justify the need for such experiments. (author) [fr]

  1. Validation of the Child HCAHPS survey to measure pediatric inpatient experience of care in Flanders.

    Science.gov (United States)

    Bruyneel, Luk; Coeckelberghs, Ellen; Buyse, Gunnar; Casteels, Kristina; Lommers, Barbara; Vandersmissen, Jo; Van Eldere, Johan; Van Geet, Chris; Vanhaecht, Kris

    2017-07-01

    The recently developed Child HCAHPS provides a standard to measure US hospitals' performance on pediatric inpatient experiences of care. We field-tested Child HCAHPS in Belgium to enable international comparison. In the development stage, forward/backward translation was conducted and the content validity index was rated excellent by patients. The draft Flemish Child HCAHPS included 63 items: 38 items for five topics hypothesized to be similar to those proposed in the US (communication with parent, communication with child, attention to safety and comfort, hospital environment, and global rating), 10 screeners, a 14-item demographic and descriptive section, and one open-ended item. A 6-week pilot test was subsequently performed in three pediatric wards (general ward, hematology and oncology ward, infant and toddler ward) at a JCI-accredited university hospital. An overall response rate of 90.99% (303/333) was achieved and was consistent across wards. Confirmatory factor analysis largely confirmed the configuration of the proposed composites. Composite and single-item measures related well to patients' global rating of the hospital. Interpretation of different patient experiences across types of wards merits further investigation. Child HCAHPS provides an opportunity for systematic and cross-national assessment of pediatric inpatient experiences. Sharing and implementing international best practices are the next logical step. What is Known: • Patient experience surveys are increasingly used to reflect on the quality, safety, and centeredness of patient care. • While adult inpatient experience surveys are routinely used across countries around the world, the measurement of pediatric inpatient experiences is a young field of research that is essential to reflect on family-centered care. What is New: • We demonstrate that the US-developed Child HCAHPS provides an opportunity for international benchmarking of pediatric inpatient experiences with care through parents

  2. Validation of the TRACR3D code for soil water flow under saturated/unsaturated conditions in three experiments

    International Nuclear Information System (INIS)

    Perkins, B.; Travis, B.; DePoorter, G.

    1985-01-01

    Validation of the TRACR3D code in a one-dimensional form was obtained for flow of soil water in three experiments. In the first experiment, a pulse of water entered a crushed-tuff soil and initially moved under conditions of saturated flow, quickly followed by unsaturated flow. In the second experiment, steady-state unsaturated flow took place. In the final experiment, two slugs of water entered crushed tuff under field conditions. In all three experiments, experimentally measured data for volumetric water content agreed, within experimental errors, with the volumetric water content predicted by the code simulations. The experiments and simulations indicated the need for accurate knowledge of boundary and initial conditions, amount and duration of moisture input, and relevant material properties as input into the computer code. During the validation experiments, limitations on monitoring of water movement in waste burial sites were also noted. 5 references, 34 figures, 9 tables

  3. The Depressive Experiences Questionnaire: validity and psychological correlates in a clinical sample.

    Science.gov (United States)

    Riley, W T; McCranie, E W

    1990-01-01

    This study sought to compare the original and revised scoring systems of the Depressive Experiences Questionnaire (DEQ) and to assess the construct validity of the Dependent and Self-Critical subscales of the DEQ in a clinically depressed sample. Subjects were 103 depressed inpatients who completed the DEQ, the Beck Depression Inventory (BDI), the Hopelessness Scale, the Automatic Thoughts Questionnaire (ATQ), the Rathus Assertiveness Schedule (RAS), and the Minnesota Multiphasic Personality Inventory (MMPI). The original and revised scoring systems of the DEQ evidenced good concurrent validity for each factor scale, but the revised system did not sufficiently discriminate dependent and self-critical dimensions. Using the original scoring system, self-criticism was significantly and positively related to severity of depression, whereas dependency was not, particularly for males. Factor analysis of the DEQ scales and the other scales used in this study supported the dependent and self-critical dimensions. For men, the correlation of the DEQ with the MMPI scales indicated that self-criticism was associated with psychotic symptoms, hostility/conflict, and a distress/exaggerated response set, whereas dependency did not correlate significantly with any MMPI scales. Females, however, did not exhibit a differential pattern of correlations between either the Dependency or the Self-Criticism scales and the MMPI. These findings suggest possible gender differences in the clinical characteristics of male and female dependent and self-critical depressive subtypes.

  4. The Space Technology-7 Disturbance Reduction System Precision Control Flight Validation Experiment Control System Design

    Science.gov (United States)

    O'Donnell, James R.; Hsu, Oscar C.; Maghami, Peirman G.; Markley, F. Landis

    2006-01-01

    As originally proposed, the Space Technology-7 Disturbance Reduction System (DRS) project, managed out of the Jet Propulsion Laboratory, was designed to validate technologies required for future missions such as the Laser Interferometer Space Antenna (LISA). The two technologies to be demonstrated by DRS were Gravitational Reference Sensors (GRSs) and Colloidal MicroNewton Thrusters (CMNTs). Control algorithms being designed by the Dynamic Control System (DCS) team at the Goddard Space Flight Center would control the spacecraft so that it flew about a freely-floating GRS test mass, keeping it centered within its housing. For programmatic reasons, the GRSs were descoped from DRS. The primary goals of the new mission are to validate the performance of the CMNTs and to demonstrate precise spacecraft position control. DRS will fly as a part of the European Space Agency (ESA) LISA Pathfinder (LPF) spacecraft along with a similar ESA experiment, the LISA Technology Package (LTP). With no GRS, the DCS attitude and drag-free control systems make use of the sensor being developed by ESA as a part of the LTP. The control system is designed to maintain the spacecraft's position with respect to the test mass to within 10 nm/√Hz over the DRS science frequency band of 1 to 30 mHz.
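
    The drag-free positioning task can be caricatured as a double integrator under PD control: the thrusters push the spacecraft so its position relative to the free-floating test mass goes to zero. The toy simulation below uses hypothetical spacecraft mass and gains, nothing from the actual DRS design.

```python
def simulate(mass=400.0, kp=4.0, kd=80.0, x0=1e-6, dt=0.1, steps=5000):
    """Semi-implicit Euler simulation of PD control of a double integrator.
    x is the spacecraft-to-test-mass relative position (m), v its rate (m/s);
    mass in kg, gains kp (N/m) and kd (N s/m) are illustrative values."""
    x, v = x0, 0.0
    for _ in range(steps):
        f = -kp * x - kd * v   # thruster force command (N)
        v += (f / mass) * dt
        x += v * dt
    return abs(x)

residual = simulate()
print(residual < 1e-9)  # a 1 um initial offset is controlled away -> True
```

With these gains the closed loop is critically damped with a ~10 s time constant, so after 500 simulated seconds the residual offset is far below a nanometer; the real DRS requirement is stated spectrally (nm/√Hz), which this time-domain toy does not capture.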

  5. Development of a monitoring tool to validate trigger level analysis in the ATLAS experiment

    CERN Document Server

    Hahn, Artur

    2014-01-01

    This report summarizes my thirteen-week summer student project at CERN, from June 30th until September 26th of 2014. My task was to contribute to a monitoring tool for the ATLAS experiment, comparing jets reconstructed by the trigger to fully offline reconstructed and saved events, by creating a set of insightful histograms to be used during run 2 of the Large Hadron Collider, planned to start in early 2015. The motivation behind this project is to validate the use of data taken solely from the high level trigger for analysis purposes. Once the code generating the plots was completed, it was tested on data collected during run 1 up to the year 2012 and on Monte Carlo simulated events with center-of-mass energies √s = 8 TeV and √s = 14 TeV.
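
    A histogram-based trigger-vs-offline comparison of the kind described can be sketched as follows; the bin edges, jet pT values, and chi-square compatibility measure are all illustrative, not the ATLAS tool's actual code or data.

```python
def fill_hist(values, edges):
    """Fill a 1-D histogram: count values falling in [edges[i], edges[i+1])."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    return counts

def chi2(h1, h2):
    """Simple per-bin chi-square between two histograms of equal statistics."""
    return sum((a - b) ** 2 / (a + b) for a, b in zip(h1, h2) if a + b > 0)

edges = [20, 40, 60, 80, 100]  # hypothetical jet pT bin edges in GeV
offline = fill_hist([25, 35, 45, 55, 65, 85, 95], edges)
trigger = fill_hist([26, 34, 44, 56, 66, 84, 96], edges)
print(offline, trigger, chi2(offline, trigger))  # identical shapes -> chi2 = 0.0
```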

  6. Continuously revised assurance cases with stakeholders’ cross-validation: a DEOS experience

    Directory of Open Access Journals (Sweden)

    Kimio Kuramitsu

    2016-12-01

    Recently, assurance cases have received much attention in the field of software-based computer systems and IT services. However, software changes very often, and there are no strong regulations for software; these two facts are the main challenges to be addressed in the development of software assurance cases. We propose a method of developing assurance cases by means of continuous revision at every stage of the system life cycle, including in operation and service recovery in failure cases. Instead of relying on a regulator, dependability arguments are validated by multiple stakeholders competing with each other. This paper reports our experience with the proposed method in the case of the Aspen education service. The case study demonstrates that continuous revisions enable stakeholders to share dependability problems across software life cycle stages, which will lead to the long-term improvement of service dependability.

  7. The COSIMA-experiments, a data base for validation of two-phase flow computer codes

    International Nuclear Information System (INIS)

    Class, G.; Meyder, R.; Stratmanns, E.

    1985-12-01

    The report presents an overview of the large data base generated with COSIMA. The data base is to be used to validate and develop computer codes for two-phase flow. In terms of fuel rod behavior, it was found that during blowdown under realistic conditions only small strains are reached; for clad rupture, extremely high rod internal pressure is necessary. Important results were also found on the behavior of a fuel rod simulator and on the effect of thermocouples attached to the cladding outer surface. Post-test calculations performed with the codes RELAP and DRUFAN show good agreement with the experiments; the agreement could be improved further if the phase separation models in the codes were updated.

  8. Monte Carlo validation experiments for the gas Cherenkov detectors at the National Ignition Facility and Omega

    Energy Technology Data Exchange (ETDEWEB)

    Rubery, M. S.; Horsfield, C. J. [Plasma Physics Department, AWE plc, Reading RG7 4PR (United Kingdom); Herrmann, H.; Kim, Y.; Mack, J. M.; Young, C.; Evans, S.; Sedillo, T.; McEvoy, A.; Caldwell, S. E. [Plasma Physics Department, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Grafil, E.; Stoeffl, W. [Physics, Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Milnes, J. S. [Photek Limited UK, 26 Castleham Road, St. Leonards-on-sea TN38 9NS (United Kingdom)

    2013-07-15

    The gas Cherenkov detectors at NIF and Omega measure several ICF burn characteristics by detecting multi-MeV nuclear γ emissions from the implosion. Of primary interest are γ bang-time (GBT), defined as the time between the initial laser-plasma interaction and the peak of the fusion reaction history, and burn width, the FWHM of the reaction history. To calculate such parameters accurately, the collaboration relies on Monte Carlo codes, such as GEANT4 and ACCEPT, for diagnostic properties that cannot be measured directly. This paper describes a series of experiments performed at the High Intensity γ Source (HIγS) facility at Duke University to validate the geometries and material data used in the Monte Carlo simulations. Results published here show that model-driven parameters such as intensity and temporal response can be used with less than 50% uncertainty for all diagnostics and facilities.

  9. Preliminary characterization of materials for a reactive transport model validation experiment

    International Nuclear Information System (INIS)

    Siegel, M.D.; Ward, D.B.; Cheng, W.C.; Bryant, C.; Chocas, C.S.; Reynolds, C.G.

    1993-01-01

    The geochemical properties of a porous sand and several tracers (Ni, Br, and Li) have been characterized for use in a caisson experiment designed to validate sorption models used in models of reactive transport. The surfaces of the sand grains have been examined by a combination of techniques including potentiometric titration, acid leaching, optical microscopy, and scanning electron microscopy with energy-dispersive spectroscopy. The surface studies indicate the presence of small amounts of carbonate, kaolinite, and iron-oxyhydroxides. Adsorption of nickel, lithium, and bromide by the sand was measured using batch techniques. Bromide was not sorbed by the sand. A linear (K_d) or an isotherm sorption model may adequately describe transport of Li; however, a model describing the changes of pH and the concentrations of other solution species as a function of time and position within the caisson, and the concomitant effects on Ni sorption, may be required for accurate predictions of nickel transport.
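A linear K_d translates directly into a retardation factor for 1-D advective transport, which is how such batch measurements feed a caisson transport model. A minimal sketch; the bulk density and porosity defaults are assumed values, not the characterized sand's properties:

```python
def retardation(kd_ml_per_g, bulk_density_g_per_ml=1.6, porosity=0.35):
    """Retardation factor R = 1 + (rho_b / theta) * Kd for linear sorption;
    a sorbing tracer moves at v_water / R."""
    return 1.0 + (bulk_density_g_per_ml / porosity) * kd_ml_per_g

# Illustrative values: a conservative tracer (Kd = 0, e.g. Br-) versus a
# weakly sorbing one (Kd = 0.5 mL/g, Li-like)
print(retardation(0.0))   # -> 1.0, moves with the water
print(retardation(0.5))   # retarded by a factor of roughly 3
```

A conservative tracer like bromide (R = 1) serves as the reference against which the retarded Li and Ni breakthroughs are interpreted.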

  10. Recent validation experience with multigroup cross-section libraries and scale

    International Nuclear Information System (INIS)

    Bowman, S.M.; Wright, R.Q.; DeHart, M.D.; Parks, C.V.; Petrie, L.M.

    1995-01-01

    This paper discusses the results obtained and lessons learned from an extensive validation of new ENDF/B-V and ENDF/B-VI multigroup cross-section libraries using analyses of critical experiments. The KENO V.a Monte Carlo code in version 4.3 of the SCALE computer code system was used to perform the critical benchmark calculations via the automated SCALE sequence CSAS25. The cross-section data were processed by the SCALE automated problem-dependent resonance-processing procedure included in this sequence. Prior to calling KENO V.a, CSAS25 accesses BONAMI to perform resonance self-shielding for nuclides with Bondarenko factors and NITAWL-II to process nuclides with resonance parameter data via the Nordheim Integral Treatment.

  11. Monitoring Ground Subsidence in Hong Kong via Spaceborne Radar: Experiments and Validation

    Directory of Open Access Journals (Sweden)

    Yuxiao Qin

    2015-08-01

    The persistent scatterers interferometry (PSI) technique is gradually becoming known for its capability of providing up to millimeter accuracy of measurement on ground displacement. Nevertheless, considerable doubt remains regarding its correctness and accuracy. In this paper, we carried out an experiment corroborating the capability of the PSI technique with the help of a traditional survey method in the urban area of Hong Kong, China. Seventy-three TerraSAR-X (TSX) and TanDEM-X (TDX) images spanning four years were used for the data processing. There are three aims of this study. The first is to generate a displacement map of urban Hong Kong and to check for spots with possible ground movements; this information will be provided to the local surveyors so that they can check these specific locations. The second is to validate whether the accuracy of the PSI technique can indeed reach the millimeter level in this real application scenario. For validating the accuracy of PSI, four corner reflectors (CRs) were installed at a construction site on reclaimed land in Hong Kong. They were manually moved up or down by a few to tens of millimeters, and the values derived from the PSI analysis were compared to the true values. The experiment, carried out under non-ideal conditions, nevertheless demonstrated that millimeter accuracy can be achieved by the PSI technique. The last is to evaluate the advantages and limitations of the PSI technique. Overall, the PSI technique can be extremely useful if used in collaboration with other techniques, so that its advantages can be highlighted and its drawbacks avoided.
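Validating "millimeter accuracy" with corner reflectors amounts to comparing the imposed motion against the PSI-derived displacement. A minimal sketch with invented numbers (the paper's actual CR offsets are not reproduced here):

```python
import math

# Hypothetical corner-reflector check: imposed vertical moves (mm) versus
# displacements recovered from the PSI time series (mm). Values invented.
imposed   = [5.0, -10.0, 20.0, -35.0]
recovered = [5.6,  -9.1, 21.2, -33.9]

residuals = [r - i for i, r in zip(imposed, recovered)]
bias = sum(residuals) / len(residuals)
rmse = math.sqrt(sum(d * d for d in residuals) / len(residuals))
print(f"bias = {bias:+.2f} mm, rmse = {rmse:.2f} mm")
```

A sub-millimeter RMSE against the ground truth is the quantitative form of the claim that PSI reaches millimeter-level accuracy.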

  12. Closing the patient experience chasm: A two-level validation of the Consumer Quality Index Inpatient Hospital Care

    NARCIS (Netherlands)

    Smirnova, A.; Lombarts, K.; Arah, O.A.; Vleuten, C.P.M. van der

    2017-01-01

    BACKGROUND: Evaluation of patients' health care experiences is central to measuring patient-centred care. However, different instruments tend to be used at the hospital or departmental level but rarely both, leading to a lack of standardization of patient experience measures. OBJECTIVE: To validate

  13. Closing the patient experience chasm: A two-level validation of the Consumer Quality Index Inpatient Hospital Care

    NARCIS (Netherlands)

    Smirnova, Alina; Lombarts, Kiki M. J. M. H.; Arah, Onyebuchi A.; van der Vleuten, Cees P. M.

    2017-01-01

    Background: Evaluation of patients' health care experiences is central to measuring patient-centred care. However, different instruments tend to be used at the hospital or departmental level but rarely both, leading to a lack of standardization of patient experience measures. Objective: To validate the

  14. Validation of the U-238 inelastic scattering neutron cross section through the EXCALIBUR dedicated experiment

    Directory of Open Access Journals (Sweden)

    Leconte Pierre

    2017-01-01

    EXCALIBUR is an integral transmission experiment based on the fast neutron source produced by the bare highly enriched fast burst reactor CALIBAN, located in CEA/DAM Valduc (France). Two experimental campaigns have been performed, one using a sphere of 17 cm diameter and one using two cylinders of 17 cm diameter and 9 cm height, both made of metallic uranium-238. A set of 15 different dosimeters with specific threshold energies was employed to provide information on the neutron flux attenuation as a function of incident energy. Measurement uncertainties are typically in the range of 0.5–3% (1σ). The analysis of these experiments is performed with the TRIPOLI4 continuous-energy Monte Carlo code. A calculation benchmark with validated simplifications is defined in order to keep the statistical uncertainty below 2%. Various 238U evaluations have been tested: JEFF-3.1.1, ENDF/B-VII.1, and the IB36 evaluation from IAEA. A sensitivity analysis is presented to identify the contribution of each reaction cross section to the integral transmission rate. This feedback may be of interest for the international effort on 238U through the CIELO project.
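Comparing calculation to experiment at each dosimeter position reduces to a C/E ratio whose uncertainty combines the Monte Carlo statistics (kept below 2%) with the 0.5-3% measurement uncertainty. A minimal sketch with invented reaction-rate values, not EXCALIBUR data:

```python
import math

def c_over_e(calc, calc_rel_stat, meas, meas_rel_unc):
    """C/E ratio with the two relative uncertainties combined in quadrature."""
    ratio = calc / meas
    rel = math.sqrt(calc_rel_stat ** 2 + meas_rel_unc ** 2)
    return ratio, ratio * rel

# Illustrative dosimeter transmission rates (arbitrary units):
# 2% Monte Carlo statistics, 3% measurement uncertainty (1 sigma)
ratio, sigma = c_over_e(calc=0.97, calc_rel_stat=0.02, meas=1.00, meas_rel_unc=0.03)
print(f"C/E = {ratio:.3f} +/- {sigma:.3f}")
```

A C/E consistent with unity within the combined uncertainty is the acceptance criterion for each tested 238U evaluation.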

  15. An enhanced fire hazard assessment model and validation experiments for vertical cable trays

    International Nuclear Information System (INIS)

    Li, Lu; Huang, Xianjia; Bi, Kun; Liu, Xiaoshuang

    2016-01-01

    Highlights: • An enhanced model was developed for vertical cable fire hazard assessment in NPPs. • Validation experiments on vertical cable tray fires were conducted. • The capability of the model for cable trays with different cable spacing was tested. - Abstract: The model, referred to as FLASH-CAT (Flame Spread over Horizontal Cable Trays), was developed to estimate the heat release rate for vertical cable tray fires. The focus of this work is to investigate the application of an enhanced model to single vertical cable tray fires with different cable spacing. Experiments on vertical cable tray fires with three typical cable spacings were conducted. The histories of mass loss rate and flame length were recorded during the cable fires. From the experimental results, it is found that the space between cable lines intensifies the cable combustion and accelerates the flame spread. The predictions by the enhanced model show good agreement with the experimental data. At the same time, it is shown that the enhanced model is capable of predicting the different behaviors of cable fires with different cable spacing by adjusting the flame spread speed only.

  16. An enhanced fire hazard assessment model and validation experiments for vertical cable trays

    Energy Technology Data Exchange (ETDEWEB)

    Li, Lu [Sate Key Laboratory of Fire Science, University of Science and Technology of China, Hefei 230027 (China); Huang, Xianjia, E-mail: huangxianjia@gziit.ac.cn [Joint Laboratory of Fire Safety in Nuclear Power Plants, Institute of Industry Technology Guangzhou & Chinese Academy of Sciences, Guangzhou 511458 (China); Bi, Kun; Liu, Xiaoshuang [China Nuclear Power Design Co., Ltd., Shenzhen 518045 (China)

    2016-05-15

    Highlights: • An enhanced model was developed for vertical cable fire hazard assessment in NPPs. • Validation experiments on vertical cable tray fires were conducted. • The capability of the model for cable trays with different cable spacing was tested. - Abstract: The model, referred to as FLASH-CAT (Flame Spread over Horizontal Cable Trays), was developed to estimate the heat release rate for vertical cable tray fires. The focus of this work is to investigate the application of an enhanced model to single vertical cable tray fires with different cable spacing. Experiments on vertical cable tray fires with three typical cable spacings were conducted. The histories of mass loss rate and flame length were recorded during the cable fires. From the experimental results, it is found that the space between cable lines intensifies the cable combustion and accelerates the flame spread. The predictions by the enhanced model show good agreement with the experimental data. At the same time, it is shown that the enhanced model is capable of predicting the different behaviors of cable fires with different cable spacing by adjusting the flame spread speed only.

  17. Validation of the U-238 inelastic scattering neutron cross section through the EXCALIBUR dedicated experiment

    Science.gov (United States)

    Leconte, Pierre; Bernard, David

    2017-09-01

    EXCALIBUR is an integral transmission experiment based on the fast neutron source produced by the bare highly enriched fast burst reactor CALIBAN, located in CEA/DAM Valduc (France). Two experimental campaigns have been performed, one using a sphere of 17 cm diameter and one using two cylinders of 17 cm diameter and 9 cm height, both made of metallic uranium-238. A set of 15 different dosimeters with specific threshold energies was employed to provide information on the neutron flux attenuation as a function of incident energy. Measurement uncertainties are typically in the range of 0.5-3% (1σ). The analysis of these experiments is performed with the TRIPOLI4 continuous-energy Monte Carlo code. A calculation benchmark with validated simplifications is defined in order to keep the statistical uncertainty below 2%. Various 238U evaluations have been tested: JEFF-3.1.1, ENDF/B-VII.1, and the IB36 evaluation from IAEA. A sensitivity analysis is presented to identify the contribution of each reaction cross section to the integral transmission rate. This feedback may be of interest for the international effort on 238U through the CIELO project.

  18. Experiments to populate and validate a processing model for polyurethane foam. BKC 44306 PMDI-10

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, Lisa Ann [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rao, Rekha Ranjana [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelden, Bion [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); O' Hern, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wyatt, Nicholas B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauer, Stephen J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hileman, Michael Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Urquhart, Alexander [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Kyle Richard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, David Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.
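The simplified formalism described above (separate rate equations for the polymerization and gas-generating reactions, coupled through an energy balance) can be illustrated with a zero-dimensional sketch. All kinetic and thermal parameters below are invented placeholders, not the fitted values reported by Sandia:

```python
import math

R = 8.314            # J/mol/K
Xp = Xb = 0.0        # extents of polymerization and gas-generation reactions
T = 300.0            # K; simple adiabatic energy balance
dt, nsteps = 0.1, 600    # 60 s of foam rise

for _ in range(nsteps):
    # first-order Arrhenius rates: dX/dt = A * exp(-E/(R*T)) * (1 - X)
    kp = 2.5e7 * math.exp(-50e3 / (R * T))   # polymerization (placeholder A, E)
    kb = 1.0e7 * math.exp(-45e3 / (R * T))   # CO2-generating (blowing) reaction
    # exact one-step solution with k frozen over the step -> always stable
    dXp = (1.0 - math.exp(-kp * dt)) * (1.0 - Xp)
    dXb = (1.0 - math.exp(-kb * dt)) * (1.0 - Xb)
    Xp += dXp
    Xb += dXb
    # placeholder adiabatic temperature rise per unit reaction extent (K)
    T += 120.0 * dXp + 30.0 * dXb

print(f"Xp={Xp:.3f}  Xb={Xb:.3f}  T={T:.1f} K")
```

In the full model these extents would feed the conservation equations and the density-dependent foam rheology; here they only show the decoupled-reaction bookkeeping.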

  19. Status Update on the GPM Ground Validation Iowa Flood Studies (IFloodS) Field Experiment

    Science.gov (United States)

    Petersen, Walt; Krajewski, Witold

    2013-04-01

    The overarching objective of integrated hydrologic ground validation activities supporting the Global Precipitation Measurement Mission (GPM) is to provide better understanding of the strengths and limitations of the satellite products, in the context of hydrologic applications. To this end, the GPM Ground Validation (GV) program is conducting the first of several hydrology-oriented field efforts: the Iowa Flood Studies (IFloodS) experiment. IFloodS will be conducted in the central to northeastern part of Iowa in Midwestern United States during the months of April-June, 2013. Specific science objectives and related goals for the IFloodS experiment can be summarized as follows: 1. Quantify the physical characteristics and space/time variability of rain (rates, DSD, process/"regime") and map to satellite rainfall retrieval uncertainty. 2. Assess satellite rainfall retrieval uncertainties at instantaneous to daily time scales and evaluate propagation/impact of uncertainty in flood-prediction. 3. Assess hydrologic predictive skill as a function of space/time scales, basin morphology, and land use/cover. 4. Discern the relative roles of rainfall quantities such as rate and accumulation as compared to other factors (e.g. transport of water in the drainage network) in flood genesis. 5. Refine approaches to "integrated hydrologic GV" concept based on IFloodS experiences and apply to future GPM Integrated GV field efforts. These objectives will be achieved via the deployment of the NASA NPOL S-band and D3R Ka/Ku-band dual-polarimetric radars, University of Iowa X-band dual-polarimetric radars, a large network of paired rain gauge platforms with attendant soil moisture and temperature probes, a large network of both 2D Video and Parsivel disdrometers, and USDA-ARS gauge and soil-moisture measurements (in collaboration with the NASA SMAP mission). The aforementioned measurements will be used to complement existing operational WSR-88D S-band polarimetric radar measurements

  20. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, Part 2: breakthrough curves in the validation drift appendices 5-9

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

    Flowrate curves for the 53 sampling areas in the validation drift with measurable flowrates are given. The sampling area 267 is treated as three separate sampling areas: 267:1, 267:2 and 267:3. The total flowrate for these three sampling areas is given in a separate plot. The flowrates are given in ml/h. The time is given in hours since April 27 00:00, 1990. Disturbances in flowrates are observed after 8500 hours due to the opening of boreholes C1 and W1; results from flowrate measurements after 8500 hours are therefore excluded. The tracer breakthrough curves for 38 sampling areas in the validation drift are given as concentration values versus time. The sampling area 267 is treated as three separate sampling areas: 267:1, 267:2 and 267:3. This gives a total of 40 breakthrough curves for each tracer.

  1. Validation of the ABBN/CONSYST constants system. Part 2: Validation through the critical experiments on cores with uranium solutions

    International Nuclear Information System (INIS)

    Ivanova, T.T.; Manturov, G.N.; Nikolaev, M.N.; Rozhikhin, E.V.; Semenov, M.Yu.; Tsiboulia, A.M.

    1999-01-01

    Results of calculations of critical assemblies with the cores of uranium solutions for the considered series of the experiments are presented in this paper. The conclusions about acceptability of the ABBN-93.1 cross sections for the calculations of such systems are made. (author)

  2. Validation of ASTECV2.1 based on the QUENCH-08 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Gómez-García-Toraño, Ignacio, E-mail: ignacio.torano@kit.edu [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Sánchez-Espinoza, Víctor-Hugo; Stieglitz, Robert [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Stuckert, Juri [Karlsruhe Institute of Technology, Institute for Applied Materials-Applied Materials Physics (IAM-AWP), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Laborde, Laurent; Belon, Sébastien [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), Nuclear Safety Division/Safety Research/Severe Accident Department, Saint Paul Lez Durance 13115 (France)

    2017-04-01

    Highlights: • ASTECV2.1 can reproduce QUENCH-08 experimental trends, e.g. hydrogen generation. • Radial temperature gradient and heat transfer through the argon gap are underestimated. • Mesh sizes lower than 55 mm are needed to capture the strong axial temperature gradient. • Minor variations of external electrical resistance strongly affect bundle heat-up. • Modelling of a bypass and inclusion of currents partially overcome discrepancies. - Abstract: The Fukushima accidents have shown that further improvements of Severe Accident Management Guidelines (SAMGs) are still necessary. Hence, the enhancement of severe accident codes and their validation based on integral experiments is pursued worldwide. In particular, the capabilities of the European integral severe accident code ASTECV2.1 are being extended within the CESAM project through the improvement of physical models, code numerics and an extensive code validation. Among the different strategies encompassed in the plant SAMGs, one of the most important ones to prevent core damage is the injection of water into the overheated core (reflooding). However, under certain conditions, reflooding may trigger a sharp hydrogen generation that may jeopardize the containment. Within this work, ASTECV2.1 models describing the early in-vessel phase of the severe accident and its termination by core reflooding are validated against data from the QUENCH test facility. The QUENCH-08 test, involving the injection of 15 g/s (about 0.6 g/s/rod) of saturated steam at a bundle temperature of 2073 K, has been selected for this comparison. Results show that ASTECV2.1 is able to reproduce the experimental temperatures and oxide thicknesses at representative bundle locations. The predicted total hydrogen generation (76 g) is similar to the experimental one (84 g). In addition, the choices of an axial mesh size lower than 55 mm and of an external electrical resistance of 7 mΩ/rod have been justified with parametric analyses. Finally, new

  3. Validation of scaffold design optimization in bone tissue engineering: finite element modeling versus designed experiments.

    Science.gov (United States)

    Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra

    2017-02-21

    This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression testing. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty-four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA content (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of the DE and COMSOL models (measured moduli 4.5-9.9 MPa). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for
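The mean absolute percentage error used above to compare DE and COMSOL predictions against measurements is straightforward to compute. A minimal sketch with illustrative modulus values, not the study's data:

```python
def mape(measured, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(p - m) / abs(m)
                       for m, p in zip(measured, predicted)) / len(measured)

# Illustrative scaffold moduli (MPa): measured versus a model's predictions
measured  = [4.5, 6.0, 7.2, 9.9]
predicted = [5.0, 5.5, 8.0, 9.0]
print(round(mape(measured, predicted), 1))   # -> 9.9
```

Because MAPE normalizes each residual by the measured value, it lets the porosity and modulus predictions be compared on the same percentage scale despite their different units.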

  4. Copper benchmark experiment at the Frascati Neutron Generator for nuclear data validation

    Energy Technology Data Exchange (ETDEWEB)

    Angelone, M., E-mail: maurizio.angelone@enea.it; Flammini, D.; Loreti, S.; Moro, F.; Pillon, M.; Villari, R.

    2016-11-01

    Highlights: • A benchmark experiment was performed using pure copper with 14 MeV neutrons. • The experiment was performed at the Frascati Neutron Generator (FNG). • Activation foils, thermoluminescent dosimeters and scintillators were used to measure reaction rates (RR), nuclear heating and neutron spectra. • The paper presents the RR measurements and the post analysis using MCNP5 and the JEFF-3.1.1, JEFF-3.2 and FENDL-3.1 libraries. • C/Es are presented, showing the need for a deep revision of Cu cross sections. - Abstract: A neutronics benchmark experiment on a pure copper block (dimensions 60 × 70 × 60 cm³), aimed at testing and validating the recent nuclear data libraries for fusion applications, was performed at the 14-MeV Frascati Neutron Generator (FNG) as part of a F4E specific grant (F4E-FPA-395-01) assigned to the European Consortium on Nuclear Data and Experimental Techniques. The relevant neutronics quantities (e.g., reaction rates, neutron flux spectra, doses, etc.) were measured using different experimental techniques and the results were compared to the calculated quantities using fusion-relevant nuclear data libraries. This paper focuses on the analyses carried out by ENEA through the activation foil technique. The 197Au(n,γ)198Au, 186W(n,γ)187W, 115In(n,n′)115In, 58Ni(n,p)58Co, 27Al(n,α)24Na and 93Nb(n,2n)92mNb activation reactions were used. The foils were placed at eight different positions along the Cu block and irradiated with 14 MeV neutrons. Activation measurements were performed by means of a High Purity Germanium (HPGe) detector. Detailed simulation of the experiment was carried out using the MCNP5 Monte Carlo code and the European JEFF-3.1.1 and 3.2 nuclear cross-section data files for neutron transport and the IRDFF-v1.05 library for the reaction rates in activation foils. The calculated reaction rates (C) were compared to the experimental quantities (E) and
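Unfolding a reaction rate from an HPGe peak count involves the standard irradiation/cooling/counting decay bookkeeping. A hedged sketch for the 197Au(n,γ)198Au foil: the 198Au half-life is real, but the counts, efficiency, γ intensity, foil size and timings below are invented for illustration:

```python
import math

def reaction_rate(counts, eff, intensity, lam, t_i, t_c, t_m, n_atoms):
    """Reactions per target nucleus per second, unfolded from a net HPGe peak
    count using the usual irradiation/cooling/counting decay corrections."""
    sat   = 1.0 - math.exp(-lam * t_i)          # build-up during irradiation
    decay = math.exp(-lam * t_c)                # cool-down before counting
    count = (1.0 - math.exp(-lam * t_m)) / lam  # decay during the count
    return counts / (eff * intensity * n_atoms * sat * decay * count)

# 198Au half-life: 2.695 d -> decay constant in 1/s; other values illustrative
lam = math.log(2.0) / (2.695 * 86400.0)
rr = reaction_rate(counts=1.2e5, eff=0.03, intensity=0.956, lam=lam,
                   t_i=3600.0, t_c=7200.0, t_m=600.0, n_atoms=3.0e20)
print(f"inferred rate: {rr:.2e} reactions/nucleus/s")
```

Dividing the MCNP-calculated rate at the same foil position by this experimental value gives the C/E ratios the paper reports.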

  5. Optimization of the Severe Accident Management Strategy for Domestic Plants and Validation Experiments

    International Nuclear Information System (INIS)

    Kim, S. B.; Kim, H. D.; Koo, K. M.; Park, R. J.; Hong, S. H.; Cho, Y. R.; Kim, J. T.; Ha, K. S.; Kang, K. H.

    2007-04-01

    nuclear power plants, a technical basis report and computational aid tools were developed in parallel with the experimental and analytical works for the resolution of the uncertain safety issues. ELIAS experiments were carried out to quantify the boiling heat removal rate at the upper surface of a metallic layer for precise evaluations of the effect of a late in-vessel coolant injection. T-HERMES experiments were performed to examine the two-phase natural circulation phenomena through the gap between the reactor vessel and the insulator in the APR1400. Detailed analyses on the hydrogen control in the APR1400 containment were performed, focused on the effect of spray system actuation on the hydrogen burning and the evaluation of the hydrogen behavior in the IRWST. To develop the technical basis report for severe accident management, analyses using the SCDAP/RELAP5 code were performed for the accident sequences of the OPR1000. Based on the experimental and analytical results of this study, computational aids for the evaluation of hydrogen flammability in the containment, criteria for in-vessel corium cooling, and criteria for external reactor vessel cooling were developed. An ASSA code was developed to validate the signals from the instrumentation during severe accidents and to process abnormal signals. Since ASSA can perform signal processing from the direct input of the nuclear power plant during a severe accident, it can serve as the platform for the computational aids. In this study, ASSA was linked with the computational aids for hydrogen flammability

  6. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block-type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The executed benchmark is based on the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which helps improve the accuracy of neutron transport calculations and may aid in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which, in turn, depend on the accuracy of nuclear data libraries. Thus, evaluation of the applicability of the libraries to VHTR modelling is an important subject. We compared the numerical experiment results with experimental measurements using two versions of the available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The computed keff values show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we propose appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.
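The acceptance test behind "agreement within the 1σ range of the experimental uncertainty" is a one-line comparison. A minimal sketch with invented keff values, not the VHTRC benchmark numbers:

```python
def within_n_sigma(k_calc, k_exp, sigma_exp, n=1.0):
    """Is a calculated keff consistent with experiment within n sigma?"""
    return abs(k_calc - k_exp) <= n * sigma_exp

# Illustrative values only
print(within_n_sigma(1.00142, 1.00080, 0.00090))   # -> True, consistent at 1 sigma
print(within_n_sigma(1.00350, 1.00080, 0.00090))   # -> False
```

In practice the Monte Carlo statistical uncertainty on k_calc would also be folded into sigma before applying the test.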

  7. Optimization of the Severe Accident Management Strategy for Domestic Plants and Validation Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. B.; Kim, H. D.; Koo, K. M.; Park, R. J.; Hong, S. H.; Cho, Y. R.; Kim, J. T.; Ha, K. S.; Kang, K. H

    2007-04-15

nuclear power plants, a technical basis report and computational aid tools were developed in parallel with the experimental and analytical work for the resolution of uncertain safety issues. ELIAS experiments were carried out to quantify the boiling heat removal rate at the upper surface of a metallic layer for a precise evaluation of the effect of a late in-vessel coolant injection. T-HERMES experiments were performed to examine the two-phase natural circulation phenomena through the gap between the reactor vessel and the insulator in the APR1400. Detailed analyses of hydrogen control in the APR1400 containment were performed, focusing on the effect of spray system actuation on hydrogen burning and on the evaluation of hydrogen behavior in the IRWST. To develop the technical basis report for severe accident management, analyses using the SCDAP/RELAP5 code were performed for the accident sequences of the OPR1000. Based on the experimental and analytical results of this study, computational aids were developed for the evaluation of hydrogen flammability in the containment, the criteria for in-vessel corium cooling, and the criteria for external reactor vessel cooling. The ASSA code was developed to validate the signals from the instrumentation during severe accidents and to process abnormal signals. Since ASSA can process signals directly from the nuclear power plant input during a severe accident, it can serve as a platform for the computational aids. In this study, ASSA was linked with the computational aid for hydrogen flammability.

  8. Validation of the new filters configuration for the RPC gas systems at LHC experiments

    CERN Document Server

    Mandelli, Beatrice; Guida, Roberto; Hahn, Ferdinand; Haider, Stefan

    2012-01-01

Resistive Plate Chambers (RPCs) are widely employed as muon trigger systems at the Large Hadron Collider (LHC) experiments. Their large detector volume and the use of a relatively expensive gas mixture make closed-loop gas circulation unavoidable. The return gas of RPCs operated in conditions similar to the experimental background foreseen at the LHC contains a large amount of impurities potentially dangerous for long-term operation. Several gas-cleaning agents, characterized during the past years, are currently in use. New tests allowed an understanding of the properties and performance of a large number of purifiers. On that basis, an optimal combination of different filters, consisting of Molecular Sieve (MS) 5Å and 4Å and a Cu catalyst R11, was chosen and validated by irradiating a set of RPCs at the CERN Gamma Irradiation Facility (GIF) for several years. A very important feature of this new configuration is the increase of the cycle duration for each purifier, which results in better system stability...

  9. Investigation of the uncertainty of a validation experiment due to uncertainty in its boundary conditions

    International Nuclear Information System (INIS)

    Harris, J.; Nani, D.; Jones, K.; Khodier, M.; Smith, B.L.

    2011-01-01

Elements contributing to uncertainty in experimental repeatability are quantified for data acquisition in a bank of cylinders. The cylinder bank resembles the lower plenum of a high temperature reactor, with cylinders arranged on equilateral triangles with a pitch-to-diameter ratio of 1.7. The 3-D as-built geometry was measured by imaging reflections off the internal surfaces of the facility. This information is useful for building CFD grids for validation studies. Time-averaged Particle Image Velocimetry (PIV) measurements were acquired daily over several months along with the pressure drop between two cylinders. The atmospheric pressure was recorded with each data set. The PIV data and pressure drop were correlated with atmospheric conditions and changes in the experimental setup. It was found that atmospheric conditions play little role in the channel velocity but impact the pressure drop significantly. The adjustments made to the experimental setup did not change the results. However, in some cases, the wake behind a cylinder was shifted significantly from one day to the next. These changes did not correlate with ambient pressure, room temperature, or tear-downs/rebuilds of the facility. (author)
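The correlation screening described here, checking daily channel velocity and pressure drop against ambient conditions, can be sketched with a plain Pearson correlation coefficient. The daily records below are invented for illustration and are not the facility's data.

```python
from math import sqrt

# Illustrative daily records (invented, not from the study):
# atmospheric pressure [kPa], channel velocity [m/s], cylinder pressure drop [Pa]
p_atm    = [86.1, 85.7, 86.4, 85.9, 86.2, 85.5]
velocity = [2.31, 2.30, 2.32, 2.31, 2.30, 2.31]
dp       = [412., 398., 421., 405., 415., 391.]

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

print("r(p_atm, velocity):", round(pearson_r(p_atm, velocity), 2))
print("r(p_atm, dp):      ", round(pearson_r(p_atm, dp), 2))
```

With these invented values the pressure drop tracks ambient pressure strongly while the channel velocity barely responds, the same qualitative pattern the abstract reports.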

  10. Validation Study of Unnotched Charpy and Taylor-Anvil Impact Experiments using Kayenta

    Energy Technology Data Exchange (ETDEWEB)

    Kamojjala, Krishna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lacy, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chu, Henry S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Brannon, Rebecca [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    Validation of a single computational model with multiple available strain-to-failure fracture theories is presented through experimental tests and numerical simulations of the standardized unnotched Charpy and Taylor-anvil impact tests, both run using the same material model (Kayenta). Unnotched Charpy tests are performed on rolled homogeneous armor steel. The fracture patterns using Kayenta’s various failure options that include aleatory uncertainty and scale effects are compared against the experiments. Other quantities of interest include the average value of the absorbed energy and bend angle of the specimen. Taylor-anvil impact tests are performed on Ti6Al4V titanium alloy. The impact speeds of the specimen are 321 m/s and 393 m/s. The goal of the numerical work is to reproduce the damage patterns observed in the laboratory. For the numerical study, the Johnson-Cook failure model is used as the ductile fracture criterion, and aleatory uncertainty is applied to rate-dependence parameters to explore its effect on the fracture patterns.

  11. The reactor kinetics code tank: a validation against selected SPERT-1b experiments

    International Nuclear Information System (INIS)

    Ellis, R.J.

    1990-01-01

The two-dimensional space-time analysis code TANK is being developed for the simulation of transient behaviour in the MAPLE class of research reactors. MAPLE research reactor cores are compact, light-water-cooled and -moderated, with a high degree of forced subcooling. The SPERT-1B(24/32) reactor core had many similarities to MAPLE-X10, and the results of the SPERT transient experiments are well documented. As a validation of TANK, a series of simulations of certain SPERT reactor transients was undertaken. Special features were added to the TANK code to model reactors with plate-type fuel and to allow for the simulation of rapid void production. The results of a series of super-prompt-critical reactivity step-insertion transient simulations are presented. The selected SPERT transients were all initiated from low power, at ambient temperatures, and with negligible coolant flow. The results of the TANK simulations are in good agreement with the trends in the experimental SPERT data

  12. The List of Threatening Experiences: the reliability and validity of a brief life events questionnaire.

    Science.gov (United States)

    Brugha, T S; Cragg, D

    1990-07-01

    During the 23 years since the original work of Holmes & Rahe, research into stressful life events on human subjects has tended towards the development of longer and more complex inventories. The List of Threatening Experiences (LTE) of Brugha et al., by virtue of its brevity, overcomes difficulties of clinical application. In a study of 50 psychiatric patients and informants, the questionnaire version of the list (LTE-Q) was shown to have high test-retest reliability, and good agreement with informant information. Concurrent validity, based on the criterion of independently rated adversity derived from a semistructured life events interview, making use of the Life Events and Difficulties Scales (LEDS) method developed by Brown & Harris, showed both high specificity and sensitivity. The LTE-Q is particularly recommended for use in psychiatric, psychological and social studies in which other intervening variables such as social support, coping, and cognitive variables are of interest, and resources do not allow for the use of extensive interview measures of stress.

  13. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and conditions for the atmosphere surrounding the fuel (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions

  14. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments

    Directory of Open Access Journals (Sweden)

    Gyöngyi Munkácsy

    2016-01-01

Full Text Available No independent cross-validation of the success rate of studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. To assess the influence of experimental parameters such as cell line, transfection technique, validation method, and type of control, we validated these in a large set of studies. We utilized gene chip data published for siRNA experiments to assess the success rate and to compare the methods used in these experiments. We searched NCBI GEO for samples with whole-transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal–Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether, 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively; P = 9.3E−06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively; P = 2.8E−04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. The selection of the cell line model and the validation method had the highest influence on silencing proficiency.
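The study's silencing verdicts follow directly from the fold change of the target gene's expression after knockdown. A minimal sketch of the FC thresholds reported in the abstract (FC above 0.7 counted as failure, above 0.5 as insufficient); the expression values are invented for illustration.

```python
# Hypothetical expression values (arbitrary units) for one target gene across
# paired control/siRNA arrays; not data from the study.
control  = [8.2, 7.9, 8.5, 8.1]
silenced = [3.9, 4.1, 4.4, 3.8]

# Fold change of target expression after silencing (silenced / control)
fc = sum(silenced) / sum(control)

# Thresholds from the abstract: FC > 0.7 => failed, FC > 0.5 => insufficient
if fc > 0.7:
    verdict = "failed"
elif fc > 0.5:
    verdict = "insufficient"
else:
    verdict = "effective"
print(f"FC = {fc:.2f} -> {verdict}")
```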

  15. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability?

    Science.gov (United States)

    Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P

    2017-12-01

    The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs. We conceptualize these methods to create a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.

  16. A systematic review of the reliability and validity of discrete choice experiments in valuing non-market environmental goods.

    Science.gov (United States)

    Rakotonarivo, O Sarobidy; Schaafsma, Marije; Hockley, Neal

    2016-12-01

While discrete choice experiments (DCEs) are increasingly used in the field of environmental valuation, they remain controversial because of their hypothetical nature and the contested reliability and validity of their results. We systematically reviewed evidence on the validity and reliability of environmental DCEs from the past thirteen years (January 2003-February 2016). 107 articles met our inclusion criteria. These studies provide limited and mixed evidence of the reliability and validity of DCEs. Valuation results were susceptible to small changes in survey design in 45% of outcomes reporting reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2-90% of respondents protested against a feature of the survey, and a considerable proportion found DCEs to be incomprehensible or inconsequential (17-40% and 10-62%, respectively). DCE remains useful for non-market valuation, but its results should be used with caution. Given the sparse and inconclusive evidence base, we recommend that tests of reliability and validity are more routinely integrated into DCE studies and suggest how this might be achieved. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    Science.gov (United States)

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  18. Examining the Internal Validity and Statistical Precision of the Comparative Interrupted Time Series Design by Comparison with a Randomized Experiment

    Science.gov (United States)

    St.Clair, Travis; Cook, Thomas D.; Hallberg, Kelly

    2014-01-01

    Although evaluators often use an interrupted time series (ITS) design to test hypotheses about program effects, there are few empirical tests of the design's validity. We take a randomized experiment on an educational topic and compare its effects to those from a comparative ITS (CITS) design that uses the same treatment group as the experiment…

  19. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    Science.gov (United States)

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

Abstract Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (with only a few hundred lncRNAs) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species and is 2.9 times larger than the current largest database of experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  20. [Validation of knowledge acquired from experience: opportunity or threat for nurses working in operating theatres?].

    Science.gov (United States)

    Chauvat-Bouëdec, Cécile

    2005-06-01

Law no. 2002-73 of 17 January 2002, the law of social modernisation, as it is called, reformed continuing professional training in France. It established a new system of professional certification: the validation of knowledge acquired from experience (VAE in French). Since 2003, the Health Ministry has been studying a project to set up the VAE for the health professions, including, in particular, the profession of state registered nurse working in operating theatres (IBODE in French). A state diploma certifies the training required to practise this profession; in the future, the VAE will open a new path of access to this diploma. Does this evolution constitute a threat to the profession, and a risk or an opportunity for individuals? The aim of this thesis is to characterise the impact of the VAE on the IBODE profession and its current training system. Two approaches, sociological and educational, are supported by a field survey. A historical background of the IBODE profession traces the evolution of caring practices and presents the evolution of the training systems. A sociological approach analyses the vocational focus of the IBODE in light of functionalist theories; this study suggests that the VAE will have no consequences for the vocational focus of the IBODE. The VAE is then examined from an educational standpoint within the context of continuing professional training: the topics to which it could apply and the resistance it provokes are studied, with examples taken from other Ministries. This study shows that the VAE requires an adaptation of the training centres. The VAE constitutes a genuine opportunity for the IBODE profession. However, to manage its introduction in a delicate human context, field professionals should be involved as early as possible in the reflection initiated by the Ministry.

  1. Model validation of GAMMA code with heat transfer experiment for KO TBM in ITER

    International Nuclear Information System (INIS)

    Yum, Soo Been; Lee, Eo Hwak; Lee, Dong Won; Park, Goon Cherl

    2013-01-01

Highlights: ► In this study, a helium supply system was constructed. ► Preparation for a heat transfer experiment under KO TBM conditions using the helium supply system was carried out. ► To obtain more applicable results, the test matrix was designed to cover the KO TBM conditions. ► Using the CFD code CFX 11, validation and modification of the system code GAMMA were performed. -- Abstract: By considering the requirements for a DEMO-relevant blanket concept, Korea (KO) has proposed a He-cooled molten lithium (HCML) test blanket module (TBM) for testing in ITER. A performance analysis of the thermal–hydraulics and a safety analysis for the KO TBM have been carried out using a commercial CFD code, ANSYS-CFX, and a system code, GAMMA (GAs Multicomponent Mixture Analysis), developed by the gas cooled reactor program in Korea. To verify the codes, a preliminary study was performed by Lee using a single TBM first wall (FW) mock-up made from the same material as the KO TBM, ferritic martensitic steel, in a 6 MPa nitrogen gas loop. The test was performed at pressures of 1.1, 1.9 and 2.9 MPa, over a range of flow rates from 0.0105 to 0.0407 kg/s, with a constant wall temperature condition. In the present study, a thermal–hydraulic test was performed with the newly constructed helium supply system, whose design pressure and temperature were 9 MPa and 500 °C, respectively. The same mock-up was used, and the test was performed at 3 MPa pressure, 30 °C inlet temperature and 70 m/s helium velocity, which are almost the same conditions as the KO TBM FW. One side of the mock-up was heated with a constant heat flux of 0.3–0.5 MW/m2 using a graphite heating system, KoHLT-2 (Korea heat load test facility-2). Because the comparison between CFX 11 and GAMMA showed differing tendencies, the heat transfer correlation included in GAMMA was modified. The modified GAMMA shows strong agreement with CFX
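The abstract does not state which heat transfer correlation in GAMMA was modified. As a hedged illustration only, a Dittus-Boelter-type correlation of the kind commonly used as a baseline in gas-cooled system codes can be evaluated as follows; the helium flow conditions are assumed, not taken from the paper.

```python
# Illustrative baseline correlation, NOT the correlation actually used in GAMMA:
# Dittus-Boelter for turbulent pipe flow with heating, Nu = 0.023 Re^0.8 Pr^0.4.
def nusselt_dittus_boelter(re, pr):
    """Nusselt number from Reynolds and Prandtl numbers (turbulent, heated wall)."""
    return 0.023 * re**0.8 * pr**0.4

# Assumed helium channel conditions, roughly in a gas-cooled FW range (hypothetical)
re, pr = 1.2e5, 0.66
nu = nusselt_dittus_boelter(re, pr)
print(f"Re = {re:.2e}, Pr = {pr}: Nu = {nu:.0f}")
```

Modifying such a correlation (its coefficient or exponents) against CFD or experimental data is the kind of adjustment the abstract describes, though the specific form used in GAMMA is not given there.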

  2. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  3. Evaluation of microplastics in Jurujuba Cove, Niterói, RJ, Brazil, an area of mussels farming.

    Science.gov (United States)

    Castro, Rebeca Oliveira; Silva, Melanie L; Marques, Mônica Regina C; de Araújo, Fábio V

    2016-09-15

Because they are non-biodegradable, microplastics remain in the environment, absorbing toxic hydrophobic compounds that make them a risk to biodiversity when they are ingested or filtered by organisms and enter the food chain. To evaluate the potential contamination by microplastics of mussels cultivated in Jurujuba Cove, Niterói, RJ, waters from three stations were collected during the rainy and dry seasons using a plankton net and later filtered. Microplastics were quantified and characterized morphologically and chemically. The results showed a high concentration of microplastics in both seasons, with a diversity of colors, types and sizes. Synthetic polymers were present in all samples. The presence of microplastics was probably due to the high and constant load of effluent that this area receives and to the mussel farming activity, which uses many plastic materials. Areas with high concentrations of microplastics should not be used for mussel cultivation due to the risk of contamination to consumers. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Siegel, M.D.; Cheng, W.C.; Ward, D.B.; Bryan, C.R.

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, hydrologic flow field and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project

  5. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, M.D.; Cheng, W.C. [Sandia National Labs., Albuquerque, NM (United States); Ward, D.B.; Bryan, C.R. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Earth and Planetary Sciences

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, hydrologic flow field and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  6. Development and validation of the Consumer Quality index instrument to measure the experience and priority of chronic dialysis patients.

    Science.gov (United States)

    van der Veer, Sabine N; Jager, Kitty J; Visserman, Ella; Beekman, Robert J; Boeschoten, Els W; de Keizer, Nicolette F; Heuveling, Lara; Stronks, Karien; Arah, Onyebuchi A

    2012-08-01

Patient experience is an established indicator of quality of care. Validated tools that measure both experiences and priorities are lacking for chronic dialysis care, hampering identification of negative experiences that patients actually rate as important. We developed two Consumer Quality (CQ) index questionnaires, one for in-centre haemodialysis (CHD) and the other for peritoneal dialysis and home haemodialysis (PHHD) care. The instruments were validated using exploratory factor analyses, reliability analysis of the identified scales, and assessment of the association between reliable scales and global ratings. We investigated opportunities for improvement by combining suboptimal experience with patient priority. Sixteen dialysis centres participated in our study. The pilot CQ index for CHD care consisted of 71 questions. Based on data from 592 respondents, we identified 42 core experience items in 10 scales with Cronbach's α ranging from 0.38 to 0.88; five were reliable (α ≥ 0.70). The instrument identified information on centres' fire procedures as the aspect of care exhibiting the biggest opportunity for improvement. The pilot CQ index for PHHD comprised 56 questions. The responses of 248 patients yielded 31 core experience items in nine scales with Cronbach's α ranging between 0.53 and 0.85; six were reliable. Information on kidney transplantation during pre-dialysis showed the most room for improvement. However, for both types of care, opportunities for improvement were mostly limited. The CQ index reliably and validly captures dialysis patient experience. Overall, most care aspects showed limited room for improvement, mainly because patients participating in our study rated their experience as optimal. To evaluate items with high priority, but with which relatively few patients have experience, more qualitative instruments should be considered.
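The scale reliability criterion used in this study (Cronbach's α ≥ 0.70) can be computed directly from item scores. A minimal sketch follows; the respondent-by-item score matrix is invented for illustration and is not CQ index data.

```python
import statistics

# Cronbach's alpha for one scale:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
# Scores below (5 respondents x 4 items) are invented for illustration.
items = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 3, 2, 3],
    [4, 5, 4, 4],
]

k = len(items[0])                                               # items in the scale
item_var = sum(statistics.variance(col) for col in zip(*items)) # sum of per-item sample variances
total_var = statistics.variance(sum(row) for row in items)      # variance of respondents' totals
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"alpha = {alpha:.2f}  (scales with alpha >= 0.70 were considered reliable)")
```

With these toy scores the items co-vary strongly, so α lands well above the 0.70 reliability threshold the study applied.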

  7. A validation study for the gas migration modelling of the compacted bentonite using existing experiment data

    International Nuclear Information System (INIS)

    Tawara, Y.; Mori, K.; Tada, K.; Shimura, T.; Sato, S.; Yamamoto, S.; Hayashi, H.

    2010-01-01

Document available in extended abstract form only. After the field-scale Gas Migration Test (GMT) was carried out at the Grimsel Test Site (GTS) in Switzerland from 1997 through 2005, a study on advanced gas migration modelling was conducted as part of the R and D programs of the RWMC (Radioactive Waste Management Funding and Research Center) to evaluate the long-term behaviour of the Engineered Barrier System (EBS) for the TRU waste disposal system in Japan. One of the main objectives of this modelling study is to provide qualified models and parameters in order to predict long-term gas migration behaviour in compacted bentonite. In addition, from the perspective of coupled THMC (Thermal, Hydrological, Mechanical and Chemical) processes, the specific processes which may have a considerable impact on gas migration behaviour are discussed by means of scoping calculations. A literature survey was conducted to collect experimental data related to gas migration in compacted bentonite in order to assess the applicability of the existing gas migration models to bentonite. The well-known flow-rate-controlled gas injection experiment by Horseman et al. and the pressure-controlled gas injection tests by Graham et al., with data covering a wide range of clay density and water content, were selected. These studies show the following characteristic behaviour of gas migration in highly compacted and water-saturated bentonite. The gas flow rate observed at the outlet in the experiment by Horseman et al. was numerically reproduced using different conceptual models and computer codes, and the applicability of the models and the identified key parameters, such as relative permeability and capillary pressure, were discussed. Helium gas was repeatedly injected into fully water-saturated and isotropically consolidated MX-80 bentonite (dry density: 1.6 Mg/m3) in the experiment. One of the most important conclusions from this experiment is that it's impossible for

  8. Use of integral experiments in support to the validation of JEFF-3.2 nuclear data evaluation

    Science.gov (United States)

    Leclaire, Nicolas; Cochet, Bertrand; Jinaphanh, Alexis; Haeck, Wim

    2017-09-01

For many years now, IRSN has developed its own continuous-energy Monte Carlo capability, which allows testing various nuclear data libraries. To that end, a validation database of 1136 experiments was built from cases used for the validation of the APOLLO2-MORET 5 multigroup route of the CRISTAL V2.0 package. In this paper, the keff values obtained for more than 200 benchmarks using the JEFF-3.1.1 and JEFF-3.2 libraries are compared to the benchmark keff values, and the main discrepancies are analyzed with respect to the neutron spectrum. Special attention is paid to benchmarks whose results changed significantly between the two JEFF-3 versions.

  9. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

    Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. These sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to each experiment in the benchmarking set. Correlation coefficients are used to assess the similarity between systems and to determine the applicability of one system for the code and data validation of another. The applicability of most of the experiments identified using traditional methods was confirmed by the TSUNAMI analysis. In addition, some PuO2 and MOX powder systems were determined to be within the area of applicability of several other benchmarks that would not have been considered using traditional methods. Therefore, the number of benchmark experiments useful for the validation of these systems exceeds the number previously expected. The TSUNAMI analysis
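The correlation coefficient computed by TSUNAMI-IP (often denoted c_k) measures how much of two systems' keff uncertainties stems from the same cross-section covariance data; a value near 1 indicates the benchmark covers the application. A minimal sketch of that formula, using hypothetical three-group sensitivity profiles and a diagonal covariance matrix rather than SCALE's actual data:

```python
import numpy as np

def ck(s_app, s_exp, cov):
    """Correlation of two systems' keff uncertainties induced by shared
    cross-section covariance data: ck = S_a C S_e / (sigma_a * sigma_e)."""
    num = s_app @ cov @ s_exp
    sigma_a = np.sqrt(s_app @ cov @ s_app)
    sigma_e = np.sqrt(s_exp @ cov @ s_exp)
    return num / (sigma_a * sigma_e)

# Illustrative 3-group keff sensitivity profiles (hypothetical numbers)
s_application = np.array([0.10, 0.30, 0.05])
s_experiment = np.array([0.12, 0.28, 0.04])
cov = np.diag([0.02**2, 0.01**2, 0.03**2])  # uncorrelated group uncertainties

print(round(ck(s_application, s_experiment, cov), 3))
```

With fully shared uncertainty sources c_k approaches 1; validation practice commonly treats systems with c_k above roughly 0.9 as similar, though the cutoff is a judgment call.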

  10. Complementary role of critical integral experiment and power reactor start-up experiments for LMFBR neutronics data and method validation

    International Nuclear Information System (INIS)

    Salvatores, M.

    1986-09-01

    Both critical experiments and power reactor results currently play a complementary role in reducing the uncertainties in key design parameters for LMFBRs, which can be relevant to the economic performance of this type of reactor.

  11. Father for the first time - development and validation of a questionnaire to assess fathers’ experiences of first childbirth (FTFQ)

    Directory of Open Access Journals (Sweden)

    Premberg Åsa

    2012-05-01

    Full Text Available Abstract Background A father’s experience of the birth of his first child is important not only for his birth-giving partner but also for the father himself, his relationship with the mother, and the newborn. No validated questionnaire assessing first-time fathers' experiences during childbirth is currently available. Hence, the aim of this study was to develop and validate an instrument to assess first-time fathers’ experiences of childbirth. Method Domains and items were initially derived from interviews with first-time fathers, and supplemented by a literature search and a focus group interview with midwives. The comprehensibility, comprehension and relevance of the items were evaluated by four paternity research experts, and a preliminary questionnaire was pilot tested in eight first-time fathers. A revised questionnaire was completed by 200 first-time fathers (response rate = 81%). Exploratory factor analysis using principal component analysis with varimax rotation was performed, and multitrait scaling analysis was used to test scaling assumptions. External validity was assessed by means of known-groups analysis. Results Factor analysis yielded four factors comprising 22 items and accounting for 48% of the variance. The domains found were Worry, Information, Emotional support and Acceptance. Multitrait analysis confirmed the convergent and discriminant validity of the domains; however, Cronbach’s alpha did not meet conventional reliability standards in two domains. The questionnaire was sensitive to differences between groups of fathers hypothesized to differ on important sociodemographic or clinical variables. Conclusions The questionnaire adequately measures important dimensions of first-time fathers’ childbirth experience and may be used to assess aspects of fathers’ experiences during childbirth. To obtain the FTFQ and permission for its use, please contact the corresponding author.
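Cronbach's alpha, the per-domain reliability statistic reported here, depends only on the individual item variances and the variance of the total score. A minimal sketch with hypothetical Likert-scale responses (the data are illustrative, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of scores.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses from six fathers on a 4-item domain
scores = [[4, 5, 4, 5],
          [2, 2, 3, 2],
          [3, 3, 3, 4],
          [5, 4, 5, 5],
          [1, 2, 1, 2],
          [3, 4, 3, 3]]
print(round(cronbach_alpha(scores), 2))
```

Values of roughly 0.7 and above are the conventional reliability standard the abstract alludes to; perfectly parallel items drive alpha toward 1.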

  12. The Sensed Presence Questionnaire (SenPQ): initial psychometric validation of a measure of the “Sensed Presence” experience

    Directory of Open Access Journals (Sweden)

    Joseph M. Barnby

    2017-03-01

    Full Text Available Background The experience of ‘sensed presence’—a feeling or sense that another entity, individual or being is present despite no clear sensory or perceptual evidence—is known to occur in the general population, appears more frequently in religious or spiritual contexts, and seems to be prominent in certain psychiatric or neurological conditions; it may reflect specific functions of social cognition or body-image representation systems in the brain. Previous research has relied on ad-hoc measures of the experience, and no specific psychometric scale to measure it exists to date. Methods Based on phenomenological descriptions in the literature, we created the 16-item Sensed Presence Questionnaire (SenPQ). We recruited participants from (i) a general population sample and (ii) a sample specifically selected for religious affiliation, to complete the SenPQ and additional measures of well-being, schizotypy, social anxiety, social imagery, and spiritual experience. We completed an analysis to test internal reliability, the ability of the SenPQ to distinguish between religious and non-religious participants, and whether the SenPQ was specifically related to positive schizotypal experiences and social imagery. A factor analysis was also conducted to examine underlying latent variables. Results The SenPQ was found to be reliable and valid, with religious participants endorsing significantly more items than non-religious participants, and the scale showing a selective relationship with construct-relevant measures. Principal components analysis indicates two potential underlying factors interpreted as reflecting ‘benign’ and ‘malign’ sensed presence experiences. Discussion The SenPQ appears to be a reliable and valid measure of the sensed presence experience, although further validation in neurological and psychiatric conditions is warranted.

  13. CANDU radiotoxicity inventories estimation: A calculated experiment cross-check for data verification and validation

    International Nuclear Information System (INIS)

    Pavelescu, Alexandru Octavian; Cepraga, Dan Gabriel

    2007-01-01

    This paper is related to the Clearance Potential Index and the Ingestion and Inhalation Hazard Factors of nuclear spent fuel and radioactive wastes. This study required a complex activity consisting of various phases, such as the acquisition, setting up, validation and application of procedures, codes and libraries. The paper reflects the validation phase of this study. Its objective was to compare the measured inventories of selected actinide and fission product radionuclides in an element from a Pickering CANDU reactor with inventories predicted using a recent version of ORIGEN-ARP from SCALE 5 coupled with the time-dependent cross-section library CANDU 28.lib, produced by the SAS2H sequence of SCALE 4.4a. In this way, the procedures, codes and libraries for the characterization of radioactive material in terms of radioactive inventories, clearance, and biological hazard factors are being qualified and validated, in support of the safety management of radioactive wastes. (authors)

  14. Mold-filling experiments for validation of modeling encapsulation. Part 1, "wine glass" mold.

    Energy Technology Data Exchange (ETDEWEB)

    Castaneda, Jaime N.; Grillet, Anne Mary; Altobelli, Stephen A. (New Mexico Resonance, Albuquerque, NM); Cote, Raymond O.; Mondy, Lisa Ann

    2005-06-01

    The C6 project 'Encapsulation Processes' has been designed to obtain experimental measurements for the discovery of phenomena critical to improving these processes, as well as data required by the verification and validation plan (Rao et al. 2001) for model validation of flow in progressively complex geometries. We have observed and recorded the flow of clear Newtonian liquids and opaque, rheologically complex suspensions in two mold geometries. The first is a simple wine-glass geometry in a cylinder and is reported here in Part 1. Results in a more realistic encapsulation geometry are reported in Part 2.

  15. Reactivity worth measurements on fast burst reactor Caliban - description and interpretation of integral experiments for the validation of nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Richard, B. [Commissariat a l' Energie Atomique et Aux Energies Alternatives CEA, DAM, VALDUC, F-21120 Is-sur-Tille (France)

    2012-07-01

    Reactivity perturbation experiments using various materials are being performed on the HEU fast core CALIBAN, an experimental device operated by the CEA VALDUC Criticality and Neutron Transport Research Laboratory. These experiments provide valuable information contributing to the validation of nuclear data for the materials used in such measurements. This paper presents the results obtained in a first series of measurements performed with Au-197 samples. Experiments conducted to improve the characterization of the core are also described and discussed. The experimental results have been compared to numerical calculations using both deterministic and Monte Carlo neutron transport codes with a simplified model of the reactor. This early work led to a methodology that will be applied to future experiments concerning other materials of interest. (authors)

  16. A comparison of measurements and calculations for the Stripa validation drift inflow experiment

    International Nuclear Information System (INIS)

    Hodgkinson, D.P.; Cooper, N.S.

    1992-01-01

    This paper presents a comparison of measurements and predictions for groundwater flow into the validation drift and the remaining portions of the D-holes in the Site Characterisation and Validation (SCV) block. The comparison was carried out on behalf of the Stripa task force on fracture flow modelling. The paper summarises the characterisation data and their preliminary interpretation, and reviews the fracture flow modelling approaches and predictions made by teams from AEA Technology/Fracflow, Golder Associates and Lawrence Berkeley Laboratory. The predictions are compared with the inflow measurements on the basis of the validation process and criteria defined by the Task Force. The results of all three modelling groups meet the validation criteria, with the predicted inflows being of the same order of magnitude as the observations. The AEA/Fracflow and Golder approaches also allow the inflow pattern to be predicted, and this too is reproduced with reasonable accuracy. The successful completion of this project demonstrates the feasibility of discrete fracture flow modelling and, in particular, the ability to collect and analyse all the necessary characterisation data in a timely and economic manner. (32 refs.) (au)

  17. Development and validation of the BRIGHTLIGHT Survey, a patient-reported experience measure for young people with cancer.

    Science.gov (United States)

    Taylor, Rachel M; Fern, Lorna A; Solanki, Anita; Hooker, Louise; Carluccio, Anna; Pye, Julia; Jeans, David; Frere-Smith, Tom; Gibson, Faith; Barber, Julie; Raine, Rosalind; Stark, Dan; Feltbower, Richard; Pearce, Susie; Whelan, Jeremy S

    2015-07-28

    Patient experience is increasingly used as an indicator of high quality care in addition to more traditional clinical end-points. Surveys are generally accepted as an appropriate methodology to capture patient experience. No validated patient experience surveys exist specifically for adolescents and young adults (AYA) aged 13-24 years at diagnosis with cancer. This paper describes early work undertaken to develop and validate a descriptive patient experience survey for AYA with cancer that encompasses both their cancer experience and age-related issues. We aimed to develop, with young people, an experience survey meaningful and relevant to AYA to be used in a longitudinal cohort study (BRIGHTLIGHT), ensuring high levels of acceptability to maximise study retention. A three-stage approach was employed: Stage 1 involved developing a conceptual framework, conducting literature/Internet searches and establishing content validity of the survey; Stage 2 confirmed the acceptability of the methods of administration and consisted of four focus groups involving 11 young people (14-25 years), three parents and two siblings; and Stage 3 established survey comprehension through telephone-administered cognitive interviews with a convenience sample of 23 young people aged 14-24 years. Stage 1: Two hundred and thirty-eight questions were developed from qualitative reports of young people's cancer and treatment-related experience. Stage 2: The focus groups identified three core themes: (i) issues directly affecting young people, e.g. the impact of treatment-related fatigue on the ability to complete the survey; (ii) issues relevant to the actual survey, e.g. the ability to answer questions anonymously; (iii) administration issues, e.g. a confusing format in some supporting documents. Stage 3: Cognitive interviews indicated high levels of comprehension, requiring minor survey amendments. Collaborating with young people with cancer has enabled a survey to be developed that is both meaningful to young

  18. RCCS Experiments and Validation for High Temperature Gas-Cooled Reactor

    International Nuclear Information System (INIS)

    Chang Oh; Cliff Davis; Goon C. Park

    2007-01-01

    A reactor cavity cooling system (RCCS) consisting of an air-cooled helical coil unit immersed in a water pool was proposed to overcome the weak cooling ability of air-cooled RCCS designs and the complex structure of water-cooled RCCS designs for the high temperature gas-cooled reactor (HTGR). An experimental apparatus was constructed to investigate the various heat transfer phenomena in the water-pool-type RCCS, such as the natural convection of air inside the cavity, radiation in the cavity, the natural convection of water in the water pool and the forced convection of air in the cooling pipe. The RCCS experimental results were compared with published correlations. The CFX code was validated using data from the air-cooled portion of the RCCS. The RELAP5 code was validated using measured temperatures from the reactor vessel and cavity walls.

  19. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank tests for the 1:33-scale floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  20. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander

    2011-01-01

    from different databases, two statistical-narrow-band models and the exponential wide band model. The two statistical-narrow-band models, EM2C and RADCAL, showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was recommended as a reference model for the validation of simplified CFD models.

  1. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction

    Science.gov (United States)

    Davis, David Owen

    2015-01-01

    Preliminary results of an experimental investigation of a Mach 2.5 two-dimensional axisymmetric shock-wave/ boundary-layer interaction (SWBLI) are presented. The purpose of the investigation is to create a SWBLI dataset specifically for CFD validation purposes. Presented herein are the details of the facility and preliminary measurements characterizing the facility and interaction region. These results will serve to define the region of interest where more detailed mean and turbulence measurements will be made.

  2. Langmuir probe-based observables for plasma-turbulence code validation and application to the TORPEX basic plasma physics experiment

    International Nuclear Information System (INIS)

    Ricci, Paolo; Theiler, C.; Fasoli, A.; Furno, I.; Labit, B.; Mueller, S. H.; Podesta, M.; Poli, F. M.

    2009-01-01

    The methodology for plasma-turbulence code validation is discussed, with focus on the quantities to use for the simulation-experiment comparison, i.e., the validation observables, and application to the TORPEX basic plasma physics experiment [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)]. The considered validation observables are deduced from Langmuir probe measurements and are ordered into a primacy hierarchy, according to the number of model assumptions and to the combinations of measurements needed to form each of them. The lowest levels of the primacy hierarchy correspond to observables that require the fewest model assumptions and measurement combinations, such as the statistical and spectral properties of the ion saturation current time trace, while at the highest levels, quantities such as particle transport are considered. The comparison of observables at the lowest levels of the hierarchy is more stringent than at the highest levels. The proposed observables are applied to a specific TORPEX plasma configuration characterized by interchange-driven turbulence.

  3. Reactivity loss validation of high burn-up PWR fuels with pile-oscillation experiments in MINERVE

    Energy Technology Data Exchange (ETDEWEB)

    Leconte, P.; Vaglio-Gaudard, C.; Eschbach, R.; Di-Salvo, J.; Antony, M.; Pepino, A. [CEA, DEN, DER, Cadarache, F-13108 Saint-Paul-Lez-Durance (France)

    2012-07-01

    The ALIX experimental program relies on the experimental validation of the spent fuel inventory, by chemical analysis of samples irradiated in a PWR for 5 to 7 cycles, and on the experimental validation of the spent fuel reactivity loss with burn-up, obtained by pile-oscillation measurements in the MINERVE reactor. The latter experiments provide an overall validation of both the fuel inventory and the nuclear data responsible for the reactivity loss. The program also offers unique experimental data for fuels with burn-ups reaching 85 GWd/t, as spent fuel in French PWRs has never exceeded 70 GWd/t to date. The analysis of these experiments is done in two steps with the APOLLO2/SHEM-MOC/CEA2005v4 package. In the first step, the fuel inventory of each sample is obtained by assembly calculations. The calculation route consists of the self-shielding of cross sections on the 281-energy-group SHEM mesh, followed by a flux calculation by the Method of Characteristics in a 2D exact heterogeneous geometry of the assembly, and finally a depletion calculation by iterative resolution of the Bateman equations. In the second step, the fuel inventory is used in the analysis of pile-oscillation experiments in which the reactivity of the ALIX spent fuel samples is compared to the reactivity of fresh fuel samples. The comparison between experiment and calculation shows satisfactory results with the JEFF-3.1.1 library, which predicts the reactivity loss within 2% for burn-ups of ~75 GWd/t and within 4% for burn-ups of ~85 GWd/t. (authors)

  4. A newly designed multichannel scaling system: Validated by Feynman-α experiment in EHWZPR

    Energy Technology Data Exchange (ETDEWEB)

    Arkani, Mohammad, E-mail: markani@aeoi.org.ir; Mataji-Kojouri, Naimeddin

    2016-08-15

    Highlights: • An embedded measuring system with enhanced operational capabilities is introduced. • The design is low cost and reprogrammable. • The system design is dedicated to multi-detector experiments with very large data sets. • A Feynman-α experiment free of count-loss effects is performed in EHWZPR. • The results are compared with an endogenous/inherent pulsed neutron source experiment. - Abstract: In this work, a newly designed embedded multi-input, multi-million-channel MCS was constructed for multi-detector experimental research applications. Important characteristics of the system can be tuned to the experimental case study by exploiting the reprogrammable nature of the silicon. By differentiating the integrated counts registered in memory, the system acts as a measuring tool with zero channel-advance time, ideal for experiments on time-correlated random processes. Using this equipment, a Feynman-α experiment was performed in the Esfahan Heavy Water Zero Power Reactor (EHWZPR) utilizing three different in-core neutron detectors. One million channels of data were collected by the system with a 5 ms gate time from each neutron detector simultaneously. As heavy-water-moderated reactors are significantly slow systems, a huge number of data channels must be collected. The data were then analyzed using the bunching method, and the prompt neutron decay constant of the system was estimated for each neutron detector positioned in the core. The results were compared with the information provided by an endogenous pulsed neutron source experiment, and good agreement is seen within the statistical uncertainties of the results. This equipment makes possible further in-depth research in a range of stochastic experiments in nuclear physics, such as cross-correlation analysis of multi-detector experiments.
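The bunching method mentioned above rebuilds longer gate widths by summing adjacent base gates, so the whole Feynman-Y curve can be traced from a single measurement. A minimal sketch with synthetic Poisson counts (the data and gate sizes are illustrative, not the authors' EHWZPR measurements):

```python
import numpy as np

def feynman_y(counts):
    """Feynman-Y = variance-to-mean ratio minus one for gated counts."""
    c = np.asarray(counts, dtype=float)
    return c.var(ddof=1) / c.mean() - 1.0

def bunch(counts, factor):
    """Merge `factor` adjacent gates into one longer gate (bunching)."""
    c = np.asarray(counts)
    n = (len(c) // factor) * factor
    return c[:n].reshape(-1, factor).sum(axis=1)

rng = np.random.default_rng(0)
base = rng.poisson(lam=20.0, size=100_000)  # synthetic uncorrelated counts

# For a pure Poisson source, Y stays near 0 at every gate width; correlated
# fission chains would instead make Y grow toward an asymptote as gates widen,
# and the prompt neutron decay constant alpha is fitted from that growth.
for factor in (1, 4, 16):
    print(factor, round(feynman_y(bunch(base, factor)), 3))
```

In a reactor measurement one would fit Y(T) = Y_inf * (1 - (1 - exp(-alpha*T)) / (alpha*T)) to the bunched curve to extract alpha.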

  5. Use of integral experiments in support to the validation of JEFF-3.2 nuclear data evaluation

    Directory of Open Access Journals (Sweden)

    Leclaire Nicolas

    2017-01-01

    Full Text Available For many years now, IRSN has developed its own continuous-energy Monte Carlo capability, which allows various nuclear data libraries to be tested. To that end, a validation database of 1136 experiments was built from cases used for the validation of the APOLLO2-MORET 5 multigroup route of the CRISTAL V2.0 package. In this paper, the keff values obtained for more than 200 benchmarks using the JEFF-3.1.1 and JEFF-3.2 libraries are compared to benchmark keff values, and the main discrepancies are analyzed with respect to the neutron spectrum. Special attention is paid to benchmarks whose results changed substantially between the two JEFF-3 versions.

  6. Direct-contact condensers for open-cycle OTEC applications: Model validation with fresh water experiments for structured packings

    Energy Technology Data Exchange (ETDEWEB)

    Bharathan, D.; Parsons, B.K.; Althof, J.A.

    1988-10-01

    The objective of the reported work was to develop analytical methods for evaluating the design and performance of advanced high-performance heat exchangers for use in open-cycle ocean thermal energy conversion (OC-OTEC) systems. This report describes the progress made in validating a one-dimensional, steady-state analytical computer model against fresh water experiments. The condenser model represents the state of the art in direct-contact heat exchange for condensation for OC-OTEC applications and is expected to provide a basis for optimizing OC-OTEC plant configurations. Using the model, we examined two condenser geometries, a cocurrent and a countercurrent configuration. This report provides detailed validation results for important condenser parameters for cocurrent and countercurrent flows. Based on the comparisons and the uncertainty overlap between the experimental data and the predictions, the model is shown to predict critical condenser performance parameters with an uncertainty acceptable for general engineering design and performance evaluations. 33 refs., 69 figs., 38 tabs.

  7. Readout electronics validation and target detector assessment for the Neutrinos Angra experiment

    International Nuclear Information System (INIS)

    Alvarenga, T.A.; Anjos, J.C.; Azzi, G.; Cerqueira, A.S.; Chimenti, P.; Costa, J.A.; Dornelas, T.I.; Farias, P.C.M.A.; Guedes, G.P.; Gonzalez, L.F.G.; Kemp, E.; Lima, H.P.; Machado, R.; Nóbrega, R.A.; Pepe, I.M.; Ribeiro, D.B.S.; Simas Filho, E.F.; Valdiviesso, G.A.; Wagner, S.

    2016-01-01

    A compact surface detector designed to identify the inverse beta decay interactions produced by anti-neutrinos coming from a nearby operating nuclear reactor is being developed by the Neutrinos Angra Collaboration. In this document we describe and test the detector and its readout system by means of cosmic-ray data acquisition. In this measurement campaign, the target detector was equipped with sixteen 8-inch PMTs, and two scintillator paddles were used to trigger on cosmic-ray events. The results obtained reveal the main operational characteristics of the Neutrinos Angra system and have been used to assess the detector and to validate its readout system.

  8. Measuring Black men's police-based discrimination experiences: Development and validation of the Police and Law Enforcement (PLE) Scale.

    Science.gov (United States)

    English, Devin; Bowleg, Lisa; Del Río-González, Ana Maria; Tschann, Jeanne M; Agans, Robert P; Malebranche, David J

    2017-04-01

    Although social science research has examined police- and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men's perspectives on police- and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative testing of the Police and Law Enforcement (PLE) Scale. In Study 1, we used thematic analysis of transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used two focus groups comprised of 5 Black men each (n = 10), intensive cognitive interviewing with a separate sample of Black men (n = 15), and piloting with another sample of Black men (n = 13) to assess the ecological validity of the quantitative items. For Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents' experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men's experiences of discrimination with police/law enforcement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a

    Energy Technology Data Exchange (ETDEWEB)

    Xue, J; Park, J; Kim, L; Wang, C [MD Anderson Cancer Center at Cooper, Camden, NJ (United States); Balter, P; Ohrt, J; Kirsner, S; Ibbott, G [UT MD Anderson Cancer Center, Houston, TX (United States)

    2016-06-15

    Purpose: The newly published medical physics practice guideline (MPPG 5.a) sets the minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to the tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles and off-axis conditions to verify the robustness and limitations of the dose calculation algorithm. Measured and calculated doses were compared based on the validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used to measure dose at points of interest, and diodes were used for the photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. Disagreements are easily identifiable in the difference curve. Subtle discrepancies revealed the limitations of the measurements, e.g., a spike in the high-dose region and an asymmetrical penumbra observed in the tests with an oblique MLC beam. The excellent results (>98% pass rate with a 3%/3 mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of the measurements were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.
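The 3%/3 mm gamma index used for the IMRT/VMAT comparisons combines a dose-difference criterion with a distance-to-agreement criterion; a point passes when the minimum combined metric is at most 1. A minimal 1-D sketch under stated assumptions (global normalization to the reference maximum; the Gaussian profiles are hypothetical, not the authors' data):

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Global 1-D gamma index: for each reference point, the minimum over
    evaluated points of sqrt((dose diff / (dd*Dmax))^2 + (distance / dta)^2).
    dd is the dose criterion as a fraction of the max reference dose; dta in mm."""
    d_max = ref_dose.max()
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        dose_term = (eval_dose - rd) / (dd * d_max)
        dist_term = (eval_pos - rp) / dta
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return np.array(gammas)

# Hypothetical 1-D dose profiles on a 1 mm grid (not the authors' data)
x = np.arange(0.0, 50.0, 1.0)
measured = np.exp(-((x - 25.0) / 10.0) ** 2)
calculated = np.exp(-((x - 25.3) / 10.0) ** 2)  # small 0.3 mm spatial shift

g = gamma_1d(x, measured, x, calculated)
pass_rate = 100.0 * (g <= 1.0).mean()
print(round(pass_rate, 1))
```

Clinical implementations interpolate the evaluated distribution and work in 2-D or 3-D, but the pass/fail logic is the same: a >98% rate means nearly all points satisfy gamma <= 1.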

  10. Declining metal levels at Foundry Cove (Hudson River, New York): Response to localized dredging of contaminated sediments

    International Nuclear Information System (INIS)

    Mackie, Joshua A.; Natali, Susan M.; Levinton, Jeffrey S.; Sanudo-Wilhelmy, Sergio A.

    2007-01-01

    This study examines the effectiveness of remediating a well-recognized case of heavy metal pollution at Foundry Cove (FC), Hudson River, New York. This tidal freshwater marsh was polluted with battery-factory wastes (1953-1979) and dredged in 1994-1995. Eight years after remediation, dissolved and particulate metals (Cd, Co, Cu, Pb, Ni, and Ag) were found to be lower than levels in the lower Hudson near New York City. Levels of metals (Co, Ni, Cd) on suspended particles were comparatively high. Concentrations of surface sediment Cd throughout the marsh system remain high, but have decreased both in the dredged and undredged areas: Cd was 2.4-230 mg/kg dw of sediment in 2005 vs. 109-1500 mg/kg in the same area in 1983. The rate of tidal export of Cd from FC has decreased by >300-fold, suggesting that dredging successfully stemmed a major source of Cd to the Hudson River. - Dredging of a hotspot of metal-contaminated sediment is associated with a recognizable local and river-wide decline in cadmium in the Hudson River, New York

  11. Sessile macro-epibiotic community of solitary ascidians, ecosystem engineers in soft substrates of Potter Cove, Antarctica

    Directory of Open Access Journals (Sweden)

    Clara Rimondino

    2015-01-01

    Full Text Available The muddy bottoms of inner Potter Cove, King George Island (Isla 25 de Mayo), South Shetlands, Antarctica, show a high density and richness of macrobenthic species, particularly ascidians. In other areas, ascidians have been reported to play the role of ecosystem engineers, as they support a significant number of epibionts, increasing benthic diversity. In this study, a total of 21 sessile macro-epibiotic taxa present on the ascidian species Corella antarctica Sluiter, 1905, Cnemidocarpa verrucosa (Lesson, 1830) and Molgula pedunculata Herdman, 1881 were identified, with Bryozoa being the most diverse group. The three ascidian species differed in richness, percent cover and diversity of sessile macro-epibionts. The morphological characteristics of the tunic surface, the area available for colonization (and its relation to the age of the basibiont individuals) and the pH of the ascidian tunic appear to explain the observed differences. Recent environmental changes in the study area (an increase in suspended particulate matter caused by glacier retreat) have been linked to observed shifts in benthic community structure, negatively affecting the abundance and distribution of the studied ascidian species. Given the diversity of sessile macro-epibionts found on these species, the impact of environmental shifts may be greater than estimated so far.

  12. Geothermal investment analysis with site-specific applications to Roosevelt Hot Springs and Cove Fort-Sulphurdale, Utah

    Energy Technology Data Exchange (ETDEWEB)

    Cassel, T.A.V.; Edelstein, R.H.; Blair, P.D.

    1978-12-01

    The analysis and modeling of investment behavior in the development of hydrothermal electric power facilities are reported. This investment behavior reflects a degree of sensitivity to public policy alternatives concerning taxation and regulation of the resource and its related energy conversion facilities. The objective of the current research is to provide a realistic and theoretically sound means for estimating the impacts of such public policy alternatives. A stochastic simulation model was developed which offers an efficient means for site-specific investment analysis of private sector firms and investors. The results of the first year of work are discussed including the identification, analysis, quantification and modeling of: a decision tree reflecting the sequence of procedures, timing and stochastic elements of hydrothermal resource development projects; investment requirements, expenses and revenues incurred in the exploration, development and utilization of hydrothermal resources for electric power generation; and multiattribute investment decision criteria of the several types of firms in the geothermal industry. An application of the investment model to specific resource sites in the state of Utah is also described. Site specific data for the Known Geothermal Resource Areas of Roosevelt Hot Springs and Cove Fort-Sulphurdale are given together with hypothesized generation capacity growth rates.
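
    The stochastic simulation of the development decision tree described above can be sketched as a Monte Carlo net-present-value calculation. All probabilities, costs and revenues below are illustrative placeholders, not the report's site-specific data for Roosevelt Hot Springs or Cove Fort-Sulphurdale:

```python
import random
random.seed(1)

def site_npv(n_trials=10000, p_discovery=0.7, discount=0.12):
    """Toy stochastic simulation of a geothermal development decision tree:
    exploration either fails (sunk cost) or leads to plant construction
    followed by 30 years of revenue. All figures are illustrative ($M)."""
    npvs = []
    for _ in range(n_trials):
        explore_cost = random.uniform(5, 15)       # wells, surveys (year 0)
        if random.random() > p_discovery:          # dry hole: abandon project
            npvs.append(-explore_cost)
            continue
        capex = random.uniform(80, 120)            # plant construction, year 1
        annual = random.uniform(18, 26)            # net revenue, years 2-31
        npv = -explore_cost - capex / (1 + discount)
        npv += sum(annual / (1 + discount) ** t for t in range(2, 32))
        npvs.append(npv)
    return sum(npvs) / len(npvs)

mean_npv = site_npv()   # expected value across exploration outcomes
```

    Policy alternatives such as taxation or regulatory delay would enter as extra parameters (e.g. a tax rate applied to `annual`, or a shifted revenue start year), which is how a model of this kind estimates policy impacts on investment attractiveness.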

  13. Geochronology of the Swift Current granite and host volcanic rocks of the Love Cove group, southwestern Avalon zone, Newfoundland

    International Nuclear Information System (INIS)

    Dallmeyer, R.D.; O'Driscoll, C.F.; Hussey, E.M.

    1981-01-01

    Zircon fractions from the variably deformed and metamorphosed Swift Current granite and host volcanic rocks of the Love Cove Group record individually discordant U-Pb ages with well-defined upper concordia intercept ages of 580 ± 20 and 590 ± 30 Ma, respectively. These are interpreted as crystallization dates and indicate a late Proterozoic comagmatic relationship. Primary hornblende from the pluton records disturbed 40Ar/39Ar age spectra that suggest postcrystallization argon loss, probably during Acadian (Devonian) regional metamorphism. 40Ar/39Ar plateau ages of 560-566 Ma are well defined for the hornblende and are interpreted to date times of postmagmatic cooling. The similarity between the zircon and hornblende dates suggests relatively rapid postmagmatic cooling. A six-point Rb-Sr whole-rock isochron age of 548 ± 11 Ma is defined for the pluton. The slight discordancy of this date in comparison with the zircon and hornblende ages may reflect a minor disturbance of whole-rock isotopic systems during Acadian regional metamorphism. (author)

  14. The role of CFD combustion modeling in hydrogen safety management-II: Validation based on homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: sathiah@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Haren, Steven van, E-mail: vanharen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Komen, Ed, E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Roekaerts, Dirk, E-mail: d.j.e.m.roekaerts@tudelft.nl [Department of Multi-Scale Physics, Delft University of Technology, P.O. Box 5, 2600 AA Delft (Netherlands)

    2012-11-15

    Highlights: • A CFD based method is proposed for the simulation of hydrogen deflagration. • A dynamic grid adaptation method is proposed to resolve the turbulent flame brush thickness. • The predictions obtained using this method are in good agreement with the static grid method. • TFC model results are in good agreement with large-scale homogeneous hydrogen-air experiments. - Abstract: During a severe accident in a PWR, large quantities of hydrogen can be generated and released into the containment. The generated hydrogen, when mixed with air, can lead to hydrogen combustion. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. Therefore, accurate prediction of these pressure loads is an important safety issue. In a previous article, we presented a CFD based method to determine these pressure loads. This CFD method is based on the application of a turbulent flame speed closure (TFC) combustion model. The validation analyses in our previous paper demonstrated that it is of utmost importance to apply successive mesh and time step refinement in order to obtain reliable results. In this article, we first determined to what extent the computational effort required for our CFD approach can be reduced by the application of adaptive mesh refinement, while maintaining the accuracy requirements. Experiments performed within a small fan-stirred explosion bomb were used for this purpose. It could be concluded that adaptive grid adaptation is a reliable and efficient method for use in hydrogen deflagration analyses. For the two-dimensional validation analyses, the application of dynamic grid adaptation reduced the required computational effort by about one order of magnitude. In a second step, the considered CFD approach including adaptive
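
    Turbulent flame speed closures of the kind referenced above are often written in the form of the Zimont correlation. This is a generic sketch of that class of model, with illustrative mixture properties that are not taken from the paper:

```python
import math

def zimont_turbulent_flame_speed(u_prime, s_l, l_t, alpha_u, A=0.52):
    """Zimont-type turbulent flame speed closure:
        U_t = A * u'^(3/4) * S_l^(1/2) * alpha_u^(-1/4) * l_t^(1/4)
    u_prime : turbulent velocity fluctuation [m/s]
    s_l     : laminar flame speed of the mixture [m/s]
    l_t     : integral turbulence length scale [m]
    alpha_u : thermal diffusivity of the unburnt mixture [m^2/s]
    A       : model constant (0.52 is a commonly quoted default)."""
    return A * u_prime**0.75 * s_l**0.5 * alpha_u**-0.25 * l_t**0.25

# illustrative values for a lean hydrogen-air mixture (assumed, not the paper's)
u_t = zimont_turbulent_flame_speed(u_prime=2.0, s_l=0.6, l_t=0.05,
                                   alpha_u=2.2e-5)
```

    In a CFD deflagration analysis, U_t sets the local rate of progress-variable consumption, which is why resolving the turbulent flame brush (the motivation for the grid adaptation above) directly affects the predicted pressure loads.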

  15. Our experience with the acceptance and dosimetric validation of Somatom Force dual head MDCT in the Royal Hospital, Oman

    International Nuclear Information System (INIS)

    Al-Harthi, Ruqaia; Al-Kalbani, Munira; Arun Kumar, L.S.; Al-Shanfari, Jamal

    2017-01-01

    Computed Tomography (CT) has revolutionized diagnostic imaging since its introduction in the early 1970s. In Oman, 70,353 CT examinations were carried out in 2015. The increase in CT examinations will eventually increase the population dose and the consequent risk of cancer in adults and particularly in children. Here, we discuss and share our experience with the acceptance and dosimetric validation of the second dual head Somatom Force MDCT installed in the Royal Hospital, Oman, using the Ministry of Health's radiation acceptance and quality assurance protocol, before handover for routine patient care

  16. Analysis of the impact of correlated benchmark experiments on the validation of codes for criticality safety analysis

    International Nuclear Information System (INIS)

    Bock, M.; Stuke, M.; Behler, M.

    2013-01-01

    The validation of a code for criticality safety analysis requires the recalculation of benchmark experiments. The selected benchmark experiments are chosen such that they have properties similar to the application case that has to be assessed. A common source of benchmark experiments is the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' (ICSBEP Handbook) compiled by the 'International Criticality Safety Benchmark Evaluation Project' (ICSBEP). In order to take full advantage of the information provided by the individual benchmark descriptions for the application case, the recommended procedure is to perform an uncertainty analysis, based on the uncertainties of experimental results included in most of the benchmark descriptions. Such analyses can be performed by means of the Monte Carlo sampling technique. The consideration of uncertainties is also being introduced in the supplementary sheet of DIN 25478 'Application of computer codes in the assessment of criticality safety'. However, for a correct treatment of uncertainties, taking into account only the individual uncertainties of the benchmark experiments is insufficient. In addition, correlations between benchmark experiments have to be handled correctly. For example, these correlations can arise when different cases of a benchmark experiment share the same components, such as fuel pins or fissile solutions. Thus, manufacturing tolerances of these components (e.g. the diameter of the fuel pellets) have to be considered in a consistent manner in all cases of the benchmark experiment. At the 2012 meeting of the Expert Group on 'Uncertainty Analysis for Criticality Safety Assessment' (UACSA) of the OECD/NEA, a benchmark proposal was outlined that aimed at determining the impact of benchmark correlations on the estimation of the computational bias of the neutron multiplication factor (keff). The analysis presented here is based on this proposal. (orig.)
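
    The mechanism by which shared components induce correlations can be sketched with a toy Monte Carlo sampling model. The sensitivities and uncertainty magnitudes below are illustrative, not values from the benchmark proposal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: two benchmark cases share the same batch of fuel pellets, so
# the pellet-diameter tolerance is sampled ONCE per Monte Carlo history and
# applied to both cases; case-specific uncertainties are sampled independently.
n = 20000
shared_diam = rng.normal(0.0, 1.0, n)   # shared manufacturing tolerance (std-normalized)
indep_1 = rng.normal(0.0, 1.0, n)       # case-specific uncertainty, case 1
indep_2 = rng.normal(0.0, 1.0, n)       # case-specific uncertainty, case 2

# linearized k-eff responses (pcm per standard deviation; illustrative)
k1 = 1.0000 + 1e-5 * (150 * shared_diam + 100 * indep_1)
k2 = 0.9990 + 1e-5 * (150 * shared_diam + 100 * indep_2)

# correlation induced purely by the shared tolerance:
# expected ~ 150^2 / (150^2 + 100^2) = 0.69
corr = np.corrcoef(k1, k2)[0, 1]
```

    Ignoring this correlation (i.e. sampling `shared_diam` independently per case) would understate the covariance of the recalculated keff values and thus distort the estimated computational bias.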

  17. How the IS has validated the building of the experiment chamber of the Megajoule Laser

    International Nuclear Information System (INIS)

    Anon.

    2005-01-01

    Imagine an aluminium sphere more than 11 meters in external diameter with a mass of about one hundred tons. This is the frame of the Megajoule laser experiment chamber, under construction at the CEA's CESTA site near Arcachon. The IS has been in charge of inspecting the design studies and the fabrication. (O.M.)

  18. Development of a structure-validated Sexual Dream Experience Questionnaire (SDEQ) in Chinese university students.

    Science.gov (United States)

    Chen, Wanzhen; Qin, Ke; Su, Weiwei; Zhao, Jialian; Zhu, Zhouyu; Fang, Xiangming; Wang, Wei

    2015-01-01

    Sexual dreams reflect waking-day life, social problems and ethical concerns. The related experience involves different people and settings and brings various feelings, but no systematic measure has been available to date. We developed a statement matrix measuring sexual dream experience and trialed it in a sample of 390 young Chinese university students who had experienced sexual dreams. After both exploratory and confirmatory factor analyses, we established a satisfactory four-factor model (32 items). Together with an item measuring sexual dream frequency, we developed the Sexual Dream Experience Questionnaire (SDEQ) based on the 32 items, and named the four factors (scales) joyfulness, aversion, familiarity and bizarreness. No gender differences were found on the four scale scores, and no correlations were found between the four scales and sexual dream frequency or sexual experience in real life. The SDEQ may help characterize sexual dreams in healthy people and psychiatric patients. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. ELECTRO-THERMAL AND MECHANICAL VALIDATION EXPERIMENT ON THE LHC MAIN BUSBAR SPLICE CONSOLIDATION

    CERN Document Server

    Willering, GP; Bourcey, N; Bottura, L; Charrondiere, M; Cerqueira Bastos, M; Deferne, G; Dib, G; Giloux, Chr; Grand-Clement, L; Heck, S; Hudson, G; Kudryavtsev, D; Perret, P; Pozzobon, M; Prin, H; Scheuerlein, Chr; Rijllart, A; Triquet, S; Verweij, AP

    2012-01-01

    To eliminate the risk of thermal runaway in LHC interconnections, a consolidation placing shunts on the main busbar interconnections was proposed by the Task Force Splices Consolidation. To validate the design, two spare SSS magnets were placed on a test bench in SM-18 to measure the interconnection between them under conditions as close as possible to those in the LHC. Two dipole interconnections were instrumented and prepared with worst-case conditions to study the thermo-electric stability limits. Two quadrupole interconnections were instrumented and prepared to study the effect of current cycling on the mechanical stability of the consolidation design. All four shunted interconnections showed very stable behaviour, well beyond the LHC design current cycle.

  20. Compression instrument for tissue experiments (cite) at the meso-scale: device validation - biomed 2011.

    Science.gov (United States)

    Evans, Douglas W; Rajagopalan, Padma; Devita, Raffaella; Sparks, Jessica L

    2011-01-01

    Liver sinusoidal endothelial cells (LSECs) are the primary site of numerous transport and exchange processes essential for liver function. LSECs rest on a sparse extracellular matrix layer housed in the space of Disse, a 0.5-1 µm space separating LSECs from hepatocytes. To develop bioengineered liver tissue constructs, it is important to understand the mechanical interactions among LSECs, hepatocytes, and the extracellular matrix in the space of Disse. Currently the mechanical properties of the space of Disse matrix are not well understood. The objective of this study was to develop and validate a device for performing mechanical tests at the meso-scale (100 nm-100 µm), to enable novel matrix characterization within the space of Disse. The device uses a glass micro-spherical indenter attached to a cantilever made from a fiber-optic cable. A 3-axis translation table is used to bring the specimen into contact with the indenter and deform the cantilever. A position detector monitors the location of a laser passing through the cantilever, allowing subsequent tissue deformation to be calculated. The design allows micro-newton and nano-newton stress-strain tissue behavior to be quantified. To validate the device's accuracy, 11 samples of silicone rubber in two formulations were tested to experimentally confirm their Young's moduli. Prior macroscopic unconfined compression tests determined the formulations EcoFlex030 (n=6) and EcoFlex010 (n=5) to possess Young's moduli of 92.67 ± 6.22 and 43.10 ± 3.29 kPa, respectively. Optical measurements taken using CITE's position control and fiber-optic cantilever found the moduli to be 106.4 kPa and 47.82 kPa.
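
    The measurement chain described above (cantilever deflection to force, then indentation depth to modulus) can be sketched with a Hertzian contact analysis for a rigid sphere on an elastic half-space. The cantilever stiffness, deflection and indentation values below are hypothetical, chosen only to land in the soft-elastomer range:

```python
def hertz_modulus(force_N, depth_m, radius_m, poisson=0.5):
    """Estimate Young's modulus from spherical indentation via Hertz
    contact (rigid sphere on an elastic half-space):
        F = (4/3) * E/(1 - nu^2) * sqrt(R) * h^(3/2)
    Solved for E; poisson=0.5 assumes a nearly incompressible elastomer."""
    e_star = 3.0 * force_N / (4.0 * radius_m**0.5 * depth_m**1.5)
    return e_star * (1.0 - poisson**2)

# the cantilever converts measured deflection into force: F = k * deflection
k = 0.8               # cantilever stiffness [N/m] (assumed)
deflection = 50e-6    # laser-tracked cantilever deflection [m] (assumed)
F = k * deflection    # -> 40 micronewtons

E = hertz_modulus(F, depth_m=20e-6, radius_m=150e-6)
E_kPa = E / 1e3       # modulus in kPa, comparable to soft silicone rubbers
```

    The micro-newton force scale and micron-scale depths are what make this a meso-scale instrument: the same formula with these inputs resolves moduli in the tens of kPa, the range of the EcoFlex validation samples.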

  1. The WATERMED field experiment: validation of the AATSR LST product with in situ measurements

    Science.gov (United States)

    Noyes, E.; Soria, G.; Sobrino, J.; Remedios, J.; Llewellyn-Jones, D.; Corlett, G.

    The Advanced Along-Track Scanning Radiometer (AATSR) onboard ESA's Envisat satellite is the third in a series of precision radiometers designed to measure Sea Surface Temperature (SST) with accuracies of better than ±0.3 K (1-sigma). Since its launch in March 2002, a prototype AATSR Land Surface Temperature (LST) product has been produced for validation purposes only, with the product becoming operational from mid-2004. The (A)ATSR instrument design is unique in that it has both a nadir and a forward view, allowing the Earth's surface to be viewed along two different atmospheric path lengths, thus enabling an improved atmospheric correction when retrieving surface temperature. It also uses an innovative and exceptionally stable on-board calibration system for its infrared channels, which, together with actively cooled detectors, gives extremely high radiometric sensitivity and precision. In this presentation, results from a comparison of the prototype LST product with ground-based measurements obtained at the WATERMED (WATer use Efficiency in natural vegetation and agricultural areas by Remote sensing in the MEDiterranean basin) field site near Marrakech, Morocco, are presented. The comparison shows that the AATSR has a positive bias of +1.5 K, with a standard deviation of 0.7 K, indicating that the product is operating within the target specification (±2.5 K) over the WATERMED field site. However, several anomalous validation points were observed during the analysis and we will discuss possible reasons for the occurrence of these data, including their coincidence with the presence of an Envisat blanking pulse (indicating the presence of a radar pulse at the time of AATSR pixel integration). Further investigation into this matter is required as previous investigations have always indicated that the presence of a payload radar pulse does not have any effect on (A)ATSR data quality.

  2. 78 FR 13376 - Draft Environmental Impact Statement for the Cottonwood Cove and Katherine Landing Development...

    Science.gov (United States)

    2013-02-27

    ..., enhance the visitor experience, and mitigate flood hazards. The lake management plan established water... Current Management Trends (no action alternative) reflects current management direction and serves as a... flood mitigation. Alternative 3 Enhance Visitor Experience and Park Operations (agency-preferred...

  3. The correlation of in vivo and ex vivo tissue dielectric properties to validate electromagnetic breast imaging: initial clinical experience

    International Nuclear Information System (INIS)

    Halter, Ryan J; Zhou, Tian; Meaney, Paul M; Hartov, Alex; Barth, Richard J Jr; Rosenkranz, Kari M; Wells, Wendy A; Kogel, Christine A; Borsic, Andrea; Rizzo, Elizabeth J; Paulsen, Keith D

    2009-01-01

    Electromagnetic (EM) breast imaging provides a low-cost, safe, and potentially more specific modality for cancer detection than conventional imaging systems. A primary difficulty in validating these EM imaging modalities is that the true dielectric property values of the particular breast being imaged are not readily available on an individual subject basis. Here, we describe our initial experience in seeking to correlate tomographic EM imaging studies with discrete point spectroscopy measurements of the dielectric properties of breast tissue. The protocol we have developed involves measurement of in vivo tissue properties during partial and full mastectomy procedures in the operating room (OR), followed by ex vivo tissue property recordings at the same locations in the excised tissue specimens in the pathology laboratory immediately after resection. We have successfully applied all of the elements of this validation protocol in a series of six women with cancer diagnoses. Conductivity and permittivity gauged from ex vivo samples over the frequency range 100 Hz–8.5 GHz are found to be similar to those reported in the literature. A decrease in both conductivity and permittivity is observed when these properties are gauged from ex vivo samples instead of in vivo. We present these results in addition to a case study demonstrating how discrete point spectroscopy measurements of the tissue can be correlated and used to validate EM imaging studies

  4. Validating Bayesian truth serum in large-scale online human experiments.

    Science.gov (United States)

    Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature on BTS focuses only on small experiments. Combined with the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be shown effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers.
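
    The BTS scoring rule (Prelec, 2004) combines an information score, which rewards answers that are "surprisingly common" relative to the crowd's predictions, with a prediction score for how well each respondent forecasts others' answers. A minimal sketch, with toy data rather than the paper's survey results:

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0):
    """Bayesian truth serum: each respondent i supplies an answer (an index
    into the k choices) plus a predicted distribution y_i over others'
    answers.  Score_i = log(x_bar[a_i] / y_geo[a_i])            (information)
                      + alpha * sum_k x_bar[k]*log(y_i[k]/x_bar[k])  (prediction)
    where x_bar is the empirical answer frequency and y_geo the geometric
    mean of all respondents' predictions."""
    answers = np.asarray(answers)
    predictions = np.asarray(predictions, dtype=float)   # shape (n, k)
    n, k = predictions.shape
    x_bar = np.bincount(answers, minlength=k) / n        # empirical frequencies
    y_geo = np.exp(np.log(predictions).mean(axis=0))     # geometric mean per choice
    info = np.log(x_bar[answers] / y_geo[answers])
    pred = alpha * (x_bar * np.log(predictions / x_bar)).sum(axis=1)
    return info + pred

# toy 2-choice survey: three respondents pick choice 0, one picks choice 1
answers = [0, 0, 1, 0]
preds = [[0.7, 0.3], [0.6, 0.4], [0.5, 0.5], [0.8, 0.2]]
scores = bts_scores(answers, preds)
```

    The information score is what makes large samples matter: `x_bar` and `y_geo` are population estimates, so the method's guarantees hold in expectation over many respondents, which motivates the large-scale validation described above.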

  5. Calculation of LWR kinetic parameter βeff: validation on the MISTRAL experiments

    International Nuclear Information System (INIS)

    Santamarina, Alain; Erradi, Lahoussine

    2011-01-01

    This work presents the analysis of the MISTRAL experiments on the determination of the effective delayed neutron fraction βeff for UOX and MOX Light Water Reactor cores using the APOLLO2.8 code and the JEFF-3.1.1 nuclear data library. The objective is to check whether the new 8-time-group delayed neutron data in the JEFF-3 library (instead of the classical 6 groups) reduces the calculation-minus-experiment (C/E) discrepancy observed when using ENDF/B-VII or the previous JEF-2 library. Our analysis has shown that the C/E bias is reduced from +2.8% to +0.8% ± 1.6% for the UOX cores and from +0.8% to +0.2% ± 1.6% for the MOX cores. (author)

  6. Validating Bayesian truth serum in large-scale online human experiments

    OpenAIRE

    Pickard, Galen; Frank, Morgan Ryan; Cebrian, Manuel; Rahwan, Iyad

    2016-01-01

    This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice survey, but, despite the method's mathematical reliance on large sample sizes, existing literature about BTS only focuses on small experiments....

  7. Two-phase CFD PTS validation in an extended range of thermohydraulics conditions covered by the COSI experiment

    International Nuclear Information System (INIS)

    Coste, P.; Ortolan, A.

    2014-01-01

    Highlights: • Models for large interfaces in two-phase CFD were developed for PTS. • The COSI experiment is used for NEPTUNE_CFD integral validation. • COSI is a PWR cold leg scaled 1/100 for volume. • Fifty runs are calculated, covering a large range of flow configurations. • The CFD predictive capability is analysed using global and local measurements. - Abstract: In the context of the Pressurized Water Reactors (PWR) life duration safety studies, models were developed to address the Pressurized Thermal Shock (PTS) from the two-phase CFD angle, dealing with interfaces much larger than the cell size and with direct contact condensation. These models were implemented in NEPTUNE_CFD, a 3D transient Eulerian two-fluid model. The COSI experiment is used for its integral validation. It represents a cold leg scaled 1/100 for volume and power from a 900 MW PWR under a large range of LOCA PTS conditions. In this study, the CFD is evaluated over the whole range of parameters and flow configurations covered by the experiment. In a first step, a single choice of mesh and CFD model parameters is fixed and justified. In a second step, fifty runs are calculated. The CFD predictive capability is analysed by comparing the liquid temperature and the total condensation rate with the experiment, discussing their dependence on the inlet cold liquid rate, on the liquid level in the cold leg and on the difference between co-current and counter-current runs. It is shown that NEPTUNE_CFD 1.0.8 calculates with fair agreement a large range of flow configurations related to ECCS injection and steam condensation

  8. In Situ Experiment and Numerical Model Validation of a Borehole Heat Exchanger in Shallow Hard Crystalline Rock

    Directory of Open Access Journals (Sweden)

    Mateusz Janiszewski

    2018-04-01

    Full Text Available Accurate and fast numerical modelling of the borehole heat exchanger (BHE is required for simulation of long-term thermal energy storage in rocks using boreholes. The goal of this study was to conduct an in situ experiment to validate the proposed numerical modelling approach. In the experiment, hot water was circulated for 21 days through a single U-tube BHE installed in an underground research tunnel located at a shallow depth in crystalline rock. The results of the simulations using the proposed model were validated against the measurements. The numerical model simulated the BHE’s behaviour accurately and compared well with two other modelling approaches from the literature. The model is capable of replicating the complex geometrical arrangement of the BHE and is considered to be more appropriate for simulations of BHE systems with complex geometries. The results of the sensitivity analysis of the proposed model have shown that low thermal conductivity, high density, and high heat capacity of rock are essential for maximising the storage efficiency of a borehole thermal energy storage system. Other characteristics of BHEs, such as a high thermal conductivity of the grout, a large radius of the pipe, and a large distance between the pipes, are also preferred for maximising efficiency.
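
    For context on what such BHE models predict, the classical infinite line-source approximation gives the ground temperature rise around a borehole under constant heat injection. The heat rate, rock properties and borehole radius below are illustrative assumptions, not the experiment's values:

```python
import math

def line_source_temp_rise(q_w_per_m, t_s, r_m, lam=3.0, rho_c=2.3e6):
    """Infinite line-source approximation for the temperature rise around a
    BHE (valid for t >> r^2/alpha):
        dT ~= q/(4*pi*lam) * (ln(4*alpha*t/r^2) - gamma)
    q_w_per_m : heat injection rate per unit borehole length [W/m]
    lam       : rock thermal conductivity [W/(m K)]
    rho_c     : volumetric heat capacity of the rock [J/(m^3 K)]"""
    alpha = lam / rho_c                  # thermal diffusivity [m^2/s]
    gamma = 0.5772156649                 # Euler-Mascheroni constant
    return q_w_per_m / (4 * math.pi * lam) * (
        math.log(4 * alpha * t_s / r_m**2) - gamma)

# 21-day injection (the experiment's duration) at an assumed 50 W/m,
# evaluated at an assumed borehole-wall radius of 57.5 mm
dT = line_source_temp_rise(50.0, t_s=21 * 24 * 3600.0, r_m=0.0575)
```

    A full numerical model like the one validated above goes beyond this analytic form precisely where the abstract says it matters: the complex pipe geometry inside the borehole and the grout properties, which the line source lumps into a single radius.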

  9. The benchmark experiment on slab beryllium with D–T neutrons for validation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nie, Y., E-mail: nieyb@ciae.ac.cn [Science and Technology on Nuclear Data Laboratory, China Institute of Atomic Energy, Beijing 102413 (China); Ren, J.; Ruan, X.; Bao, J. [Science and Technology on Nuclear Data Laboratory, China Institute of Atomic Energy, Beijing 102413 (China); Han, R. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Zhang, S. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Inner Mongolia University for the Nationalities, Inner Mongolia, Tongliao 028000 (China); Huang, H.; Li, X. [Science and Technology on Nuclear Data Laboratory, China Institute of Atomic Energy, Beijing 102413 (China); Ding, Y. [Science and Technology on Nuclear Data Laboratory, China Institute of Atomic Energy, Beijing 102413 (China); School of Nuclear Science and Technology, Lanzhou University, Lanzhou 730000 (China); Wu, H.; Liu, P.; Zhou, Z. [Science and Technology on Nuclear Data Laboratory, China Institute of Atomic Energy, Beijing 102413 (China)

    2016-04-15

    Highlights: • Evaluated data for beryllium are validated by a high precision benchmark experiment. • Leakage neutron spectra from a pure beryllium slab are measured at 61° and 121° using the time-of-flight method. • The experimental results are compared with MCNP-4B calculations using the evaluated data from different libraries. - Abstract: Beryllium is the most favored neutron multiplier candidate for solid breeder blankets of future fusion power reactors. However, beryllium nuclear data are presented differently in modern nuclear data evaluations. In order to validate the evaluated nuclear data on beryllium, in the present study, a benchmark experiment has been performed at the China Institute of Atomic Energy (CIAE). Neutron leakage spectra from pure beryllium slab samples were measured at 61° and 121° using the time-of-flight method. The experimental results were compared with those calculated by MCNP-4B simulation, using the evaluated data of beryllium from the CENDL-3.1, ENDF/B-VII.1 and JENDL-4.0 libraries. From the comparison between the measured and the calculated spectra, it was found that the calculation results based on CENDL-3.1 caused overestimation in the energy range from about 3–12 MeV at 61°, while at 121°, all the libraries led to underestimation below 3 MeV.
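
    The time-of-flight method used above infers neutron energy from the arrival time over a known flight path. A minimal relativistic conversion, with an assumed 5 m flight path for illustration (the experiment's actual geometry is not given in the abstract):

```python
import math

def tof_to_energy_mev(flight_path_m, tof_ns):
    """Convert neutron time-of-flight to kinetic energy (relativistic):
        beta = (L/t)/c,  E = (gamma - 1) * m_n c^2"""
    MN_C2 = 939.565       # neutron rest energy [MeV]
    C = 299792458.0       # speed of light [m/s]
    beta = flight_path_m / (tof_ns * 1e-9) / C
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return (gamma - 1.0) * MN_C2

# a 14.1 MeV D-T source neutron covers an assumed 5 m flight path in ~97 ns
e = tof_to_energy_mev(5.0, 97.0)
```

    Binning detector events by arrival time and applying this conversion yields the leakage spectrum that is then compared channel by channel with the MCNP-4B calculations.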

  10. The benchmark experiment on slab beryllium with D–T neutrons for validation of evaluated nuclear data

    International Nuclear Information System (INIS)

    Nie, Y.; Ren, J.; Ruan, X.; Bao, J.; Han, R.; Zhang, S.; Huang, H.; Li, X.; Ding, Y.; Wu, H.; Liu, P.; Zhou, Z.

    2016-01-01

    Highlights: • Evaluated data for beryllium are validated by a high precision benchmark experiment. • Leakage neutron spectra from a pure beryllium slab are measured at 61° and 121° using the time-of-flight method. • The experimental results are compared with MCNP-4B calculations using the evaluated data from different libraries. - Abstract: Beryllium is the most favored neutron multiplier candidate for solid breeder blankets of future fusion power reactors. However, beryllium nuclear data are presented differently in modern nuclear data evaluations. In order to validate the evaluated nuclear data on beryllium, in the present study, a benchmark experiment has been performed at the China Institute of Atomic Energy (CIAE). Neutron leakage spectra from pure beryllium slab samples were measured at 61° and 121° using the time-of-flight method. The experimental results were compared with those calculated by MCNP-4B simulation, using the evaluated data of beryllium from the CENDL-3.1, ENDF/B-VII.1 and JENDL-4.0 libraries. From the comparison between the measured and the calculated spectra, it was found that the calculation results based on CENDL-3.1 caused overestimation in the energy range from about 3–12 MeV at 61°, while at 121°, all the libraries led to underestimation below 3 MeV.

  11. SAS4A and FPIN2X validation for slow ramp TOP accidents: experiments TS-1 and TS-2

    International Nuclear Information System (INIS)

    Hill, D.J.

    1986-01-01

    The purpose of this paper is to present further results in the series of experimental analyses being performed using SAS4A and FPIN2X in order to provide a systematic validation of these codes. The two experiments discussed here, TS-1 and TS-2, were performed by Westinghouse Hanford/Hanford Engineering Development Laboratory (WHC/HEDL) in the Transient Reactor Test (TREAT) Facility. They were slow ramp transient overpowers (TOPs) of ∼ 5 cent/s equivalent Fast Flux Test Facility (FFTF) ramp rate, single-pin experiments in flowing sodium loops. The good agreement found here adds significantly to the experimental data base that provides the foundation for SAS4A and FPIN2X validation. It also shows that prefailure internal fuel motion is a phenomenon that has to be correctly accounted for, not only as a potential inherent safety mechanism, but also before any accurate prediction of fuel failure and subsequent fuel motion and the associated reactivity effects can be made. This is also true for metal-fueled pins. This capability is provided by PINACLE, which is being incorporated into SAS4A

  12. Instrumented anvil-on-rod impact experiments for validating constitutive strength model for simulating transient dynamic deformation response of metals

    International Nuclear Information System (INIS)

    Martin, M.; Shen, T.; Thadhani, N.N.

    2008-01-01

    Instrumented anvil-on-rod impact experiments were performed to assess the applicability of this approach for validating a constitutive strength model for dynamic, transient-state deformation and elastic-plastic wave interactions in vanadium, 21-6-9 stainless steel, titanium, and Ti-6Al-4V. In addition to soft-catching the impacted rod-shaped samples, their transient deformation states were captured by high-speed imaging, and velocity interferometry was used to record the sample back (free) surface velocity and monitor elastic-plastic wave interactions. Simulations using the AUTODYN-2D hydrocode with the Steinberg-Guinan constitutive equation were used to generate simulated free-surface velocity traces and final/transient deformation profiles for comparison with experiments. The simulations were observed to under-predict the radial strain for bcc vanadium and fcc steel, but over-predict the radial strain for hcp titanium and Ti-6Al-4V. The correlations illustrate the applicability of the instrumented anvil-on-rod impact test as a method for providing robust model validation based on the entire deformation event, and not just the final deformed state
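
    The Steinberg-Guinan model named above couples strain hardening to the pressure and temperature dependence of the shear modulus. A generic sketch of the rate-independent form follows; every coefficient here is an order-of-magnitude placeholder, NOT the paper's fitted values for any of the four metals:

```python
def steinberg_guinan_flow_stress(eps_p, p_gpa, T_k,
                                 Y0=0.34, Ymax=2.5, beta=5.0, n=0.10,
                                 Gp_over_G0=0.02, GT_over_G0=-6.0e-4,
                                 eta=1.0):
    """Steinberg-Guinan rate-independent flow stress [GPa]:
        Y = min(Y0*(1 + beta*eps_p)^n, Ymax) * G(p, T)/G0
        G/G0 = 1 + (G'_p/G0)*p/eta^(1/3) + (G'_T/G0)*(T - 300)
    eps_p : equivalent plastic strain
    p_gpa : pressure [GPa];  T_k : temperature [K]
    eta   : compression rho/rho0.  All coefficients are illustrative."""
    hard = min(Y0 * (1.0 + beta * eps_p) ** n, Ymax)   # work-hardening term
    g_ratio = (1.0 + Gp_over_G0 * p_gpa / eta ** (1.0 / 3.0)
               + GT_over_G0 * (T_k - 300.0))           # shear-modulus scaling
    return hard * g_ratio

# flow stress after 20% plastic strain at 5 GPa and 400 K (illustrative)
y = steinberg_guinan_flow_stress(eps_p=0.2, p_gpa=5.0, T_k=400.0)
```

    Because pressure stiffens and temperature softens the flow stress, a hydrocode evaluating this form cell by cell reproduces the transient footprint and radius evolution that the high-speed images and free-surface velocity traces are compared against.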

  13. External validation of the Cardiff model of information sharing to reduce community violence: natural experiment.

    Science.gov (United States)

    Boyle, Adrian A; Snelling, Katrina; White, Laura; Ariel, Barak; Ashelford, Lawrence

    2013-12-01

    Community violence is a substantial problem for the NHS. Sharing of emergency department data with community safety partnerships (CSPs) has been associated with substantial reductions in assault attendances at emergency departments supported by academic institutions. We sought to validate these findings in a setting not supported by a public health or academic structure. We instituted anonymous data sharing with the police to reduce community violence, and increased involvement with the local CSP. We measured the effectiveness of this approach with routinely collected data from the emergency department and the police. We used police data from 2009 and emergency department data from 2000. Initially, the number of assault patients requiring emergency department treatment rose after we initiated data sharing. After improving the data flows, the number of assault patients fell back to the pre-data-sharing level. There was no change in the number of hospital admissions during the study period. There were decreases in the numbers of violent crimes against the person, with and without injury, recorded by the police. We have successfully implemented data sharing in our institution without the support of an academic institution. This has been associated with reductions in violent crime, but it is not clear whether this association is causal.

  14. R6 validation exercise: through thickness residual stress measurements on an experimental test vessel ring

    International Nuclear Information System (INIS)

    Mitchell, D.H.

    1988-06-01

    A series of bursting tests on thick-walled pressure vessels has been carried out as part of a validation exercise for the CEGB R6 failure assessment procedure. The objective of these tests was to examine the behaviour of typical PWR primary vessel material subject to residual stresses in addition to primary loading, with particular reference to the R6 assessment procedure. To this end, a semi-elliptic part-through defect was sited in the vessel longitudinal seam, which was a submerged-arc weld in the non-stress-relieved condition; the vessel was then pressure tested to failure. Prior to the final assembly of this vessel, a ring of material was cut from it to act as a test-piece on which a residual stress survey could be made. Surface measurements using the centre-hole technique were made by CERL personnel, and this was followed by two through-thickness measurements at BNL using the deep-hole technique. This paper describes these deep-hole measurements and presents their results. (author)

  15. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments

    Science.gov (United States)

    Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and in their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml⁻¹. The second proposed spectrophotometric method is based on the charge transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml⁻¹. The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetyl acetone to form a yellow colored product (the Hantzsch reaction), measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml⁻¹. All the variables were studied to optimize the reaction conditions using factorial design. The developed methods were validated and proved to be specific and accurate for quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.
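A method like those above is used quantitatively by fitting a straight calibration line (Beer's law: response linear in concentration) through standards and inverting it for unknowns. The sketch below illustrates that arithmetic only; the response values are synthetic and the 100-600 μg/ml span simply mirrors the range reported for the first method.

```python
import numpy as np

# Illustrative Beer's-law calibration: response = slope*concentration + intercept.
# Concentrations span the 100-600 ug/ml range reported for vildagliptin;
# the response values are synthetic, for illustration only.
conc = np.array([100, 200, 300, 400, 500, 600], dtype=float)  # ug/ml
resp = 0.0012 * conc + 0.005                                   # synthetic readings

# least-squares line through the standards (highest-degree coefficient first)
slope, intercept = np.polyfit(conc, resp, 1)

def concentration(reading):
    """Invert the calibration line to estimate an unknown concentration."""
    return (reading - intercept) / slope

print(round(concentration(0.365), 1))  # ug/ml, inside the calibrated range
```

Linearity over the stated range is what "Beer's law was obeyed" asserts; outside that range the inversion is not valid.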

  16. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Giron, Nicholas Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    We are developing computational models to help understand the manufacturing processes, final properties, and aging of structural polyurethane PMDI foam. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first report of this series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material and that, upon temperature elevation above 150°C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.

  17. Validation of the U-238 inelastic scattering neutron cross section through the EXCALIBUR dedicated experiment

    OpenAIRE

    Leconte Pierre; Bernard David

    2017-01-01

    EXCALIBUR is an integral transmission experiment based on the fast neutron source produced by the bare, highly enriched fast burst reactor CALIBAN, located at CEA/DAM Valduc (France). Two experimental campaigns have been performed, one using a sphere of 17 cm diameter and one using two cylinders of 17 cm diameter and 9 cm height, both made of metallic uranium-238. A set of 15 different dosimeters with specific threshold energies have been employed to provide information on the neutron flux attenua...

  18. Validation of Cross Sections with Criticality Experiment and Reaction Rates: the Neptunium Case

    CERN Document Server

    Leong, L S; Audouin, L; Berthier, B; Le Naour, C; Stéphan, C; Paradela, C; Tarrío, D; Duran, I

    2014-01-01

    The Np-237 neutron-induced fission cross section has been recently measured over a large energy range (from eV to GeV) at the n_TOF facility at CERN. When compared to previous measurements, the n_TOF fission cross section appears to be higher by 5-7% beyond the fission threshold. To check the relevance of the n_TOF data, we considered a criticality experiment performed at Los Alamos with a 6 kg sphere of Np-237, surrounded by uranium highly enriched in U-235 so as to approach criticality with fast neutrons. The multiplication factor k(eff) of the calculation is in better agreement with the experiment when we replace the ENDF/B-VII.0 evaluation of the Np-237 fission cross section by the n_TOF data. We also explored the hypothesis of deficiencies in the inelastic cross section of U-235, which has been invoked by some authors to explain the deviation of 750 pcm. The large modification needed to reduce the deviation seems to be incompatible with existing inelastic cross section measurements. Also we show that t...

  19. Experiments to validate the assumptions on Pu release in an aircraft crash

    International Nuclear Information System (INIS)

    Seehars, H.D.; Hochrainer, D.

    1983-01-01

    This report describes simulation experiments using CeO2 powder as a substitute to study the release and dispersion of PuO2 powder induced by kerosene fires after an aeroplane crash on a plutonium fuel element processing plant. The release rates of CeO2 powder were found to be a nonlinear function of the kerosene combustion rate. The release rates during a ''micro-scale'' fire inside the glovebox (pool area of some 20 cm²) were characterized by values of less than 10 μg/s, and those during a conflagration (pool area of some 200 m²) by values of somewhat more than 25 mg/s. For lack of other weather conditions, the dispersion experiments were carried out exclusively during weak to moderate winds. The maximum inhalation hazards from production-grade PuO2 powder in a small-scale fire substantially exceeded those of large-scale conflagrations. Evidently, the activity intake by inhalation exceeded to some extent the admissible threshold of annual activity intake. (orig.) [de]

  20. Intercomparison and validation of operational coastal-scale models, the experience of the project MOMAR.

    Science.gov (United States)

    Brandini, C.; Coudray, S.; Taddei, S.; Fattorini, M.; Costanza, L.; Lapucci, C.; Poulain, P.; Gerin, R.; Ortolani, A.; Gozzini, B.

    2012-04-01

    The need for regional governments to implement operational systems for the sustainable management of coastal waters, in order to meet the requirements imposed by legislation (e.g. EU directives such as WFD, MSFD, BD and relevant national legislation), often leads to the implementation of coastal measurement networks and to the construction of computational models that surround and describe parts of regional seas without falling within the classic definition of regional/coastal models. Although these operational models may be structured to cover parts of different oceanographic basins, they can have considerable advantages and highlight relevant issues, such as the role of narrow channels, straits and islands in coastal circulation, both in physical and biogeochemical processes and in the exchanges of water masses among basins. Two models of this type were made in the context of the cross-border European project MOMAR: an operational model of the Tuscan Archipelago sea and one of the Corsica coastal waters, both located between the Tyrrhenian and the Algerian-Ligurian-Provençal basins. Although these two models were based on different computer codes (MARS3D and ROMS), they have several elements in common, such as a 400 m resolution, boundary conditions from the same "father" model, and an important area of overlap, the Corsica channel, which has a key role in the exchange of water masses between the two oceanographic basins. In this work we present the results of the comparison of these two ocean forecasting systems in response to different weather and oceanographic forcing. In particular, we discuss aspects related to the validation of the two systems and a systematic comparison of the forecasts/hindcasts based on these hydrodynamic models, both against operational models available at larger scale and against in-situ measurements made by fixed or mobile platforms. 
In this context we will also present the results of two oceanographic cruises in the
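Intercomparison and validation of forecasting systems like the two above is commonly summarized by skill statistics of model output against co-located in-situ observations, e.g. bias and RMSE. A minimal sketch with hypothetical numbers (not MOMAR data):

```python
import math

# Hypothetical co-located pairs: observed vs modelled sea-surface temperature [degC].
obs   = [15.2, 15.6, 16.0, 15.8, 15.1]
model = [15.0, 15.9, 16.2, 15.6, 15.3]

n = len(obs)
# mean error (bias) and root-mean-square error of the model against observations
bias = sum(m - o for m, o in zip(model, obs)) / n
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
print(round(bias, 3), round(rmse, 3))
```

The same statistics can be computed between two overlapping models (e.g. over the Corsica channel) to quantify their mutual consistency.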

  1. Design and validation of a slender guideway for Maglev vehicle by simulation and experiment

    Science.gov (United States)

    Han, Jong-Boo; Han, Hyung-Suk; Kim, Sung-Soo; Yang, Seok-Jo; Kim, Ki-Jung

    2016-03-01

    Normally, Maglev (magnetic levitation) vehicles run on elevated guideways. The elevated guideway must satisfy various load conditions of the vehicle and has to be designed to ensure ride quality, while ensuring that the levitation stability of the vehicle is not affected by the deflection of the guideway. However, because the elevated guideways of Maglev vehicles fabricated so far in South Korea and other countries have been based on over-conservative design criteria, the size of the structures has increased. Further, from the cost perspective, they are unfavourable compared with other light rail transits such as monorail, rubber-wheel, and steel-wheel automatic guided transit. Therefore, a slender guideway that does not adversely affect the levitation stability of the vehicle is required, through optimisation of the design criteria. In this study, to predict the effect of various design parameters of the guideway on the dynamic behaviour of the vehicle, simulations were carried out using a dynamics model similar to the actual vehicle and guideway, and a limiting value of the deflection ratio of the slender guideway that ensures levitation control is proposed. A guideway that meets the proposed deflection-ratio limit was designed and fabricated, and the validity of the slender guideway was verified through a driving test of the vehicle. The results confirmed that although some increase in airgap and cabin acceleration was observed with the proposed slender guideway compared with the conventional guideway, there was no notable adverse effect on the levitation stability and ride quality of the vehicle. Therefore, the results of this study can become the basis for establishing design criteria for slender guideways of future Maglev vehicles.
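The proposed criterion above is a limiting deflection ratio (midspan deflection divided by span). Checking a candidate girder against such a limit is simple arithmetic; all numbers below, including the limit itself, are hypothetical and not the values from the study:

```python
# Hypothetical slender-guideway deflection check.
span_m = 25.0             # girder span [m]
max_deflection_m = 0.018  # computed midspan deflection under vehicle load [m]
limit_ratio = 1 / 1300    # hypothetical allowable deflection ratio

ratio = max_deflection_m / span_m          # actual deflection ratio
ok = ratio <= limit_ratio                  # passes the levitation-control limit?
print(f"deflection ratio 1/{span_m / max_deflection_m:.0f}, within limit: {ok}")
```

A smaller ratio (larger denominator) means a stiffer guideway; the study's contribution is justifying a less conservative limit than earlier designs used.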

  2. Validation of Friction Models in MARS-MultiD Module with Two-Phase Cross Flow Experiment

    International Nuclear Information System (INIS)

    Choi, Chi-Jin; Yang, Jin-Hwa; Cho, Hyoung-Kyu; Park, Goon-Cher; Euh, Dong-Jin

    2015-01-01

    In the downcomer of the Advanced Power Reactor 1400 (APR1400), which has direct vessel injection (DVI) lines as an emergency core cooling system, multidimensional two-phase flow may occur during a Loss-of-Coolant Accident (LOCA). Accurate prediction of this flow is highly relevant to evaluating the integrity of the reactor core. For this reason, Yang performed an experiment to investigate the two-dimensional film flow that simulates the two-phase cross flow in the upper downcomer, and obtained local liquid film velocity and thickness data. These data make it possible to validate the multidimensional modules of system analysis codes. In this study, MARS-MultiD was used to simulate Yang's experiment and obtain the local variables. The friction models used in MARS-MultiD were then validated by comparing the two-phase flow experimental results with the calculated local variables. Compared with the experimental results, the calculated results properly reproduced mass conservation, which can be inferred from the relation between liquid film velocity and thickness at the same flow rate. The magnitude and direction of the liquid film velocity, however, did not agree well with the experimental results. According to the results of Case-2, wall friction should be increased and interfacial friction decreased in MARS-MultiD. These results show that the friction models in MARS-MultiD need to be modified to simulate the two-phase cross flow.

  3. Validation of extended magnetohydrodynamic simulations of the HIT-SI3 experiment using the NIMROD code

    Science.gov (United States)

    Morgan, K. D.; Jarboe, T. R.; Hossack, A. C.; Chandra, R. N.; Everson, C. J.

    2017-12-01

    The HIT-SI3 experiment uses a set of inductively driven helicity injectors to apply a non-axisymmetric current drive on the edge of the plasma, driving an axisymmetric spheromak equilibrium in a central confinement volume. These helicity injectors drive a non-axisymmetric perturbation that oscillates in time, with the relative temporal phasing of the injectors modifying the mode structure of the applied perturbation. A set of three experimental discharges with different perturbation spectra are modelled using the NIMROD extended magnetohydrodynamics code, and comparisons are made to both magnetic and fluid measurements. These models successfully capture the bulk dynamics of both the perturbation and the equilibrium, though disagreements remain with the experimentally measured pressure gradients.

  4. Experiments for the validation of computer codes used to assess the protection factors afforded by dwellings

    International Nuclear Information System (INIS)

    Le Grand, J.; Roux, Y.; Kerlau, G.

    1988-09-01

    Two experimental campaigns were carried out to verify: 1) the method of assessing the mean kerma in a house used in the computer code BILL, which calculates the protection factor afforded by dwellings; 2) under what conditions the kerma calculated in cubic meshes of a given size (code PIECE) agreed with TLD measurements. To that purpose, a house was built near the caesium-137 source of the Ecosystem irradiator located at the Cadarache Nuclear Research Center. During the first campaign, four experiments with different house characteristics were conducted. Some 50 TLD locations describing the inhabitable volume were defined in order to obtain the mean kerma, and 16 locations were considered outside the house. During the second campaign, a cobalt-60 source was installed at the side. Only five measurement locations were defined, each with 6 TLDs. The results of the dosimetric measurements are presented and compared with the calculations of the two computer codes. The effects of wall heterogeneity were also studied. [fr]

  5. A Large-Scale Multibody Manipulator Soft Sensor Model and Experiment Validation

    Directory of Open Access Journals (Sweden)

    Wu Ren

    2014-01-01

    Full Text Available Stress signals are difficult to obtain in the health monitoring of a multibody manipulator. In order to solve this problem, a soft sensor method is presented. In this method, the stress signal is considered the dominant variable and the angle signal is regarded as the auxiliary variable. By establishing the mathematical relationship between them, a soft sensor model is proposed, in which the stress information can be deduced from angle information that is easily measured for such structures by experiment. Finally, tests under ground and wall working conditions were done on a multibody manipulator test rig. The results show that the stress calculated by the proposed method is close to the measured one; thus, the stress signal is easier to obtain than with the traditional method. All of this proves that the model is correct and the method is feasible.
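The soft-sensor idea above, inferring a hard-to-measure dominant variable (stress) from an easily measured auxiliary variable (angle) through a fitted model, can be sketched minimally as a regression. The linear relation and all numbers below are hypothetical; the paper derives its model from the manipulator's mechanics rather than a plain data fit:

```python
import numpy as np

# Hypothetical test-rig data: joint angle (auxiliary, easy to measure) and
# stress (dominant, hard to measure), assumed linearly related for this sketch.
angle = np.array([5.0, 10.0, 15.0, 20.0, 25.0])  # [deg]
stress = 3.2 * angle + 40.0                      # synthetic stress data [MPa]

# fit stress = k*angle + c by least squares; this is the "soft sensor"
k, c = np.polyfit(angle, stress, 1)

def estimated_stress(theta_deg):
    """Infer stress from an angle measurement via the fitted model."""
    return k * theta_deg + c

print(round(estimated_stress(18.0), 1))  # MPa
```

In operation, only the angle sensor is read; the model supplies the stress estimate for health monitoring.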

  6. Development and validation of the Overlap Muon Track Finder for the CMS experiment

    Science.gov (United States)

    Dobosz, J.; Mietki, P.; Zawistowski, K.; Żarnecki, G.

    2016-09-01

    This article describes the authors' contribution to the upgrade and performance analysis of the Level-1 Muon Trigger of the CMS experiment. The authors are students of the University of Warsaw and Gdansk University of Technology collaborating with the CMS Warsaw Group. The article summarises the students' work presented during the student session of the Workshop XXXVIII-th IEEE-SPIE Joint Symposium Wilga 2016. In the first section the CMS experiment is briefly described and the importance of the trigger system is explained; the basic difference between the old muon trigger strategy and the upgraded one is also shown. The second section is devoted to the Overlap Muon Track Finder (OMTF), one of the crucial components of the Level-1 Muon Trigger, and describes its algorithm. The third section discusses one of the event selection aspects - the cut on the muon transverse momentum pT. Sometimes a physical muon with pT above a certain threshold is unnecessarily cut while a physical muon with lower pT survives; to improve the pT selection, a modified algorithm was proposed and its performance was studied. One of the features of the OMTF is that one physical muon often results in several muon candidates; the Ghost-Buster algorithm is designed to eliminate the surplus candidates. The fourth section discusses this algorithm and its performance on different data samples. The fifth section briefly describes the Local Data Acquisition System (Local DAQ), which supports initial system commissioning, and the tests done with the OMTF Local DAQ. The sixth section describes the development of a web application used for the control and monitoring of CMS electronics; the application provides a graphical user interface for manual control and the connection to the hierarchical CMS Run Control.

  7. Estimates of Nutrient Loading by Ground-Water Discharge into the Lynch Cove Area of Hood Canal, Washington

    Science.gov (United States)

    Simonds, F. William; Swarzenski, Peter W.; Rosenberry, Donald O.; Reich, Christopher D.; Paulson, Anthony J.

    2008-01-01

    Field investigations show that ground-water discharge into the Lynch Cove area of Hood Canal is highly dynamic and strongly affected by the large tidal range. In areas with a steep shoreline and steep hydraulic gradient, ground-water discharge is spatially concentrated in or near the intertidal zone, with increased discharge during low tide. Topographically flat areas with weak hydraulic gradients had more spatial variability, including larger areas of seawater recirculation and more widely dispersed discharge. Measured total-dissolved-nitrogen concentrations in ground water ranged from below detection limits to 2.29 milligrams per liter, and the total load entering Lynch Cove was estimated to be approximately 98 ± 10.3 metric tons per year (MT/yr). This estimate is based on net freshwater seepage rates from Lee-type seepage meter measurements and can be compared to estimates derived from geochemical tracer mass balances (radon and radium) of 231 to 749 MT/yr, and previous water-mass-balance estimates (14 to 47 MT/yr). Uncertainty in these loading estimates is introduced by the complex biogeochemical cycles of the relevant nutrient species, the representativeness of measurement sites, and energetic dynamics at the coastal aquifer-seawater interface caused by tidal forcing.
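The load estimates above come down to discharge times concentration with unit conversions. A back-of-envelope sketch of that arithmetic; the discharge value is hypothetical, while 2.29 mg/L is the maximum total-dissolved-nitrogen concentration reported:

```python
# Nutrient load = freshwater discharge x nutrient concentration.
# 1 mg/L == 1 g/m^3, so (m^3/day)*(mg/L) gives g/day directly.

def load_mt_per_year(discharge_m3_per_day, conc_mg_per_l):
    grams_per_day = discharge_m3_per_day * conc_mg_per_l
    return grams_per_day * 365 / 1e6   # g -> metric tons, day -> year

# e.g. a hypothetical 120,000 m^3/day of fresh groundwater at 2.29 mg/L:
print(round(load_mt_per_year(120_000, 2.29), 1), "MT/yr")
```

The wide spread among the seepage-meter, tracer, and water-mass-balance estimates (14 to 749 MT/yr) reflects how sensitive this product is to the discharge term.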

  8. A systematic review of the reliability and validity of discrete choice experiments in valuing non-market environmental goods

    DEFF Research Database (Denmark)

    Rokotonarivo, Sarobidy; Schaafsma, Marije; Hockley, Neal

    2016-01-01

    reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2...

  9. Validation of CFD simulation of recoilless EOD water cannon by firing experiments with high speed camera

    Science.gov (United States)

    Chantrasmi, Tonkid; Hongthong, Premsiri; Kongkaniti, Manop

    2018-01-01

    Water cannons used by Explosive Ordnance Disposal (EOD) teams are designed to propel a burst of water at high speed toward a target to disrupt an improvised explosive device (IED). The cannon can be mounted on a remotely controlled robot, so it is highly desirable for it to be recoilless so as not to damage the robot when firing. In previous work, a nonconventional design of the water cannon was conceived: the recoil was greatly reduced by backward sprays of water through a ring of slotted holes around the muzzle. This minimizes the need to manufacture new parts by using off-the-shelf components for everything except the tailor-made muzzle. The design was then investigated numerically by a series of Computational Fluid Dynamics (CFD) simulations. In this work, a high-speed camera was employed in firing experiments to capture the motion of the water jet and the backward sprays. The experimental data were found to agree well with the simulation results in terms of averaged exit velocities.
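The comparison metric above, an averaged exit velocity from high-speed footage, is typically obtained by tracking the jet front frame by frame and dividing displacement by the inter-frame interval. A sketch with hypothetical frame rate and positions (not the paper's data):

```python
# Estimate averaged jet exit velocity from high-speed camera frames.
FPS = 10_000          # hypothetical camera frame rate [frames/s]
dt = 1.0 / FPS        # inter-frame interval [s]

# hypothetical jet-front positions [m] in successive frames
front_pos = [0.000, 0.012, 0.0235, 0.0355, 0.047]

# per-interval velocities, then the average compared against CFD
velocities = [(b - a) / dt for a, b in zip(front_pos, front_pos[1:])]
avg_exit_velocity = sum(velocities) / len(velocities)
print(round(avg_exit_velocity, 1), "m/s")
```

In practice the front position would come from image processing of each frame (e.g. thresholding the jet against the background) with a pixel-to-metre calibration.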

  10. Validation of NO2 and NO from the Atmospheric Chemistry Experiment (ACE)

    Directory of Open Access Journals (Sweden)

    M. Schneider

    2008-10-01

    Full Text Available Vertical profiles of NO2 and NO have been obtained from solar occultation measurements by the Atmospheric Chemistry Experiment (ACE), using an infrared Fourier transform spectrometer (ACE-FTS) and, for NO2, an ultraviolet-visible-near-infrared spectrometer, MAESTRO (Measurement of Aerosol Extinction in the Stratosphere and Troposphere Retrieved by Occultation). In this paper, the quality of the ACE-FTS version 2.2 NO2 and NO and the MAESTRO version 1.2 NO2 data are assessed using other solar occultation measurements (HALOE, SAGE II, SAGE III, POAM III, SCIAMACHY), stellar occultation measurements (GOMOS), limb measurements (MIPAS, OSIRIS), nadir measurements (SCIAMACHY), balloon-borne measurements (SPIRALE, SAOZ) and ground-based measurements (UV-VIS, FTIR). Time differences between the comparison measurements were reduced using either a tight coincidence criterion or, where possible, chemical box models. ACE-FTS NO2 and NO and the MAESTRO NO2 are generally consistent with the correlative data. The ACE-FTS and MAESTRO NO2 volume mixing ratio (VMR) profiles agree with the profiles from other satellite data sets to within about 20% between 25 and 40 km, with the exception of MIPAS ESA (for ACE-FTS) and SAGE II (for ACE-FTS (sunrise) and MAESTRO), and suggest a negative bias between 23 and 40 km of about 10%. MAESTRO reports larger VMR values than the ACE-FTS. In comparisons with HALOE, ACE-FTS NO VMRs typically (on average) agree to ±8% from 22 to 64 km and to +10% from 93 to 105 km, with maxima of 21% and 36%, respectively. Partial column comparisons for NO2 show that there is quite good agreement between the ACE instruments and the FTIRs, with a mean difference of +7.3% for ACE-FTS and +12.8% for MAESTRO.
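Percentage agreements like those quoted above are typically computed as mean relative differences between coincident VMR profiles, normalized by the mean of the two instruments so neither is privileged. A sketch with synthetic profiles (not ACE data):

```python
# Mean relative difference between two coincident VMR profiles, in percent,
# using the symmetric normalization 200*(a - b)/(a + b).
ace = [4.1, 5.0, 6.3, 7.2]   # synthetic VMRs at matched altitudes
ref = [4.3, 5.2, 6.0, 7.5]   # synthetic reference-instrument VMRs

rel_diff = [200.0 * (a - r) / (a + r) for a, r in zip(ace, ref)]
mean_rel_diff = sum(rel_diff) / len(rel_diff)
print(round(mean_rel_diff, 2), "%")
```

A negative result indicates the first instrument reads low relative to the reference, analogous to the ~10% negative bias noted for the NO2 comparisons.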

  11. Validation of the factor structure of the adolescent dissociative experiences scale in a sample of trauma-exposed detained youth.

    Science.gov (United States)

    Kerig, Patricia K; Charak, Ruby; Chaplo, Shannon D; Bennett, Diana C; Armour, Cherie; Modrowski, Crosby A; McGee, Andrew B

    2016-09-01

    The inclusion of a dissociative subtype in the Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) criteria for the diagnosis of posttraumatic stress disorder (PTSD) has highlighted the need for valid and reliable measures of dissociative symptoms across developmental periods. The Adolescent Dissociative Experiences Scale (A-DES) is one of the few measures validated for young persons, but previous studies have yielded inconsistent results regarding its factor structure. Further, research to date on the A-DES has been based on nonclinical samples of youth or those without a known history of trauma. To address these gaps in the literature, the present study investigated the factor structure and construct validity of the A-DES in a sample of highly trauma-exposed youth involved in the juvenile justice system. A sample of 784 youth (73.7% boys) recruited from a detention center completed self-report measures of trauma exposure and the A-DES, a subset of whom (n = 212) also completed a measure of PTSD symptoms. Confirmatory factor analyses revealed a best-fitting 3-factor structure comprised of depersonalization or derealization, amnesia, and loss of conscious control, with configural and metric invariance across gender. Logistic regression analyses indicated that the depersonalization or derealization factor effectively distinguished between those youth who did and did not likely meet criteria for a diagnosis of PTSD, as well as those with PTSD who did and did not likely meet criteria for the dissociative subtype. These results provide support for the multidimensionality of the construct of posttraumatic dissociation and contribute to the understanding of the dissociative subtype of PTSD among adolescents. (PsycINFO Database Record (c) 2016 APA, all rights reserved)

  12. Model and experiences of initiating collaboration with traditional healers in validation of ethnomedicines for HIV/AIDS in Namibia

    Directory of Open Access Journals (Sweden)

    Chinsembu Kazhila C

    2009-10-01

    Full Text Available Many people with Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) in Namibia have access to antiretroviral drugs, but some still use traditional medicines to treat opportunistic infections and offset side-effects from antiretroviral medication. Namibia has a rich biodiversity of indigenous plants that could contain novel anti-HIV agents. However, such medicinal plants have not been identified and properly documented, and the various ethnomedicines used to treat HIV/AIDS opportunistic infections have not been scientifically validated for safety and efficacy. These limitations are mostly attributable to the lack of collaboration between biomedical scientists and traditional healers. This paper presents a five-step contextual model for initiating collaboration with Namibian traditional healers so that candidate plants that may contain novel anti-HIV agents can be identified, and traditional medicines used to treat HIV/AIDS opportunistic infections can be subjected to scientific validation. The model includes key structures and processes used to initiate collaboration with traditional healers in Namibia; namely, the National Biosciences Forum, a steering committee with the University of Namibia (UNAM) as the focal point, a study tour to Zambia and South Africa where other collaborative frameworks were examined, commemorations of the African Traditional Medicine Day (ATMD), and consultations with stakeholders in north-eastern Namibia. Experiences from these structures and processes are discussed. All traditional healers in north-eastern Namibia were willing to collaborate with UNAM so that their traditional medicines could be subjected to scientific validation. The current study provides a framework for future collaboration with traditional healers and the selection of candidate anti-HIV medicinal plants and ethnomedicines for scientific testing in Namibia.

  13. New set of convective heat transfer coefficients established for pools and validated against CLARA experiments for application to corium pools

    Energy Technology Data Exchange (ETDEWEB)

    Michel, B., E-mail: benedicte.michel@irsn.fr

    2015-05-15

    Highlights: • A new set of 2D convective heat transfer correlations is proposed. • It takes into account different horizontal and lateral superficial velocities. • It is based on previously established correlations. • It is validated against recent CLARA experiments. • It has to be implemented in a 0D MCCI (molten core concrete interaction) code. - Abstract: During a hypothetical Pressurized Water Reactor (PWR) or Boiling Water Reactor (BWR) severe accident with core meltdown and vessel failure, corium would fall directly onto the concrete reactor pit basemat if no water is present. The high temperature of the corium pool, maintained by the residual power, would lead to the erosion of the concrete walls and basemat of the reactor pit. The thermal decomposition of concrete would release a significant amount of gases that modify the corium pool thermal hydraulics. In particular, it would affect the heat transfers between the corium pool and the concrete, which determine the reactor pit ablation kinetics. A new set of convective heat transfer coefficients in a pool with different lateral and horizontal superficial gas velocities is modeled and validated against the recent CLARA experimental program. 155 tests of this program, in two size configurations and over a wide range of investigated viscosities, have been used to validate the model. Then, a method to define different lateral and horizontal superficial gas velocities in a 0D code is proposed, together with a discussion of the possible viscosity in the reactor case when the pool is semi-solid. This model is going to be implemented in the 0D ASTEC/MEDICIS code in order to determine the impact of convective heat transfer on concrete ablation by corium.

  14. Validity And Practicality of Experiment Integrated Guided Inquiry-Based Module on Topic of Colloidal Chemistry for Senior High School Learning

    Science.gov (United States)

    Andromeda, A.; Lufri; Festiyed; Ellizar, E.; Iryani, I.; Guspatni, G.; Fitri, L.

    2018-04-01

    This Research & Development study aims to produce a valid and practical experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry. The 4D instructional design model was selected for this study. A limited trial of the product was conducted at SMAN 7 Padang. The instruments used were validity and practicality questionnaires, whose data were analyzed using the Kappa moment. The Kappa moment for validity was 0.88, indicating a very high degree of validity; the Kappa moments for practicality from students and teachers were 0.89 and 0.95 respectively, indicating a high degree of practicality. Analysis of the modules filled in by students shows that 91.37% of students correctly answered the critical thinking, exercise, prelab, postlab and worksheet questions in the module. These findings indicate that the experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry is valid and practical for chemistry learning in senior high school.
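
    The Kappa moment used to score the questionnaires can be computed as below. The formulation (realized vs. unrealized proportion) is one common form in this strand of R&D studies, and the score totals are hypothetical:

```python
# Kappa moment for instrument validity/practicality scoring.
# Score totals are hypothetical; the rho/rho_e formulation is one
# common form, not necessarily the exact one used in this study.

def kappa_moment(score, max_score):
    rho = score / max_score                   # realized proportion
    rho_e = (max_score - score) / max_score   # unrealized proportion
    return (rho - rho_e) / (1.0 - rho_e)

k_validity = kappa_moment(178, 200)   # ~0.88, in the 'very high' band
```

Values of the Kappa moment above about 0.8 are conventionally read as a very high degree of validity or practicality in this literature.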

  15. The Tromso Infant Faces Database (TIF): Development, Validation and Application to Assess Parenting Experience on Clarity and Intensity Ratings.

    Science.gov (United States)

    Maack, Jana K; Bohne, Agnes; Nordahl, Dag; Livsdatter, Lina; Lindahl, Åsne A W; Øvervoll, Morten; Wang, Catharina E A; Pfuhl, Gerit

    2017-01-01

    Newborns and infants depend heavily on successfully communicating their needs, e.g., through crying and facial expressions. Although there is growing interest in the mechanisms of, and possible influences on, the recognition of facial expressions in infants, until now no validated database of emotional infant faces has existed. In the present article we introduce a standardized and freely available face database containing Caucasian infant face images from 18 infants 4 to 12 months old. The development and validation of the Tromsø Infant Faces (TIF) database is presented in Study 1. Over 700 adults categorized the photographs into seven emotion categories (happy, sad, disgusted, angry, afraid, surprised, neutral) and rated their intensity, clarity and valence. To examine the relevance of TIF, we then present its first application in Study 2, investigating differences in emotion recognition across different stages of parenthood. We found a small gender effect, with women giving higher intensity and clarity ratings than men. Moreover, parents of young children rated the images as clearer than all the other groups, and parents rated "neutral" expressions as clearer and more intense. Our results suggest that caretaking experience provides an implicit advantage in the processing of emotional expressions in infant faces, especially for the more difficult, ambiguous expressions.

  16. Measures of aggression and victimization in portuguese adolescents: Cross-cultural validation of the Revised Peer Experience Questionnaire.

    Science.gov (United States)

    Queirós, Andreia N; Vagos, Paula

    2016-10-01

    The goal of this research was to develop and psychometrically evaluate the Portuguese version of the Revised Peer Experience Questionnaire, which assesses aggression, victimization and prosocial behavior. Victimization and aggression among adolescents in school settings are a growing problem, not yet fully understood or properly evaluated, particularly in Portugal. A sample of 1320 adolescents was recruited (52.7% female), aged 10 to 18 years and attending middle and high school. Confirmatory factor analysis confirmed the measurement model of the instrument's bully and victim versions, evaluating overt, relational, and reputational aggression/victimization and providing/receiving prosocial behavior, respectively. This measurement model was invariant across schooling and gender, showed adequate internal consistency indicators, and presented evidence for construct validity in relation to other variables. Descriptive analyses indicate that boys are more aggressive in overt and relational forms and are victimized through overt aggression, whereas girls are more aggressive and victimized relationally. More than any form of aggression or victimization, boys and girls reported higher values for engaging in and receiving prosocial behavior. These results suggest that this instrument is a reliable, valid, and structurally sound measure of aggression, victimization and prosocial behavior in this Portuguese school-based community sample. Hence, its use may assist researchers in gaining a better understanding of adolescent aggression and victimization. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Light ion fusion experiment (L.I.F.E.) concept validation studies. Final report, July 1979-May 1980

    International Nuclear Information System (INIS)

    Christensen, T.E.; Orthel, J.L.; Thomson, J.J.

    1980-12-01

    This report reflects the considerable advances made toward the objectives of the contractual program: validating, by detailed analytical studies, the concept of a new Light Ion Fusion Experiment for Inertial Confinement Fusion. The studies have produced an analytical design of a novel electrostatic accelerator based on separate-function and strong channel focusing principles, to launch 3 to 10 MeV, 23 kA, He+ neutralized beams in 400 ns pulses, delivering 50 kJ of implosion energy in approx. 20 ns impact times onto a 5 mm radius target located 10 m downstream. The control, stability and focusing of the beams is achieved by electrostatic quadrupoles, producing an overall beam normalized emittance of approx. 3 x 10^-5 m-rad.

  18. Site characterization and validation - equipment design and techniques used in single borehole hydraulic testing, simulated drift experiment and crosshole testing

    International Nuclear Information System (INIS)

    Holmes, D.C.; Sehlstedt, M.

    1991-10-01

    This report describes the equipment and techniques used to investigate the variation of hydrogeological parameters within a fractured crystalline rock mass. The testing program was performed during stage 3 of the site characterization and validation programme at the Stripa mine in Sweden. This programme used a multidisciplinary approach, combining geophysical, geological and hydrogeological methods, to determine how groundwater moved through the rock mass. The hydrogeological work package involved three components. Firstly, novel single borehole techniques (focused packer testing) were used to determine the distribution of hydraulic conductivity and head along individual boreholes. Secondly, water was abstracted from boreholes which were drilled to simulate a tunnel (simulated drift experiment). Locations and magnitudes of flows were measured together with pressure responses at various points in the SCV rock mass. Thirdly, small scale crosshole tests, involving detailed interference testing, were used to determine the variability of hydrogeological parameters within previously identified, significant flow zones. (au)

  19. Fundamental validation of simulation method for thermal stratification in upper plenum of fast reactors. Analysis of sodium experiment

    International Nuclear Information System (INIS)

    Ohno, Shuji; Ohshima, Hiroyuki; Sugahara, Akihiro; Ohki, Hiroshi

    2010-01-01

    Three-dimensional thermal-hydraulic analyses have been carried out for a sodium experiment in a relatively simple axisymmetric geometry using a commercial CFD code, in order to validate simulation methods for thermal stratification behavior in the upper plenum of a sodium-cooled fast reactor. Detailed comparison between simulated results and experimental measurements demonstrated that the code reproduces fairly well the fundamental thermal stratification behaviors, such as the vertical temperature gradient and the upward movement of the stratification interface, when a high-order discretization scheme and appropriate mesh size are used. Furthermore, the investigation clarified the influence of RANS-type turbulence models (the standard k-ε model, the RNG k-ε model and the Reynolds Stress Model) on the predictability of the phenomena. (author)

  20. Three-dimensional fuel pin model validation by prediction of hydrogen distribution in cladding and comparison with experiment

    Energy Technology Data Exchange (ETDEWEB)

    Aly, A. [North Carolina State Univ., Raleigh, NC (United States); Avramova, Maria [North Carolina State Univ., Raleigh, NC (United States); Ivanov, Kostadin [Pennsylvania State Univ., University Park, PA (United States); Motta, Arthur [Pennsylvania State Univ., University Park, PA (United States); Lacroix, E. [Pennsylvania State Univ., University Park, PA (United States); Manera, Annalisa [Univ. of Michigan, Ann Arbor, MI (United States); Walter, D. [Univ. of Michigan, Ann Arbor, MI (United States); Williamson, R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gamble, K. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-10-29

    To correctly describe and predict the hydrogen distribution in the cladding, multi-physics coupling is needed to provide accurate three-dimensional azimuthal, radial, and axial temperature distributions in the cladding. High-fidelity reactor-physics codes coupled with a sub-channel code, as well as with a computational fluid dynamics (CFD) tool, have been used to calculate detailed temperature distributions. These high-fidelity coupled neutronics/thermal-hydraulics code systems are coupled further with the BISON fuel-performance code using a kernel (module) for hydrogen. Both hydrogen migration and precipitation/dissolution are included in the model. Results from this multi-physics analysis are validated against calculations of hydrogen distribution using models informed by data from hydrogen experiments and PIE data.
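
    The hydrogen migration the model describes combines Fickian diffusion with thermo-diffusion (Soret) transport down the temperature gradient. A minimal 1-D finite-volume sketch under assumed, nominal parameters (not BISON inputs):

```python
import numpy as np

# Illustrative 1-D hydrogen redistribution in cladding: Fick diffusion plus
# a Soret (thermo-diffusion) term, the two transport effects named above.
# All parameter values are nominal assumptions, not BISON inputs.
R  = 8.314     # gas constant [J/mol/K]
Q  = 25.0e3    # heat of transport [J/mol] (assumed)
D  = 1.0e-11   # hydrogen diffusivity [m^2/s] (assumed constant)
dt = 1.0       # time step [s]

n  = 50
x  = np.linspace(0.0, 6.0e-4, n)      # ~0.6 mm cladding thickness
dx = x[1] - x[0]
T  = 600.0 - 50.0 * x / x[-1]         # hot inner surface, cold outer surface [K]
C  = np.full(n, 100.0)                # uniform initial concentration (wppm)
dTdx = -50.0 / x[-1]                  # constant temperature gradient [K/m]

for _ in range(20000):                # explicit finite-volume stepping
    Cm = 0.5 * (C[1:] + C[:-1])       # face-centered concentration
    Tm = 0.5 * (T[1:] + T[:-1])       # face-centered temperature
    J  = -D * ((C[1:] - C[:-1]) / dx + Q * Cm / (R * Tm**2) * dTdx)
    flux = np.concatenate(([0.0], J, [0.0]))   # sealed surfaces: zero flux
    C -= dt * (flux[1:] - flux[:-1]) / dx

print(C[-1] > C[0])   # hydrogen accumulates at the cold (outer) side
```

The sealed-boundary finite-volume update conserves total hydrogen while the Soret term drives it toward the cold outer cladding, the qualitative behavior the coupled temperature fields are needed to predict.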

  1. Mini-channel flow experiments and CFD validation analyses with the IFMIF Thermo- Hydraulic Experimental facility (ITHEX)

    International Nuclear Information System (INIS)

    Arbeiter, F.; Heinzel, V.; Leichtle, D.; Stratmanns, E.; Gordeev, S.

    2006-01-01

    The design of the IFMIF High Flux Test Module (HFTM) is based on predictions for the heat transfer in narrow channels conducting helium flow at 50 °C inlet temperature and 0.3 MPa. The resulting helium flow conditions are in the laminar-to-turbulent transition regime. The rectangular cooling channels are too short for the coolant flow to develop fully, and relaminarization along the cooling passage is expected. At the shorter sides of the channels secondary flow occurs, which may affect the temperature field inside the irradiation specimen stack. As these conditions are not covered by available experimental data, the dedicated gas loop ITHEX has been constructed to operate at pressures up to 0.42 MPa and temperatures up to 200 °C. Its objective is to conduct experiments for the validation of the STAR-CD CFD code used for the design of the HFTM. As a first stage, two annular test-sections with a hydraulic diameter of 1.2 mm have been used, with experiments varied with respect to gas species (N2, He), inlet pressure, dimensionless heating span and Reynolds number, encompassing the range of operational parameters of the HFTM. Local friction factors and Nusselt numbers have been obtained, giving evidence that the transition regime extends to Reynolds numbers of about 10,000. For heating rates comparable to the HFTM filled with RAFM steels, local heat transfer coefficients are consistent with the measured friction data. To validate local velocity profiles, the ITHEX facility was further equipped with a flat rectangular test-section and a Laser Doppler Anemometry (LDA) system. An appropriate optical system has been developed and tested for the tiny observation volume of 40 μm diameter. Velocity profiles induced by the transition from a wide inlet plenum to the flat mini-channels have been measured. Whereas the CFD models were able to reproduce the patterns far away from the nozzle, they show some disagreement for the conditions at the
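
    The data reduction behind "local friction factors and Nusselt numbers" follows standard definitions; a sketch, with helium-like property values and "measured" quantities that are purely illustrative, not ITHEX data:

```python
# Standard reduction of mini-channel measurements to dimensionless groups.
# Property values and measured quantities below are illustrative assumptions.

def reynolds(rho, u, d_h, mu):
    """Reynolds number from density, bulk velocity, hydraulic diameter, viscosity."""
    return rho * u * d_h / mu

def darcy_friction(dp_dx, rho, u, d_h):
    """Darcy friction factor from Darcy-Weisbach: dp/dx = f * rho*u**2 / (2*d_h)."""
    return dp_dx * 2.0 * d_h / (rho * u**2)

def nusselt(q_wall, t_wall, t_bulk, d_h, k):
    """Local Nusselt number from wall heat flux and wall-to-bulk temperature difference."""
    h = q_wall / (t_wall - t_bulk)   # local heat transfer coefficient [W/m2/K]
    return h * d_h / k

rho, mu, k = 0.45, 2.0e-5, 0.15      # helium-like at ~0.3 MPa, 50 degC (assumed)
d_h = 1.2e-3                         # 1.2 mm hydraulic diameter, as in the test-section
Re = reynolds(rho, 60.0, d_h, mu)
f  = darcy_friction(2.7e4, rho, 60.0, d_h)
Nu = nusselt(3.0e4, 420.0, 330.0, d_h, k)
```

Plotting f and Nu against Re over many operating points is what reveals where the laminar-turbulent transition ends, as in the Reynolds ~10,000 finding above.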

  2. Summary of public participation : Environmental impact assessment : Proposal by the New Brunswick Power Commission to refurbish the Coleson Cove generating station

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-09-01

    A proposal was made by New Brunswick (NB) Power to the Public Utilities Board (PUB) for the refurbishment of the Coleson Cove generating station. The PUB determined that a requirement existed for 1050 megawatts (MW) of power at Coleson Cove. This document presents a summary of public participation. Two meetings were held in support of the Environmental Impact Assessment (EIA), at which several concerns were raised by various groups. Some of the issues discussed included: (1) Orimulsion® fuel, reputed to be the dirtiest fuel in the world; (2) fuel supply; (3) the project agenda; and (4) project costs, among others. It appeared that most of the participants in the public consultation process were against the proposal. It was felt that the emissions of sulphur dioxide and nitrous oxide should be considered in the greater context of reducing present and future emissions of greenhouse gases. The participants recommended that the New Brunswick government support a 400 MW combined-cycle natural gas turbine unit instead of the proposal under review. Much opposition to the project concerned the potential degradation of the environment and the health of citizens. Environmental representatives were concerned that the Solid Waste Management Area would be located in the vicinity of a proposed marine and wildlife sanctuary. Industry representatives were eager for the opportunities offered by the proposal. refs.

  3. Proposed stratotype for the base of the highest Cambrian stage at the first appearance datum of Cordylodus andresi, Lawson Cove section, Utah, USA

    Science.gov (United States)

    Miller, J.F.; Ethington, Raymond L.; Evans, K.R.; Holmer, L.E.; Loch, James D.; Popov, L.E.; Repetski, J.E.; Ripperdan, R.L.; Taylor, John F.

    2006-01-01

    We propose a candidate for the Global Standard Stratotype-section and Point (GSSP) for the base of the highest stage of the Furongian Series of the Cambrian System. The section is at Lawson Cove in the Ibex area of Millard County, Utah, USA. The marker horizon is the first appearance datum (FAD) of the conodont Cordylodus andresi Viira et Sergeyeva in Kaljo et al. [Kaljo, D., Borovko, N., Heinsalu, H., Khazanovich, K., Mens, K., Popov, L., Sergeyeva, S., Sobolevskaya, R., Viira, V., 1986. The Cambrian-Ordovician boundary in the Baltic-Ladoga clint area (North Estonia and Leningrad Region, USSR). Eesti NSV Teaduste Akadeemia Toimetised. Geologia 35, 97-108]. At this section and elsewhere this horizon also is the FAD of the trilobite Eurekia apopsis (Winston et Nicholls, 1967). This conodont characterizes the base of the Cordylodus proavus Zone, which has been recognized in many parts of the world. This trilobite characterizes the base of the Eurekia apopsis Zone, which has been recognized in many parts of North America. The proposed boundary is 46.7 m above the base of the Lava Dam Member of the Notch Peak Formation at the Lawson Cove section. Brachiopods, sequence stratigraphy, and carbon-isotope geochemistry are other tools that characterize this horizon and allow it to be recognized in other areas. © 2006 Nanjing Institute of Geology and Palaeontology, CAS.

  4. Evaluating the experiences and support needs of people living with chronic cancer: development and initial validation of the Chronic Cancer Experiences Questionnaire (CCEQ).

    Science.gov (United States)

    Harley, Clare; Pini, Simon; Kenyon, Lucille; Daffu-O'Reilly, Amrit; Velikova, Galina

    2016-08-10

    Many advanced cancers are managed as chronic diseases, yet there are currently no international guidelines for the support of patients living with chronic cancer. It is important to understand whether care and service arrangements meet the needs of this rapidly growing patient group. This study aimed to develop and validate a questionnaire to capture patients' experiences of living with chronic cancer and their views of clinical and support services. The research was carried out between 1 July 2010 and 21 February 2013. A conceptual framework and initial item bank were derived from prior interviews with 56 patients with chronic cancer. Items were reviewed by 4 oncologists and 1 clinical nurse specialist and during 2 focus groups with 9 patients. Pilot questionnaires were completed by 416 patients across 5 cancer units. Item selection and scale reliability were explored using descriptive data, exploratory factor analysis, internal consistency analyses, multitrait scaling analyses and known-groups comparisons. The final Chronic Cancer Experiences Questionnaire (CCEQ) includes 75 items. 62 items contribute to 14 subscales with internal consistency between α = 0.68 and 0.88 and minimal scaling errors. Known-groups comparisons confirmed subscale utility in distinguishing between patient groups. Subscales were labelled: managing appointments, coordination of care, general practitioner involvement, clinical trials, information and questions, making treatment decisions, symptom non-reporting, key worker, limitations, sustaining normality, financial advice, worries and anxieties, sharing feelings with others, and accessing support. 13 items assessing symptom experiences were retained as single items. The CCEQ has the potential to be used as a clinical instrument to assess patient experiences of chronic cancer or to screen for patient needs. It may also be used as an outcome measure for evaluating programmes and models of care and may identify areas for service development that

  5. Studies on calibration and validation of data provided by the Global Ozone Monitoring Experiment GOME on ERS-2 (CAVEAT). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Burrows, J.P.; Kuenzi, K.; Ladstaetter-Weissenmayer, A.; Langer, J. [Bremen Univ. (Germany). Inst. fuer Umweltphysik; Neuber, R.; Eisinger, M. [Alfred-Wegener-Institut fuer Polar- und Meeresforschung, Potsdam (Germany)

    2000-04-01

    The Global Ozone Monitoring Experiment (GOME) was launched on 21 April 1995 as one of six scientific instruments on board the second European remote sensing satellite (ERS-2) of ESA. The investigations presented here aimed at assessing and improving the accuracy of the GOME measurements of sun-normalized and absolute radiance and of the derived data products. For this purpose, the GOME data were compared with measurements from ground-based, airborne and satellite-borne systems. For scientific reasons, the measurements focused on the middle and high latitudes of both hemispheres, although equatorial regions were investigated as well. In the first stage, operational data products of GOME were validated, i.e. radiance measurements (spectra, Level 1 product) and trace gas column densities (Level 2 product).

  6. ER-2 #809 on the SAGE III Ozone Loss and Validation Experiment (SOLVE) with pilot Dee Porter prepari

    Science.gov (United States)

    2000-01-01

    Lockheed Martin pilot Dee Porter climbs the ladder in a heavy tan pressure suit, preparing to board NASA ER-2 #809 at Kiruna, Sweden, for the third flight of the SAGE III Ozone Loss and Validation Experiment. Assisting him is Jim Sokolik, a Lockheed Martin life support technician. Number 809, one of Dryden's two high-flying ER-2 Airborne Science aircraft (a civilian variant of Lockheed's U-2), and another NASA flying laboratory, Dryden's DC-8, were based north of the Arctic Circle in Kiruna, Sweden, during the winter of 2000 to study ozone depletion as part of the SAGE III Ozone Loss and Validation Experiment (SOLVE). A large hangar built especially for research, 'Arena Arctica', housed the instrumented aircraft and the scientists. Scientists have observed unusually low levels of ozone over the Arctic during recent winters, raising concerns that ozone depletion there could become more widespread, as in the Antarctic ozone hole. The NASA-sponsored international mission took place between November 1999 and March 2000 and was divided into three phases. The DC-8 was involved in all three phases, returning to Dryden between each phase. The ER-2 flew sample collection flights between January and March, remaining in Sweden from Jan. 9 through March 16. 'The collaborative campaign will provide an immense new body of information about the Arctic stratosphere,' said program scientist Dr. Michael Kurylo, NASA Headquarters. 'Our understanding of the Earth's ozone will be greatly enhanced by this research.' ER-2s bearing tail numbers 806 and 809 are used as airborne science platforms by NASA's Dryden Flight Research Center. The aircraft are platforms for a variety of high-altitude science missions flown over various parts of the world. They are also used for earth science and atmospheric sensor research and development, satellite calibration and data validation. The ER-2s are capable of carrying a maximum payload of 2,600 pounds of experiments in a nose bay, the main

  7. Development and initial validation of the Parental PELICAN Questionnaire (PaPEQu)--an instrument to assess parental experiences and needs during their child's end-of-life care.

    Science.gov (United States)

    Zimmermann, Karin; Cignacco, Eva; Eskola, Katri; Engberg, Sandra; Ramelet, Anne-Sylvie; Von der Weid, Nicolas; Bergstraesser, Eva

    2015-12-01

    To develop and test the Parental PELICAN Questionnaire, an instrument to retrospectively assess parental experiences and needs during their child's end-of-life care. To offer appropriate care for dying children, healthcare professionals need to understand the illness experience from the family perspective. A questionnaire specific to the end-of-life experiences and needs of parents losing a child is needed to evaluate the perceived quality of paediatric end-of-life care. This is an instrument development study applying mixed methods, based on recommendations for questionnaire design and validation. The Parental PELICAN Questionnaire was developed in four phases between August 2012 and March 2014: phase 1, item generation; phase 2, validity testing; phase 3, translation; phase 4, pilot testing. Psychometric properties were assessed after administering the Parental PELICAN Questionnaire to a sample of 224 bereaved parents in April 2014. Validity testing covered evidence based on tests of content, internal structure and relations to other variables. The Parental PELICAN Questionnaire consists of approximately 90 items in four slightly different versions accounting for particularities of the four diagnostic groups. The questionnaire's items were structured according to six quality domains described in the literature. Evidence of initial validity and reliability was demonstrated with the involvement of healthcare professionals and bereaved parents. The Parental PELICAN Questionnaire holds promise as a measure of parental experiences and needs and is applicable to a broad range of paediatric specialties and settings. Future validation is needed to evaluate its suitability in different cultures. © 2015 John Wiley & Sons Ltd.

  8. Numerical Simulation of Tuff Dissolution and Precipitation Experiments: Validation of Thermal-Hydrologic-Chemical (THC) Coupled-Process Modeling

    Science.gov (United States)

    Dobson, P. F.; Kneafsey, T. J.

    2001-12-01

    As part of an ongoing effort to evaluate THC effects on flow in fractured media, we performed a laboratory experiment and numerical simulations to investigate mineral dissolution and precipitation. To replicate mineral dissolution by condensate in fractured tuff, deionized water equilibrated with carbon dioxide was flowed for 1,500 hours through crushed Yucca Mountain tuff at 94 °C. The reacted water was collected and sampled for major dissolved species, total alkalinity, electrical conductivity, and pH. The resulting steady-state fluid composition had a total dissolved solids content of about 140 mg/L; silica was the dominant dissolved constituent. A portion of the steady-state reacted water was flowed at 10.8 mL/hr into a 31.7-cm tall, 16.2-cm wide vertically oriented planar fracture with a hydraulic aperture of 31 microns in a block of welded Topopah Spring tuff that was maintained at 80 °C at the top and 130 °C at the bottom. The fracture began to seal within five days. A 1-D plug-flow model using the TOUGHREACT code developed at Berkeley Lab was used to simulate mineral dissolution, and a 2-D model was developed to simulate the flow of mineralized water through a planar fracture, where boiling conditions led to mineral precipitation. Predicted concentrations of the major dissolved constituents for the tuff dissolution were within a factor of 2 of the measured average steady-state compositions. The fracture-plugging simulations result in the precipitation of amorphous silica at the base of the boiling front, leading to a hundred-fold decrease in fracture permeability in less than 6 days, consistent with the laboratory experiment. These results help validate the use of the TOUGHREACT code for THC modeling of the Yucca Mountain system. The experiment and simulations indicate that boiling and concomitant precipitation of amorphous silica could cause significant reductions in fracture porosity and permeability on a local scale. The TOUGHREACT code will be used
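
    The reported hundred-fold permeability decrease is consistent with a cubic-law view of the fracture, in which permeability scales with the square of the hydraulic aperture. A minimal sketch (the reduced aperture is illustrative; only the 31-micron starting value comes from the text):

```python
# Cubic-law (parallel-plate) fracture permeability: k = b**2 / 12.
# Shrinking the hydraulic aperture b tenfold by silica precipitation
# gives the ~100x permeability drop reported in the abstract.

def fracture_permeability(b):
    """Parallel-plate fracture permeability [m^2] for hydraulic aperture b [m]."""
    return b**2 / 12.0

b0 = 31e-6                            # initial hydraulic aperture: 31 microns
k0 = fracture_permeability(b0)
k1 = fracture_permeability(b0 / 10.0) # aperture after precipitation (assumed)
ratio = k0 / k1                       # ~100x permeability reduction
```

This quadratic aperture sensitivity is why even a thin amorphous-silica lining at the boiling front seals the fracture so quickly.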

  9. Validation of multigroup neutron cross sections and calculational methods for the advanced neutron source against the FOEHN critical experiments measurements

    International Nuclear Information System (INIS)

    Smith, L.A.; Gallmeier, F.X.; Gehin, J.C.

    1995-05-01

    The FOEHN critical experiment was analyzed to validate the use of multigroup cross sections and Oak Ridge National Laboratory neutronics computer codes in the design of the Advanced Neutron Source. The ANSL-V 99-group master cross section library was used for all the calculations. Three different critical configurations were evaluated using the multigroup KENO Monte Carlo transport code, the multigroup DORT discrete ordinates transport code, and the multigroup diffusion theory code VENTURE. The simple configuration consists of only the fuel and control elements with the heavy water reflector. The intermediate configuration includes boron endplates at the upper and lower edges of the fuel element. The complex configuration includes both the boron endplates and components in the reflector. Cross sections were processed using modules from the AMPX system. Both 99-group and 20-group cross sections were created and used in two-dimensional models of the FOEHN experiment. KENO calculations were performed using both 99-group and 20-group cross sections. The DORT and VENTURE calculations were performed using 20-group cross sections. Because the simple and intermediate configurations are azimuthally symmetric, these configurations can be explicitly modeled in R-Z geometry. Since the reflector components cannot be modeled explicitly using the current versions of these codes, three reflector component homogenization schemes were developed and evaluated for the complex configuration. Power density distributions were calculated with KENO using 99-group cross sections and with DORT and VENTURE using 20-group cross sections. The average differences between the measured values and the values calculated with the different computer codes range from 2.45 to 5.74%. The maximum differences between the measured and calculated thermal flux values for the simple and intermediate configurations are ∼ 13%, while the average differences are < 8%

  10. Jendl-3.1 iron validation on the PCA-REPLICA (H2O/Fe) shielding benchmark experiment

    International Nuclear Information System (INIS)

    Pescarini, M.; Borgia, M. G.

    1997-03-01

    The PCA-REPLICA (H2O/Fe) neutron shielding benchmark experiment is analysed using the SN 2-D DOT 3.5-E code and the 3-D-equivalent flux synthesis method. This engineering benchmark reproduces the ex-core radial geometry of a PWR, including a mild steel reactor pressure vessel (RPV) simulator, and is designed to test the accuracy of the calculation of the in-vessel neutron exposure parameters. This accuracy is strongly dependent on the quality of the iron neutron cross sections used to describe the nuclear reactions within the RPV simulator. In this report, the cross sections based on the JENDL-3.1 iron data files are tested through a comparison of the calculated integral and spectral results with the corresponding experimental data. In addition, the present results are compared, on the same benchmark experiment, with those of a preceding ENEA-Bologna validation of the ENDF/B-VI iron cross sections. The integral comparison indicates that, for all the threshold detectors considered (Rh-103 (n, n') Rh-103m, In-115 (n, n') In-115m and S-32 (n, p) P-32), the JENDL-3.1 natural iron data produce satisfactory results, similar to those obtained with the ENDF/B-VI iron data. On the contrary, when the JENDL-3.1 Fe-56 data file is used, strongly underestimated results are obtained for the lower-energy threshold detectors, Rh-103 and In-115. This becomes more evident with increasing neutron penetration depth in the RPV simulator

  11. Validation of a δ2Hn-alkane-δ18Ohemicellulose based paleohygrometer: Implications from a climate chamber experiment

    Science.gov (United States)

    Hepp, Johannes; Kathrin Schäfer, Imke; Tuthorn, Mario; Wüthrich, Lorenz; Zech, Jana; Glaser, Bruno; Juchelka, Dieter; Rozanski, Kazimierz; Zech, Roland; Mayr, Christoph; Zech, Michael

    2017-04-01

    Leaf wax-derived biomarkers, e.g. long-chain n-alkanes and fatty acids, and their hydrogen isotopic composition have proved valuable in paleoclimatology/-hydrology research. However, the alteration of the isotopic signal by an often unknown degree of leaf water enrichment challenges a direct reconstruction of the isotopic composition of paleoprecipitation. Coupling 2H/1H results of leaf wax-derived biomarkers with 18O/16O results of hemicellulose-derived sugars has the potential to overcome this limitation and additionally allows reconstructing relative air humidity (RH) (Zech et al., 2013). This approach was recently validated by Tuthorn et al. (2015), who applied it to topsoil samples along a climate transect in Argentina; the biomarker-derived RH values correlate significantly with modern actual RH values from the respective study sites, showing the potential of the established 'paleohygrometer' approach. However, a climate chamber validation study answering open questions regarding this approach, e.g. how robust the biosynthetic fractionation factors are, has been missing. Here we present coupled δ2Hn-alkane-δ18Ohemicellulose results obtained for leaf material from a climate chamber experiment in which Eucalyptus globulus, Vicia faba and Brassica oleracea were grown under controlled conditions (Mayr, 2003). First, the 2H and 18O enrichment of leaf water strongly reflects the actual RH values of the climate chambers. Second, the biomarker-based reconstructed RH values correlate well with the actual RH values of the respective climate chamber, validating the proposed 'paleohygrometer' approach. Third, the calculated fractionation factors between the investigated leaf biomarkers (n-C29 and n-C31 for alkanes; arabinose and xylose for hemicellulose) and leaf water are close to the expected values reported in the literature (+27‰ for hemicellulose; -155‰ for n-alkanes). Nevertheless, minor dependencies of these
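
    The coupled-isotope RH reconstruction can be sketched as follows. The biosynthetic fractionation factors match those quoted above, but the linearized evaporative-enrichment term and all input values are assumptions for illustration, not the study's calibration:

```python
# Simplified sketch of the coupled d2H/d18O 'paleohygrometer'.
# EPS values follow the fractionation factors quoted in the abstract;
# the 202.9 permil kinetic term and the input values are assumptions.
EPS_ALKANE = -155.0   # permil, n-alkane relative to leaf water
EPS_SUGAR  = +27.0    # permil, hemicellulose sugar relative to leaf water

def leaf_water(d2h_alkane, d18o_sugar):
    """Back-correct biomarker values to leaf water isotopic composition."""
    return d2h_alkane - EPS_ALKANE, d18o_sugar - EPS_SUGAR

def rel_humidity(d2h_alkane, d18o_sugar, d_excess_precip=10.0):
    """RH from the deuterium-excess deficit of leaf water vs. precipitation."""
    d2h_lw, d18o_lw = leaf_water(d2h_alkane, d18o_sugar)
    d_lw = d2h_lw - 8.0 * d18o_lw            # leaf water d-excess
    return 1.0 + (d_lw - d_excess_precip) / 202.9   # linearized enrichment term

rh = rel_humidity(-170.0, 32.0)   # hypothetical biomarker values -> RH ~0.68
```

The key idea is that evaporative enrichment lowers the d-excess of leaf water in proportion to the humidity deficit, so the biomarker pair encodes RH.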

  12. Validation of CESAR Thermal-hydraulic Module of ASTEC V1.2 Code on BETHSY Experiments

    Science.gov (United States)

    Tregoures, Nicolas; Bandini, Giacomino; Foucher, Laurent; Fleurot, Joëlle; Meloni, Paride

    The ASTEC V1 system code is being jointly developed by the French Institut de Radioprotection et Sûreté Nucléaire (IRSN) and the German Gesellschaft für Anlagen- und ReaktorSicherheit (GRS) to address severe accident sequences in a nuclear power plant. Thermal-hydraulics in the primary and secondary systems is addressed by the CESAR module. The aim of this paper is to present the validation of the CESAR module, from the ASTEC V1.2 version, on the basis of well instrumented and qualified integral experiments carried out in the BETHSY facility (CEA, France), which simulates a French 900 MWe PWR. Three tests have been thoroughly investigated with CESAR: the loss of coolant 9.1b test (OECD ISP N° 27), the loss of feedwater 5.2e test, and the multiple steam generator tube rupture 4.3b test. In the present paper, the results of the code for the three analyzed tests are presented in comparison with the experimental data. The thermal-hydraulic behavior of the BETHSY facility during the transient phase is well reproduced by CESAR: the occurrence of major events and the time evolution of the main thermal-hydraulic parameters of both primary and secondary circuits are well predicted.

  13. Experiments at the GELINA facility for the validation of the self-indication neutron resonance densitometry technique

    Directory of Open Access Journals (Sweden)

    Rossa Riccardo

    2017-01-01

    Full Text Available Self-Indication Neutron Resonance Densitometry (SINRD) is a passive non-destructive method that is being investigated to quantify the 239Pu content in a spent fuel assembly. The technique relies on the energy dependence of total cross sections for neutron-induced reactions. The cross sections show resonance structures that can be used to quantify the presence of materials in objects; e.g., the total cross section of 239Pu shows a strong resonance close to 0.3 eV. This resonance causes a reduction of the number of neutrons emitted from spent fuel when 239Pu is present. Hence such a reduction can be used to quantify the amount of 239Pu present in the fuel. A neutron detector with a high sensitivity to neutrons in this energy region is used to enhance the sensitivity to 239Pu. This principle is similar to self-indication cross-section measurements. An appropriate detector can be realized by surrounding a 239Pu-loaded fission chamber with suitable neutron-absorbing material. In this contribution, experiments performed at the GELINA time-of-flight facility of the JRC at Geel (Belgium) to validate the simulations are discussed. The results confirm that the strongest sensitivity to the target material was achieved with the self-indication technique, highlighting the importance of using a 239Pu fission chamber for the SINRD measurements.
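The attenuation principle behind SINRD can be illustrated with the standard exponential transmission law: a strong resonance cross section depresses the transmitted neutron flux at the resonance energy. The number density and cross-section values below are illustrative assumptions, not data from the GELINA experiments:

```python
import math

def transmission(sigma_barns, atom_density_cm3, thickness_cm):
    """Fraction of neutrons transmitted through a slab: T = exp(-N * sigma * x)."""
    sigma_cm2 = sigma_barns * 1.0e-24  # 1 barn = 1e-24 cm^2
    return math.exp(-atom_density_cm3 * sigma_cm2 * thickness_cm)

# Assumed 239Pu number density and cross sections (illustrative only): near
# the 0.3 eV resonance the total cross section is far larger than off
# resonance, so the transmitted flux dips sharply when 239Pu is present.
N_PU = 1.0e21  # atoms/cm^3
t_on_resonance = transmission(5000.0, N_PU, 1.0)   # near 0.3 eV
t_off_resonance = transmission(20.0, N_PU, 1.0)    # away from the resonance
```

The on-resonance transmission is much smaller than the off-resonance one, which is the signature a resonance-sensitive detector exploits.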

  14. Experimental validation for combustion analysis of GOTHIC 6.1b code in 2-dimensional premixed combustion experiments

    International Nuclear Information System (INIS)

    Lee, J. Y.; Lee, J. J.; Park, K. C.

    2003-01-01

    In this study, the prediction capability of the GOTHIC code for hydrogen combustion phenomena was validated against the results of a two-dimensional premixed hydrogen combustion experiment performed at Seoul National University. The experimental results confirmed the propagation characteristics of the hydrogen flame, such as the buoyancy effect and the flame front shape. The combustion time of the tests was about 0.1 s. In the analyses, the GOTHIC code predicted the overall hydrogen flame propagation characteristics, but the buoyancy effect and flame shape did not compare well with the experimental results. In particular, when the flame propagated toward the dead-end, GOTHIC predicted that the flame was unaffected by the flow, which caused the predicted flame propagation to differ considerably from the experiment. Moreover, the combustion time in the analyses was about 1 s, ten times longer than in the experiment. To obtain more reasonable analysis results, the combustion model parameters in the GOTHIC code need to be applied appropriately, and the hydrogen flame characteristics need to be reflected in solving the governing equations.

  15. Numerical experiment to estimate the validity of negative ion diagnostic using photo-detachment combined with Langmuir probing

    Energy Technology Data Exchange (ETDEWEB)

    Oudini, N. [Laboratoire des plasmas de décharges, Centre de Développement des Technologies Avancées, Cité du 20 Aout BP 17 Baba Hassen, 16081 Algiers (Algeria); Sirse, N.; Ellingboe, A. R. [Plasma Research Laboratory, School of Physical Sciences and NCPST, Dublin City University, Dublin 9 (Ireland); Benallal, R. [Unité de Recherche Matériaux et Energies Renouvelables, BP 119, Université Abou Bekr Belkaïd, Tlemcen 13000 (Algeria); Taccogna, F. [Istituto di Metodologie Inorganiche e di Plasmi, CNR, via Amendola 122/D, 70126 Bari (Italy); Aanesland, A. [Laboratoire de Physique des Plasmas, (CNRS, Ecole Polytechnique, Sorbonne Universités, UPMC Univ Paris 06, Univ Paris-Sud), École Polytechnique, 91128 Palaiseau Cedex (France); Bendib, A. [Laboratoire d' Electronique Quantique, Faculté de Physique, USTHB, El Alia BP 32, Bab Ezzouar, 16111 Algiers (Algeria)

    2015-07-15

    This paper presents a critical assessment of the theory of the photo-detachment diagnostic method used to probe the negative ion density and electronegativity α = n−/ne. In this method, a laser pulse is used to photo-detach all negative ions located within the electropositive channel (laser spot region). The negative ion density is estimated based on the assumption that the increase of the current collected by an electrostatic probe biased positively with respect to the plasma results solely from the creation of photo-detached electrons. In parallel, the background electron density and temperature are assumed to remain constant during the diagnostic. The numerical experiments performed here show, however, that the background electron density and temperature increase due to the formation of an electrostatic potential barrier around the electropositive channel. The time scale of the potential barrier rise is about 2 ns, which is comparable to the time required to completely photo-detach the negative ions in the electropositive channel (∼3 ns). We find that neglecting the effect of the potential barrier on the background plasma leads to an erroneous determination of the negative ion density. Moreover, the background electron velocity distribution function within the electropositive channel is not Maxwellian, owing to the acceleration of these electrons through the electrostatic potential barrier. In this work, the validity of the photo-detachment diagnostic assumptions is questioned and our results illustrate the weakness of these assumptions.
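The questioned assumption can be stated compactly: the conventional photo-detachment analysis attributes the entire rise of the probe current to photo-detached electrons. A minimal sketch of that naive estimate (the very assumption the paper shows to be problematic), with made-up probe currents, is:

```python
def electronegativity_naive(i_before_amp, i_after_amp):
    """Naive photo-detachment estimate: attribute the entire rise of the
    probe electron current to photo-detached electrons, so that
    alpha = n_minus / n_e = (I_after - I_before) / I_before.
    The paper argues this neglects the potential barrier that forms around
    the laser channel and also raises the background electron density."""
    return (i_after_amp - i_before_amp) / i_before_amp

# Illustrative probe currents before/during the laser pulse (amperes):
alpha = electronegativity_naive(1.0e-3, 1.5e-3)  # -> 0.5
```

Under the paper's findings, part of the current rise comes from the perturbed background electrons, so this estimate overstates n−/ne.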

  16. Validation of the 2012 Fukuoka Consensus Guideline for Intraductal Papillary Mucinous Neoplasm of the Pancreas From a Single Institution Experience.

    Science.gov (United States)

    Yu, Songfeng; Takasu, Naoki; Watanabe, Toshihiro; Fukumoto, Tsuyoshi; Okazaki, Shinji; Tezuka, Koji; Sugawara, Shuichiro; Hirai, Ichiro; Kimura, Wataru

    2017-08-01

    The 2012 Fukuoka consensus guideline stratified the risks of malignant intraductal papillary mucinous neoplasm (IPMN) of the pancreas into "high-risk stigmata" (HRS) and "worrisome features" (WF). This study aimed to evaluate its clinical validity based on a single-institution experience. Eighty-nine patients who underwent surgical resection with a pathological diagnosis of IPMN were retrospectively studied. HRS was significantly correlated with the prevalence of malignant IPMN as compared with WF. The positive predictive values of HRS and WF were 66.7% and 35.7% for branch-duct IPMN and 80% and 38.1% for main-duct IPMN, respectively. Univariate analysis indicated that all the factors in HRS and WF were statistically significant, whereas multivariate analysis revealed that only an enhanced solid component (odds ratio [OR], 50.01; P = 0.008), the presence of a mural nodule (OR, 73.83; P < 0.001) and lymphadenopathy (OR, 20.85; P = 0.03) were independent predictors. Scoring HRS and WF by the number of positive factors resulted in improved predictive value. The area under the curve of the HRS score was significantly lower than that of the WF or HRS + WF score (0.680 vs 0.900 or 0.902, respectively; P < 0.001). As a supplement to the 2012 Fukuoka guideline, we suggest that calculating WF and HRS scores may provide superior diagnostic accuracy in predicting malignant IPMN.
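The scoring idea described above, counting positive HRS and WF factors, can be sketched as follows. The factor lists are paraphrased from the 2012 Fukuoka guideline categories and the naming is illustrative, not the authors' implementation:

```python
# Factor lists paraphrased from the 2012 Fukuoka guideline (illustrative naming):
HRS_FACTORS = ["obstructive_jaundice", "enhanced_solid_component",
               "main_pancreatic_duct_ge_10mm"]
WF_FACTORS = ["cyst_ge_3cm", "thickened_enhanced_cyst_wall", "mural_nodule",
              "main_pancreatic_duct_5_to_9mm", "abrupt_duct_caliber_change",
              "lymphadenopathy"]

def risk_scores(findings):
    """Score HRS and WF by counting positive factors, as the abstract describes.
    `findings` maps factor name -> bool; missing factors count as negative."""
    hrs = sum(1 for f in HRS_FACTORS if findings.get(f, False))
    wf = sum(1 for f in WF_FACTORS if findings.get(f, False))
    return hrs, wf

# A patient positive for the three independent predictors found in the study:
hrs_score, wf_score = risk_scores({"enhanced_solid_component": True,
                                   "mural_nodule": True,
                                   "lymphadenopathy": True})
```

The study's point is that such counts discriminate better than the binary HRS/WF classification alone.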

  17. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    Science.gov (United States)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors (aligned with the EURO-CORDEX experiment) and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including information both data
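The fold construction for Experiment 1 (consecutive 6-year blocks over 1979-2008, each held out once) can be sketched as:

```python
def consecutive_period_folds(start_year, end_year, n_folds):
    """Split an inclusive year range into n_folds consecutive, equally sized
    blocks, mirroring the VALUE Experiment 1 design (five 6-year folds over
    1979-2008)."""
    years = list(range(start_year, end_year + 1))
    size, remainder = divmod(len(years), n_folds)
    if remainder:
        raise ValueError("year range does not divide into equal folds")
    return [years[i * size:(i + 1) * size] for i in range(n_folds)]

folds = consecutive_period_folds(1979, 2008, 5)
# Each 6-year block is held out once for validation while the remaining
# 24 years are used to calibrate the downscaling method.
```

Using consecutive blocks rather than random years preserves the temporal structure (autocorrelation, trends) that downscaling validation needs to respect.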

  18. arena_cove.grd

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC builds and distributes high-resolution, coastal digital elevation models (DEMs) that integrate ocean bathymetry and land topography to support NOAA's mission to...

  19. Validation of a 16-Item Short Form of the Czech Version of the Experiences in Close Relationships Revised Questionnaire in a Representative Sample

    Czech Academy of Sciences Publication Activity Database

    Kaščáková, N.; Husárová, D.; Hašto, J.; Kolarčik, P.; Poláčková Šolcová, Iva; Madarasová Gecková, A.; Tavel, P.

    2016-01-01

    Roč. 119, č. 3 (2016), s. 804-825 ISSN 0033-2941 Institutional support: RVO:68081740 Keywords : Short form of the ECR-R * Experiences in Close Relationships Revised Questionnaire * validation * attachment anxiety * attachment avoidance * attachment styles * representative sample Subject RIV: AN - Psychology Impact factor: 0.629, year: 2016

  20. ENDF/B-VI iron validation on the PCA-REPLICA (H2O/Fe) shielding benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Pescarini, M. [ENEA, Bologna (Italy). Centro Ricerche Energia `E. Clementel` - Area Energia e Innovazione

    1994-05-01

    The PCA-REPLICA (H2O/Fe) neutron shielding benchmark experiment is analysed using the SN 2-D DOT 3.5 code and the 3-D-equivalent flux synthesis method. This engineering benchmark reproduces the ex-core radial geometry of a PWR, including a mild steel reactor pressure vessel (RPV) simulator, and is designed to test the accuracy of the calculation of the in-vessel neutron exposure parameters (fast fluence and iron displacement rates). This accuracy is strongly dependent on the quality of the iron neutron cross sections used to describe the nuclear reactions within the RPV simulator. In particular, in this report, the cross sections based on the ENDF/B-VI iron data files are tested through a comparison of the calculated integral and spectral results with the corresponding experimental data. In addition, the present results are compared, on the same benchmark experiment, with those of a preceding ENEA (Italian Agency for Energy, New Technologies and Environment)-Bologna validation of the JEF-2.1 iron cross sections. The integral result comparison indicates that, for all the threshold detectors considered (Rh-103 (n,n') Rh-103m, In-115 (n,n') In-115m and S-32 (n,p) P-32), the ENDF/B-VI iron data produce better results than the JEF-2.1 iron data. In particular, in the ENDF/B-VI calculations, an improvement of the in-vessel C/E (calculated/experimental) activity ratios for the lower energy threshold detectors, Rh-103 and In-115, is observed. This improvement becomes more evident with increasing neutron penetration depth in the vessel. This is probably attributable to the fact that the inelastic scattering cross section values of the ENDF/B-VI Fe-56 data file, approximately in the 0.86-1.5 MeV energy range, are lower than the corresponding values of the JEF-2.1 data file.
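The C/E comparison used throughout such benchmark validations is a simple ratio of calculated to measured detector activities, with values closer to 1 indicating better cross-section data. The numbers below are made up for illustration, not the report's results:

```python
def c_over_e(calculated, experimental):
    """Calculated-to-experimental (C/E) activity ratios for a set of
    threshold detectors; values closer to 1.0 indicate better agreement."""
    return {det: calculated[det] / experimental[det] for det in calculated}

# Illustrative (made-up) detector activities, arbitrary units:
ratios = c_over_e(
    {"Rh-103(n,n')": 0.95, "In-115(n,n')": 0.97, "S-32(n,p)": 1.02},
    {"Rh-103(n,n')": 1.00, "In-115(n,n')": 1.00, "S-32(n,p)": 1.00},
)
# Detector whose calculation agrees best with experiment:
best = min(ratios, key=lambda det: abs(ratios[det] - 1.0))
```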

  1. Development and validation of a HPLC method for the assay of dapivirine in cell-based and tissue permeability experiments.

    Science.gov (United States)

    das Neves, José; Sarmento, Bruno; Amiji, Mansoor; Bahia, Maria Fernanda

    2012-12-12

    Dapivirine, a non-nucleoside reverse transcriptase inhibitor, is currently being used for the development of potential anti-HIV microbicide formulations and delivery systems. A new high-performance liquid chromatography (HPLC) method with UV detection was developed for the assay of this drug in different biological matrices, namely cell lysates, receptor media from permeability experiments, and homogenates of mucosal tissues. The method used a reversed-phase C18 column with a mobile phase composed of trifluoroacetic acid solution (0.1%, v/v) and acetonitrile in a gradient mode. The injection volume was 50 μL and the flow rate 1 mL/min. The total run time was 12 min, and UV detection was performed at 290 nm for dapivirine and the internal standard (IS) diphenylamine. A Box-Behnken experimental design was used to study different experimental variables of the method, namely the ratio of the mobile phase components and the gradient time, and their influence on responses such as the retention factor, tailing factor, and theoretical plates for dapivirine and the IS, as well as the peak resolution between the two compounds. The optimized method was further validated and its usefulness assessed for in vitro and ex vivo experiments using dapivirine or dapivirine-loaded nanoparticles. The method was shown to be selective, linear, accurate and precise in the range of 0.02-1.5 μg/mL. Other chromatographic parameters, namely carry-over, lower limit of quantification (0.02 μg/mL), limit of detection (0.006 μg/mL), recovery (equal to or higher than 90.7%), and sample stability at different storage conditions, were also determined and found adequate for the intended purposes. The method was successfully used for cell uptake assays and permeability studies across cell monolayers and pig genital mucosal tissues. Overall, the proposed method provides a simple, versatile and reliable way for studying the behavior of dapivirine in different biological matrices and assessing its potential as an anti

  2. Lithogenic and biogenic particle deposition in an Antarctic coastal environment (Marian Cove, King George Island): Seasonal patterns from a sediment trap study

    Science.gov (United States)

    Khim, B. K.; Shim, J.; Yoon, H. I.; Kang, Y. C.; Jang, Y. H.

    2007-06-01

    Particulate suspended material was recovered over a 23-month period using two sediment traps deployed in shallow water (~30 m deep) off the King Sejong Station located in Marian Cove of King George Island, West Antarctica. Variability in the seasonal flux and geochemical characteristics of the sediment particles highlights seasonal patterns of sedimentation of both lithogenic (terrigenous) and biogenic particles in the coastal glaciomarine environment. All components, including total mass flux, lithogenic particle flux and biogenic particle flux, show distinct seasonal variation, with high recovery rates during the summer and low rates under winter fast ice. The major contributor to total mass flux is the lithogenic component, comprising from 88% during the summer months (about 21 g m-2 d-1) up to 97% during the winter season (about 2 g m-2 d-1). The lithogenic particle flux depends mainly on the amount of snow-melt (snow accumulation) delivered into the coastal region as well as on the resuspension of sedimentary materials. These fine-grained lithogenic particles are silt-to-clay sized, composed mostly of clay minerals weathered on King George Island. The biogenic particle flux is also seasonal. The winter flux is ~0.2 g m-2 d-1, whereas the summer contribution increases more than tenfold, up to 2.6 g m-2 d-1. The different biogenic flux between the two summers indicates inter-annual variability in the spring-summer phytoplankton bloom. The maximum of the lithogenic particle flux occurs over a short period of time and follows the peak of the biogenic particle flux, which lasts longer. The seasonal warming and sea-ice retreat result in changes in seawater nutrient status and subsequent ice-edge phytoplankton production. Meanwhile, the meltwater input to Marian Cove from the coastal drainage in January to February plays a major role in transporting lithogenic particles into the shallow water environment, although the tidal currents may be the main agents of resuspension in this

  3. Validation of CTF Droplet Entrainment and Annular/Mist Closure Models using Riso Steam/Water Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    This report summarizes the work done to validate the droplet entrainment and de-entrainment models as well as two-phase closure models in the CTF code by comparison with experimental data obtained at Riso National Laboratory. The Riso data included a series of over 250 steam/water experiments that were performed in both tube and annulus geometries over a range of pressures and outlet qualities. Experimental conditions were set so that the majority of cases were in the annular/mist flow regime. Measurements included liquid film flow rate, droplet flow rate, film thickness, and two-phase pressure drop. CTF was used to model 180 of the tubular geometry cases, matching experimental geometry, outlet pressure, and outlet flow quality to experimental values. CTF results were compared to the experimental data at the outlet of the test section in terms of vapor and entrained liquid flow fractions, pressure drop per unit length, and liquid film thickness. The entire process of generating CTF input decks, running cases, extracting data, and generating comparison plots was scripted using Python and Matplotlib for a completely automated validation process. All test cases and scripting tools have been committed to the COBRA-TF master repository and selected cases have been added to the continuous testing system to serve as regression tests. The differences between the CTF- and experimentally-calculated flow fraction values were consistent with previous calculations by Wurtz, who applied the same entrainment correlation to the same data. It has been found that CTF's entrainment/de-entrainment predictive capability in the annular/mist flow regime for this particular facility is comparable to the licensed industry code, COBRAG. While film and droplet predictions are generally good, it has been found that accuracy is diminished at lower flow qualities. This finding is consistent with the noted deficiencies in the Wurtz entrainment model employed by CTF. The CTF predicted two-phase pressure drop in

  4. Measurements of Humidity in the Atmosphere and Validation Experiments (MOHAVE-2009: overview of campaign operations and results

    Directory of Open Access Journals (Sweden)

    T. Leblanc

    2011-12-01

    Full Text Available The Measurements of Humidity in the Atmosphere and Validation Experiment (MOHAVE 2009) campaign took place on 11–27 October 2009 at the JPL Table Mountain Facility (TMF) in California. The main objectives of the campaign were to (1) validate the water vapor measurements of several instruments, including three Raman lidars, two microwave radiometers, two Fourier-transform spectrometers, and two GPS receivers (column water), (2) cover water vapor measurements from the ground to the mesopause without gaps, and (3) study upper-tropospheric humidity variability at timescales varying from a few minutes to several days.

    A total of 58 radiosondes and 20 frost-point hygrometer sondes were launched. Two types of radiosondes were used during the campaign. Non-negligible differences in the readings between the two radiosonde types used (Vaisala RS92 and InterMet iMet-1) had a small but measurable impact on the derivation of water vapor mixing ratio by the frost-point hygrometers. As observed in previous campaigns, the RS92 humidity measurements remained within 5% of the frost-point in the lower and mid-troposphere, but were too dry in the upper troposphere.

    Over 270 h of water vapor measurements from three Raman lidars (JPL and GSFC) were compared to the RS92, CFH, and NOAA-FPH. The JPL lidar profiles reached 20 km when integrated all night, and 15 km when integrated for 1 h. Excellent agreement between this lidar and the frost-point hygrometers was found throughout the measurement range, with only a 3% (0.3 ppmv) mean wet bias for the lidar in the upper troposphere and lower stratosphere (UTLS). The other two lidars provided satisfactory results in the lower and mid-troposphere (2–5% wet bias over the range 3–10 km), but suffered from contamination by fluorescence (wet bias ranging from 5 to 50% between 10 km and 15 km), preventing their use as an independent measurement in the UTLS.

    The comparison between all available stratospheric

  5. EXCALIBUR-at-CALIBAN: a neutron transmission experiment for {sup 238}U(n,n'{sub continuum}γ) nuclear data validation

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, David; Leconte, Pierre; Destouches, Christophe [CEA, DEN, DER, SPRC et SPEX, Cadarache F-13108 SAINT-PAUL-LEZ-DURANCE (France); Casoli, Pierre; Chambru, Laurent; Chanussot, Didier; Chateauvieux, Herve; Gevrey, Gaetan; Guilbert, Frederique; Lereuil, Hugues; Rousseau, Guillaume; Schaub, Muriel [CEA, DAM, Valduc F-21120 IS-SUR-TILLE (France); Heusch, Murielle; Meplan, Olivier; Ramdhane, Mourad [CNRS/IN2P3, 53 rue des Martyrs, F-38026 Grenoble, Cedex (France)

    2015-07-01

    Two recent papers justified a new experimental program to provide a new basis for the validation of 238U nuclear data, namely neutron-induced inelastic scattering, and of transport codes at fission neutron energies. The general idea is to perform a neutron transmission experiment through natural uranium material. As shown by Hans Bethe, neutron transmissions measured by dosimetric responses are linked to inelastic cross sections. This paper describes the principle and the results of such an experiment, called EXCALIBUR, performed recently (January and October 2014) at the CALIBAN reactor facility. (authors)
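Bethe's relation mentioned above can be illustrated by inverting the exponential transmission law for an effective removal cross section, which for threshold dosimeter responses is dominated by inelastic scattering. The numbers below are rough assumptions for a natural uranium slab, not EXCALIBUR results:

```python
import math

def removal_cross_section_barns(transmission, atom_density_cm3, thickness_cm):
    """Invert the transmission law T = exp(-N * sigma * x) for the effective
    removal cross section sigma, returned in barns. For threshold dosimeter
    responses, sigma is dominated by inelastic scattering, which is the link
    the transmission experiment exploits."""
    sigma_cm2 = -math.log(transmission) / (atom_density_cm3 * thickness_cm)
    return sigma_cm2 / 1.0e-24  # 1 barn = 1e-24 cm^2

# Rough assumed numbers for a natural uranium slab (not measured values):
N_U = 4.8e22  # atoms/cm^3
sigma_b = removal_cross_section_barns(0.5, N_U, 5.0)
```

A measured transmission of 0.5 through 5 cm of uranium at this density would correspond to an effective removal cross section of a few barns, which is the order of magnitude expected for fast neutrons.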

  6. Prospective Validation of the Decalogue, a Set of Doctor-Patient Communication Recommendations to Improve Patient Illness Experience and Mood States within a Hospital Cardiologic Ambulatory Setting

    Directory of Open Access Journals (Sweden)

    Piercarlo Ballo

    2017-01-01

    Full Text Available Strategies to improve doctor-patient communication may have a beneficial impact on a patient's illness experience and mood, with potential favorable clinical effects. We prospectively tested the psychometric and clinical validity of the Decalogue, a tool utilizing 10 communication recommendations for patients and physicians. The Decalogue was administered to 100 consecutive patients referred for a cardiologic consultation, whereas 49 patients served as controls. The POMS-2 questionnaire was used to measure the total mood disturbance at the end of the consultation. Structural equation modeling showed high internal consistency (Cronbach alpha 0.93), good test-retest reproducibility, and high validity of the psychometric construct (all coefficients > 0.80), suggesting a positive effect on patients' illness experience. The total mood disturbance was lower in the patients exposed to the Decalogue than in the controls (1.4±12.1 versus 14.8±27.6, p=0.0010). In an additional questionnaire, patients in the Decalogue group showed a trend towards a better understanding of their state of health (p=0.07). In a cardiologic ambulatory setting, the Decalogue shows good validity and reliability as a tool to improve patients' illness experience and could have a favorable impact on mood states. These effects might potentially improve patient engagement in care and adherence to therapy, as well as clinical outcome.
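As a reference for the reported internal consistency (Cronbach alpha 0.93), a minimal implementation of the statistic is sketched below; the item scores are made up, not data from the study:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency. `item_scores` is a list of
    per-respondent rows, one score per questionnaire item:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(item_scores[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in item_scores]) for i in range(k)]
    total_var = var([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Made-up responses from four respondents to three items:
alpha = cronbach_alpha([[3, 4, 3], [4, 5, 4], [2, 3, 2], [5, 5, 5]])
```

Strongly correlated items drive alpha toward 1, which is why a value of 0.93 indicates high internal consistency.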

  7. Validation of the BERT Point Source Inversion Scheme Using the Joint Urban 2003 Tracer Experiment Dataset - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Brambilla, Sara [Los Alamos National Laboratory; Brown, Michael J. [Los Alamos National Laboratory

    2012-06-18

    zones. Due to a unique source inversion technique - called the upwind collector footprint approach - the tool runs fast and the source regions can be determined in a few minutes. In this report, we provide an overview of the BERT framework, followed by a description of the source inversion technique. The Joint URBAN 2003 field experiment held in Oklahoma City that was used to validate BERT is then described. Subsequent sections describe the metrics used for evaluation, the comparison of the experimental data and BERT output, and under what conditions the BERT tool succeeds and performs poorly. Results are aggregated in different ways (e.g., daytime vs. nighttime releases, 1 vs. 2 vs. 3 hit collectors) to determine if BERT shows any systematic errors. Finally, recommendations are given for how to improve the code and procedures for optimizing performance in operational mode.

  8. The Autism Family Experience Questionnaire (AFEQ): An Ecologically-Valid, Parent-Nominated Measure of Family Experience, Quality of Life and Prioritised Outcomes for Early Intervention

    Science.gov (United States)

    Leadbitter, Kathy; Aldred, Catherine; McConachie, Helen; Le Couteur, Ann; Kapadia, Dharmi; Charman, Tony; Macdonald, Wendy; Salomone, Erica; Emsley, Richard; Green, Jonathan; Barrett, Barbara; Barron, Sam; Beggs, Karen; Blazey, Laura; Bourne, Katy; Byford, Sarah; Cole-Fletcher, Rachel; Collino, Julia; Colmer, Ruth; Cutress, Anna; Gammer, Isobel; Harrop, Clare; Houghton, Tori; Howlin, Pat; Hudry, Kristelle; Leach, Sue; Maxwell, Jessica; Parr, Jeremy; Pickles, Andrew; Randles, Sarah; Slonims, Vicky; Taylor, Carol; Temple, Kathryn; Tobin, Hannah; Vamvakas, George; White, Lydia

    2018-01-01

    There is a lack of measures that reflect the intervention priorities of parents of children with autism spectrum disorder (ASD) and that assess the impact of interventions on family experience and quality of life. The Autism Family Experience Questionnaire (AFEQ) was developed through focus groups and online consultation with parents, and…

  9. Validation study of the reactor physics lattice transport code WIMSD-5B by TRX and BAPL critical experiments of light water reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Alam, A.B.M.K.; Ahsan, M.H.; Mamun, K.A.A.; Islam, S.M.A.

    2015-01-01

    Highlights: • To validate the reactor physics lattice code WIMSD-5B by this analysis. • To model TRX and BAPL critical experiments using WIMSD-5B. • To compare the calculated results with experiment and MCNP results. • To rely on WIMSD-5B code for TRIGA calculations. - Abstract: The aim of this analysis is to validate the reactor physics lattice transport code WIMSD-5B against the TRX (thermal reactor, one-region lattice) and BAPL (Bettis Atomic Power Laboratory, one-region lattice) critical experiments of light water reactors, for neutronics analysis of the 3 MW TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh. This is achieved through the analysis of integral parameters of five light water reactor critical experiments (TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3) based on the evaluated nuclear data libraries JEFF-3.1 and ENDF/B-VII.1. In integral measurements, these experiments are considered standard benchmark lattices for validating the reactor physics lattice transport code WIMSD-5B as well as evaluated nuclear data libraries. The integral parameters of these critical experiments were calculated using WIMSD-5B and compared to the measured values as well as to the earlier published MCNP results based on the Chinese evaluated nuclear data library CENDL-3.0, for assessment of the deterministic calculation. The calculated integral parameters were found to be mostly reasonable and globally consistent with the experiment and the MCNP results. In addition, the group constants in WIMS format for the isotopes U-235 and U-238 from the two data files were compared using the WIMS library utility code WILLIE and found to be in good agreement with each other. Therefore, this analysis provides a validation of the reactor physics lattice transport code WIMSD-5B based on the JEFF-3.1 and ENDF/B-VII.1 libraries and can also be essential to

  10. The role of CFD combustion modeling in hydrogen safety management – V: Validation for slow deflagrations in homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, Tadej, E-mail: tadej.holler@ijs.si [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Kljenak, Ivo [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, Ed [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2016-12-15

    Highlights: • Validation of the modeling approach for hydrogen deflagration is presented. • The modeling approach is based on two combustion models implemented in ANSYS Fluent. • Experiments with various initial hydrogen concentrations were used for validation. • The effects of heat transfer mechanism selection were also investigated. • A grid sensitivity analysis was performed as well. - Abstract: The control of hydrogen in the containment is an important safety issue following rapid oxidation of the uncovered reactor core during a severe accident in a Nuclear Power Plant (NPP), because dynamic pressure loads from eventual hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In our previous papers, a CFD-based method to assess the consequences of fast combustion of uniform hydrogen-air mixtures was presented, followed by its validation for hydrogen-air mixtures with diluents and for non-uniform hydrogen-air mixtures. In the present paper, the extension of this model to the slow deflagration regime is presented and validated using the hydrogen deflagration experiments performed in the medium-scale experimental facility THAI. The proposed method is implemented in the CFD software ANSYS Fluent using user-defined functions. The paper describes the combustion model and the main results of the code validation. It addresses questions regarding turbulence model selection, the effect of heat transfer mechanisms, and grid sensitivity, and provides insights into the importance of combustion model choice for the slow deflagration regime of hydrogen combustion in medium-scale and large-scale experimental vessels mimicking the NPP containment.

  11. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave Boundary-Layer Interaction

    Science.gov (United States)

    Davis, David O.

    2015-01-01

    Preliminary results of an experimental investigation of a Mach 2.5 two-dimensional axisymmetric shock-wave/boundary-layer interaction (SWBLI) are presented. The purpose of the investigation is to create a SWBLI dataset specifically for CFD validation purposes. Presented herein are the details of the facility and preliminary measurements characterizing the facility and interaction region. The results will serve to define the region of interest where more detailed mean and turbulence measurements will be made.

  12. Assessing movement quality in persons with severe mental illness - Reliability and validity of the Body Awareness Scale Movement Quality and Experience.

    Science.gov (United States)

    Hedlund, Lena; Gyllensten, Amanda Lundvik; Waldegren, Tomas; Hansson, Lars

    2016-05-01

    Motor disturbances and disturbed self-recognition are common features that affect mobility in persons with schizophrenia spectrum disorder and bipolar disorder. Physiotherapists in Scandinavia assess and treat movement difficulties in persons with severe mental illness. The Body Awareness Scale Movement Quality and Experience (BAS MQ-E) is a new, shortened version of the commonly used Body Awareness Scale-Health (BAS-H). The purpose of this study was to investigate the inter-rater reliability and concurrent validity of the BAS MQ-E in persons with severe mental illness. Concurrent validity was examined by investigating the relationships with neurological soft signs, alexithymia, fatigue, anxiety, and mastery. Sixty-two persons with severe mental illness participated in the study. The results showed satisfactory inter-rater reliability (n=53) and concurrent validity (n=62) with neurological soft signs, especially cognitively and perceptually based signs. There was also concurrent validity with physical fatigue and aspects of alexithymia. BAS MQ-E scores were in general higher for persons with schizophrenia than for persons with other diagnoses within the schizophrenia spectrum disorders and bipolar disorder. Clinical implications are presented in the discussion.

  13. Psychological and interactional characteristics of patients with somatoform disorders: Validation of the Somatic Symptoms Experiences Questionnaire (SSEQ) in a clinical psychosomatic population.

    Science.gov (United States)

    Herzog, Annabel; Voigt, Katharina; Meyer, Björn; Wollburg, Eileen; Weinmann, Nina; Langs, Gernot; Löwe, Bernd

    2015-06-01

    The new DSM-5 Somatic Symptom Disorder (SSD) emphasizes the importance of psychological processes related to somatic symptoms in patients with somatoform disorders. To address this, the Somatic Symptoms Experiences Questionnaire (SSEQ), the first self-report scale that assesses a broad range of psychological and interactional characteristics relevant to patients with a somatoform disorder or SSD, was developed. This prospective study was conducted to validate the SSEQ. The 15-item SSEQ was administered along with a battery of self-report questionnaires to psychosomatic inpatients. Patients were assessed with the Structured Clinical Interview for DSM-IV to confirm a somatoform, depressive, or anxiety disorder. Confirmatory factor analyses, tests of internal consistency, and tests of validity were performed. Patients (n=262) with a mean age of 43.4 years, 60.3% women, were included in the analyses. The previously observed four-factor model was replicated and internal consistency was good (Cronbach's α=.90). Patients with a somatoform disorder had significantly higher scores on the SSEQ (t=4.24, p…) … quality of life. Sensitivity to change was shown by significantly higher effect sizes of the SSEQ change scores for improved patients than for patients without improvement. The SSEQ appears to be a reliable, valid, and efficient instrument to assess a broad range of psychological and interactional features related to the experience of somatic symptoms. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Validation philosophy

    International Nuclear Information System (INIS)

    Vornehm, D.

    1994-01-01

    To determine when a set of calculations falls within the umbrella of existing validation documentation, it is necessary to generate a quantitative definition of the range of applicability (our definition is only qualitative), for two reasons: (1) the current trend in our regulatory environment will soon make it impossible to support the legitimacy of a validation without quantitative guidelines; and (2) in my opinion, the lack of support by DOE for further critical experiment work is directly tied to our inability to draw a quantitative "line in the sand" beyond which we will not use computer-generated values.

  15. Measuring the Pros and Cons of What It Means to Be a Black Man: Development and Validation of the Black Men's Experiences Scale (BMES).

    Science.gov (United States)

    Bowleg, Lisa; English, Devin; Del Rio-Gonzalez, Ana Maria; Burkholder, Gary J; Teti, Michelle; Tschann, Jeanne M

    2016-04-01

    Although extensive research documents that Black people in the U.S. frequently experience social discrimination, most of this research aggregates these experiences primarily or exclusively by race. Consequently, empirical gaps exist about the psychosocial costs and benefits of Black men's experiences at the intersection of race and gender. Informed by intersectionality, a theoretical framework that highlights how multiple social identities intersect to reflect interlocking social-structural inequality, this study addresses these gaps with the qualitative development and quantitative test of the Black Men's Experiences Scale (BMES). The BMES assesses Black men's negative experiences with overt discrimination and microaggressions, as well as their positive evaluations of what it means to be Black men. First, we conducted focus groups and individual interviews with Black men to develop the BMES. Next, we tested the BMES with 578 predominantly low-income urban Black men between the ages of 18 and 44. Exploratory factor analysis suggested a 12-item, 3-factor solution that explained 63.7% of the variance. We labeled the subscales: Overt Discrimination, Microaggressions, and Positives: Black Men. Confirmatory factor analysis supported the three-factor solution. As hypothesized, the BMES's subscales correlated with measures of racial discrimination, depression, resilience, and social class at the neighborhood level. Preliminary evidence suggests that the BMES is a reliable and valid measure of Black men's experiences at the intersection of race and gender.

  16. Preliminary validation and principal components analysis of the Control of Eating Questionnaire (CoEQ) for the experience of food craving.

    Science.gov (United States)

    Dalton, M; Finlayson, G; Hill, A; Blundell, J

    2015-12-01

    The Control of Eating Questionnaire (CoEQ) comprises 21 items designed to assess the severity and type of food cravings an individual has experienced over the previous 7 days. The CoEQ has been used in clinical trials as a multi-dimensional measure of appetite, craving and mood regulation; however, its underlying component structure had yet to be determined. The current paper has two aims: (1) to examine the psychometric properties and internal consistency of the CoEQ; and (2) to provide a preliminary examination of the underlying components by exploring their construct and predictive validity. Data were pooled from four studies in which a total of 215 adults (80% women; Age=29.7 ± 10.3; BMI=26.5 ± 5.2) had completed the CoEQ alongside measures of psychometric eating behaviour traits, ad libitum food intake, and body composition. A principal components analysis (PCA) with parallel analysis was conducted to examine the underlying structure of the questionnaire. The resulting subscales were tested for internal consistency (Cronbach's α=0.66-0.88). PCA revealed four components that explained 54.5% of the variance, identified as: Craving Control, Positive Mood, Craving for Sweet, and Craving for Savoury. Associations between the CoEQ subscales and measures of body composition and eating behaviour traits confirmed the construct validity of the subscales, while associations between the subscales and snack food selection and intake of palatable snack foods supported the CoEQ's predictive validity. The CoEQ has good psychometric properties, a clear component structure and acceptable internal consistency. This preliminary validation supports the CoEQ as a measure of the experience of food cravings.
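    The component-retention step named in this record, PCA with parallel analysis, can be sketched in a few lines. The example below uses synthetic data standing in for the CoEQ responses; the item and subject counts echo the questionnaire, but the loadings, noise level, and seed are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in data: 215 subjects answering 21 items generated
    # from 4 latent components plus noise (the real CoEQ data are not used).
    n_subj, n_items, n_latent = 215, 21, 4
    loadings = rng.normal(size=(n_items, n_latent))
    scores = rng.normal(size=(n_subj, n_latent))
    X = scores @ loadings.T + rng.normal(size=(n_subj, n_items))

    def eigvals_of_corr(data):
        """Eigenvalues of the item correlation matrix, largest first."""
        corr = np.corrcoef(data, rowvar=False)
        return np.sort(np.linalg.eigvalsh(corr))[::-1]

    obs = eigvals_of_corr(X)

    # Horn's parallel analysis: retain components whose eigenvalue exceeds
    # the 95th percentile of eigenvalues from same-shaped random data.
    n_rep = 200
    rand = np.array([eigvals_of_corr(rng.normal(size=X.shape)) for _ in range(n_rep)])
    threshold = np.percentile(rand, 95, axis=0)
    n_components = int(np.sum(obs > threshold))  # count exceeding their random counterpart
    print("components retained:", n_components)
    ```

    With a clear four-component structure as generated here, the retained count lands near four; on real questionnaire data the threshold curve guards against over-extraction from sampling noise.
    
    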

  17. Measuring Black Men’s Police-Based Discrimination Experiences: Development and Validation of the Police and Law Enforcement (PLE) Scale

    Science.gov (United States)

    English, Devin; Bowleg, Lisa; del Río-González, Ana Maria; Tschann, Jeanne M.; Agans, Robert; Malebranche, David J

    2017-01-01

    Objectives Although social science research has examined police- and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men's perspectives on police and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative test of the Police and Law Enforcement (PLE) scale. Methods In Study 1, we employed thematic analysis of transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used two focus groups of 5 Black men each (n=10), intensive cognitive interviewing with a separate sample of Black men (n=15), and piloting with another sample of Black men (n=13) to assess the ecological validity of the quantitative items. In Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Results Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents' experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Conclusions Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men's experiences of discrimination with police/law enforcement. PMID:28080104

  18. Validating the Patient Experience with Treatment and Self-Management (PETS), a patient-reported measure of treatment burden, in people with diabetes.

    Science.gov (United States)

    Rogers, Elizabeth A; Yost, Kathleen J; Rosedahl, Jordan K; Linzer, Mark; Boehm, Deborah H; Thakur, Azra; Poplau, Sara; Anderson, Roger T; Eton, David T

    2017-01-01

    To validate a comprehensive general measure of treatment burden, the Patient Experience with Treatment and Self-Management (PETS), in people with diabetes, we conducted a secondary analysis of a cross-sectional survey study of 120 people diagnosed with type 1 or type 2 diabetes and at least one additional chronic illness. Surveys included established patient-reported outcome measures and a 48-item version of the PETS, a new measure comprising multi-item scales that assess the burden of chronic illness treatment and self-care across nine domains: medical information, medications, medical appointments, monitoring health, interpersonal challenges, health care expenses, difficulty with health care services, role activity limitations, and physical/mental exhaustion from self-management. Internal reliability of the PETS scales was determined using Cronbach's alpha. Construct validity was determined through correlation of PETS scores with established measures (chronic condition distress, medication satisfaction, self-efficacy, and global well-being), and known-groups validity through comparisons of PETS scores across clinically distinct groups. In an exploratory test of predictive validity, step-wise regressions were used to determine which PETS scales were most associated with outcomes of chronic condition distress, overall physical and mental health, and medication adherence. Respondents were 37-88 years old, 59% female, 29% non-white, and 67% college-educated. PETS scales showed good reliability (Cronbach's alphas ≥0.74). Higher PETS scale scores (greater treatment burden) were correlated with more chronic condition distress, less medication convenience, lower self-efficacy, and worse general physical and mental health. Participants less (versus more) adherent to medications and those with more (versus fewer) health care financial difficulties had higher mean PETS scores. Medication burden was the scale most consistently associated with…
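    Cronbach's alpha, the internal-consistency statistic reported for the PETS scales above, is simple to compute from an item-score matrix. A minimal sketch with made-up ratings (not PETS data):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical example: 6 respondents rating a 4-item scale on 1-5.
    scores = [[3, 4, 3, 4],
              [2, 2, 3, 2],
              [5, 5, 4, 5],
              [4, 4, 4, 5],
              [1, 2, 1, 2],
              [3, 3, 2, 3]]
    print(round(cronbach_alpha(scores), 2))  # → 0.96
    ```

    Highly consistent ratings, as in this toy matrix, drive alpha toward 1; the ≥0.74 values reported in the record indicate acceptable-to-good scale reliability.
    
    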

  19. The role of CFD combustion modeling in hydrogen safety management – IV: Validation based on non-homogeneous hydrogen–air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: pratap.sathiah78@gmail.com [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Komen, Ed, E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Roekaerts, Dirk, E-mail: d.j.e.m.roekaerts@tudelft.nl [Delft University of Technology, Department of Process and Energy, Section Fluid Mechanics, Mekelweg 2, 2628 CD Delft (Netherlands)

    2016-12-15

    Highlights: • The TFC combustion model is further extended to simulate flame propagation in non-homogeneous hydrogen–air mixtures. • TFC combustion model results are in good agreement with large-scale non-homogeneous hydrogen–air experiments. • The model is further extended to account for the effect of PARs on hydrogen deflagration in non-uniform hydrogen–air–steam mixtures. - Abstract: The control of hydrogen in the containment is an important safety issue in NPPs during a loss of coolant accident, because the dynamic pressure loads from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In Sathiah et al. (2012b), we presented a computational fluid dynamics based method to assess the consequences of the combustion of uniform hydrogen–air mixtures. In the present article, the extension of this method to, and its validation for, non-uniform hydrogen–air mixtures is described. The method is implemented in the CFD software ANSYS FLUENT using user-defined functions. The extended code is validated against non-uniform hydrogen–air experiments in the ENACCEF facility. It is concluded that the maximum pressure and the intermediate peak pressure were predicted within 12% and 18% accuracy, respectively. The eigenfrequencies of the residual pressure wave phenomena were predicted within 4%. Overall, it is concluded that the current model predicts the considered ENACCEF experiments well.

  20. A computational method for computing an Alzheimer’s Disease Progression Score; experiments and validation with the ADNI dataset

    Science.gov (United States)

    Jedynak, Bruno M.; Liu, Bo; Lang, Andrew; Gel, Yulia; Prince, Jerry L.

    2014-01-01

    Understanding the time-dependent changes of biomarkers related to Alzheimer’s disease (AD) is key to assessing disease progression and to measuring the outcomes of disease-modifying therapies. In this paper, we validate an Alzheimer’s disease progression score model which uses multiple biomarkers to quantify the AD progression of subjects under three assumptions: (1) there is a unique disease progression for all subjects, (2) each subject has a different age of onset and rate of progression, and (3) each biomarker is sigmoidal as a function of disease progression. Fitting the parameters of this model is a challenging problem which we approach using an alternating least squares optimization algorithm. In order to validate this optimization scheme under realistic conditions, we use the Alzheimer’s Disease Neuroimaging Initiative (ADNI) cohort. With the help of Monte Carlo simulations, we show that most of the global parameters of the model are tightly estimated, enabling an ordering of the biomarkers by how well they fit the model: the Rey auditory verbal learning test with 30 minutes delay; the sum of the two lateral hippocampal volumes divided by the intra-cranial volume; followed by the clinical dementia rating sum of boxes score and the mini mental state examination score, in no particular order; and lastly the Alzheimer’s disease assessment scale-cognitive subscale. PMID:25444605
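    The alternating least squares idea described above can be sketched on synthetic data: with the latent progression scores fixed, fit each biomarker's sigmoid; with the sigmoids fixed, re-fit each subject's score. The grid-search version below is a deliberate simplification for illustration (single-visit scores rather than the paper's per-subject age-of-onset and rate parameters, and made-up data rather than ADNI):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_subj, n_bio = 40, 3

    # Synthetic ground truth: latent score per subject, sigmoid slope c and
    # inflection d per biomarker (all values hypothetical).
    s_true = rng.uniform(-3, 3, n_subj)
    c_true = rng.uniform(0.8, 1.5, n_bio)
    d_true = rng.uniform(-1, 1, n_bio)

    def sig(s, c, d):
        return 1.0 / (1.0 + np.exp(-c * (s - d)))

    y = sig(s_true[:, None], c_true, d_true) + rng.normal(scale=0.02, size=(n_subj, n_bio))

    c_grid = np.linspace(0.3, 2.0, 35)
    d_grid = np.linspace(-2.5, 2.5, 51)
    s_grid = np.linspace(-4.0, 4.0, 161)

    s = np.zeros(n_subj)
    c, d = np.full(n_bio, 1.0), np.zeros(n_bio)
    for _ in range(20):
        # Step 1: scores fixed, grid-fit each biomarker's sigmoid (c_k, d_k).
        for k in range(n_bio):
            pred = sig(s[:, None, None], c_grid[:, None], d_grid)   # (subj, c, d)
            err = ((pred - y[:, k, None, None]) ** 2).sum(axis=0)
            ic, jd = np.unravel_index(err.argmin(), err.shape)
            c[k], d[k] = c_grid[ic], d_grid[jd]
        # Step 2: sigmoids fixed, grid-fit each subject's score s_i.
        pred = sig(s_grid[:, None], c, d)                           # (s, bio)
        err = ((pred[None] - y[:, None, :]) ** 2).sum(axis=2)       # (subj, s)
        s = s_grid[err.argmin(axis=1)]

    rmse = float(np.sqrt(np.mean((sig(s[:, None], c, d) - y) ** 2)))
    print("fit RMSE:", round(rmse, 3))
    ```

    Each half-step is an exact minimization over its grid, so the squared-error objective never increases; the residual settles near the injected noise level. The real model replaces the grids with continuous least squares.
    
    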

  1. The role of CFD combustion modeling in hydrogen safety management – III: Validation based on homogeneous hydrogen–air–diluent experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: pratap.sathiah78@gmail.com [Shell Global Solutions Ltd., Brabazon House, Concord Business Park, Threapwood Road, Manchester M220RR (United Kingdom); Komen, Ed [Nuclear Research and Consultancy Group – NRG, P.O. Box 25, 1755 ZG Petten (Netherlands); Roekaerts, Dirk [Delft University of Technology, P.O. Box 5, 2600 AA Delft (Netherlands)

    2015-08-15

    Highlights: • A CFD based method proposed in the previous article is used to simulate the effect of CO{sub 2}–He dilution on hydrogen deflagration. • A theoretical study is presented to verify whether a CO{sub 2}–He diluent can be used as a replacement for H{sub 2}O diluent. • The CFD model used for the validation work is described. • TFC combustion model results are in good agreement with large-scale homogeneous hydrogen–air–CO{sub 2}–He experiments. - Abstract: Large quantities of hydrogen can be generated and released into the containment during a severe accident in a PWR. The generated hydrogen, when mixed with air, can lead to hydrogen combustion. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. Therefore, accurate prediction of these pressure loads is an important safety issue. In our previous article, a CFD based method to determine these pressure loads was presented. This CFD method is based on the application of a turbulent flame speed closure combustion model. The method was validated against three uniform hydrogen–air deflagration experiments with different blockage ratios performed in the ENACCEF facility. It was concluded that the maximum pressures were predicted within 13% accuracy, while the rate of pressure rise dp/dt was predicted within about 30%. The eigenfrequencies of the residual pressure wave phenomena were predicted within a few percent. In the present article, we perform additional validation of the CFD based method against three uniform hydrogen–air–CO{sub 2}–He deflagration experiments with three different concentrations of the CO{sub 2}–He diluent. The trends of decrease in the flame velocity, the intermediate peak pressure, the rate of pressure rise dp/dt, and the maximum value of the mean pressure with an increase in CO{sub 2}–He dilution are captured well in the simulations. From the…

  2. NASA's Rodent Research Project: Validation of Flight Hardware, Operations and Science Capabilities for Conducting Long Duration Experiments in Space

    Science.gov (United States)

    Choi, S. Y.; Beegle, J. E.; Wigley, C. L.; Pletcher, D.; Globus, R. K.

    2015-01-01

    Research using rodents is an essential tool for advancing biomedical research on Earth and in space. Rodent Research (RR)-1 was conducted to validate flight hardware, operations, and science capabilities that were developed at the NASA Ames Research Center. Twenty C57BL/6J adult female mice were launched on Sept 21, 2014 in a Dragon capsule (SpaceX-4), then transferred to the ISS for a total time of 21-22 days (10 commercial mice) or 37 days (10 validation mice). Tissues collected on-orbit were either rapidly frozen or preserved in RNAlater at less than or equal to -80 C (n=2/group) until their return to Earth. Remaining carcasses were rapidly frozen for dissection post-flight. The three control groups at Kennedy Space Center consisted of: Basal mice, euthanized at the time of launch; Vivarium controls, housed in standard cages; and Ground Controls (GC), housed in flight hardware within an environmental chamber. Flight (FLT) mice appeared more physically active on-orbit than GC, and behavioral analyses are in progress. Upon return to Earth, there were no differences in body weights between FLT and GC at the end of the 37 days in space. RNA was of high quality (RIN greater than 8.5). Liver enzyme activity levels of FLT mice and all control mice were similar in magnitude to those of samples that were optimally processed in the laboratory. Liver samples collected from the intact frozen FLT carcasses had an RNA RIN of 7.27 +/- 0.52, which was lower than that of the samples processed on-orbit, but similar to those obtained from the control group intact carcasses. Nonetheless, the RNA samples from the intact carcasses were acceptable for the most demanding transcriptomic analyses. Adrenal glands, thymus and spleen (organs associated with the stress response) showed no significant difference in weights between FLT and GC. Enzymatic activity was also not significantly different. Over 3,000 tissues collected from the four groups of mice have become available for the Biospecimen Sharing…

  3. Validation of the MC{sup 2}-3/DIF3D Code System for Control Rod Worth via the BFS-75-1 Reactor Physics Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Sunghwan; Kim, Sang Ji [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this paper, control rod worths from the BFS-75-1 reactor physics experiments were examined using continuous-energy MCNP models and deterministic MC2-3/DIF3D models based on the ENDF/B-VII.0 library. We can conclude that the ENDF/B-VII.0 library shows very good agreement for a small metal-uranium-fueled core surrounded by a depleted-uranium blanket. However, the control rod heterogeneity effect reported in the reference is not significant in this problem, because the tested control rod models were configured as single rods. Hence, comparison with other control rod worth measurement data, such as the BFS-109-2A reactor physics experiment, is planned as a future study. The BFS-75-1 critical experiment was carried out in the BFS-1 facility of IPPE in Russia within the framework of validating an early phase of the KALIMER-150 design. A Monte Carlo model of the BFS-75-1 critical experiment had been developed; however, due to incomplete information on the BFS-75-1 experiments, Monte Carlo models had been generated for the reference criticality and sodium void reactivity measurements with a disk-wise homogeneous model. Recently, KAERI performed another physics experiment, BFS-109-2A, in collaboration with the Russian IPPE. During the review of the experimental report of the BFS-109-2A critical experiments, valuable information on the BFS-1 facility that can also be used for the BFS-75-1 experiments was discovered.

  4. A novel enterovirus and parechovirus multiplex one-step real-time PCR-validation and clinical experience

    DEFF Research Database (Denmark)

    Nielsen, A. C. Y.; Bottiger, B.; Midgley, S. E.

    2013-01-01

    As the number of new enteroviruses and human parechoviruses seems ever growing, updated diagnostics are needed. We have updated an enterovirus assay and combined it with a previously published assay for human parechovirus, resulting in a multiplex one-step RT-PCR assay. The multiplex assay was validated by analysing its sensitivity and specificity compared to the respective monoplex assays, and good concordance was found. Furthermore, the enterovirus assay was able to detect 42 reference strains from all 4 species, and an additional 9 genotypes during panel testing and routine usage. During 15 months of routine use, from October 2008 to December 2009, we received and analysed 2187 samples (stool samples, cerebrospinal fluids, blood samples, respiratory samples and autopsy samples) from 1546 patients, and detected enteroviruses and parechoviruses…

  5. Prediction and validation of burnout curves for Goettelborn char using reaction kinetics determined in shock tube experiments

    Energy Technology Data Exchange (ETDEWEB)

    Moors, J.H.J.; Banin, V.E.; Haas, J.H.P.; Weber, R.; Veefkind, A. [Eindhoven University of Technology, Eindhoven (Netherlands). Dept. of Applied Physics

    1999-01-01

    Using a shock tube facility, the combustion characteristics of pulverised char (<10 µm) were measured. A prediction was made for the burnout behaviour of a commercial-sized char particle (75-90 µm) in different ambient conditions using a 'pseudo-kinetic' approach. In this approach, the kinetic rate of a surface containing micropores is determined, and these 'pseudo kinetics' are then applied to the larger particle without taking the micropores into account. Comparison of the predictions with measurements from an isothermal plug flow reactor showed this approach to be valid within experimental error at low burnout. A linear decrease of the kinetic reaction rate with burnout is shown to predict the burnout behaviour over the complete range of burnout. A possible explanation for this linear decrease could be a growing fraction of non-combustible material in the char particles during burnout. 11 refs., 6 figs., 2 tabs.
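    The "linear decrease of the kinetic reaction rate with burnout" can be illustrated with a toy ODE. The sketch below is a hypothetical reading, not the authors' actual kinetic scheme: burnout B is assumed to grow in proportion to the remaining combustible fraction with a rate constant k(B) = k0(1 - B), and a forward-Euler integration is checked against the resulting closed form.

    ```python
    import numpy as np

    # Hypothetical parameters: rate constant k0 [1/s], Euler step, duration.
    k0, dt, t_end = 5.0, 1e-3, 5.0
    ts = np.arange(0.0, t_end, dt)
    B = np.empty_like(ts)
    B[0] = 0.0
    for i in range(1, len(ts)):
        rate = k0 * (1.0 - B[i - 1])            # linearly decreasing kinetics
        B[i] = B[i - 1] + dt * rate * (1.0 - B[i - 1])

    # dB/dt = k0*(1-B)^2 has the closed form B(t) = k0*t / (1 + k0*t).
    exact = k0 * ts / (1.0 + k0 * ts)
    print("max |Euler - exact|:", float(np.abs(B - exact).max()))
    ```

    The decaying rate constant gives the characteristic long tail toward complete burnout, qualitatively matching the slow late-stage burnout the record attributes to accumulating non-combustible material.
    
    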

  6. Development of a Two-fluid Drag Law for Clustered Particles using Direct Numerical Simulation and Validation through Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gokaltun, Seckin [Florida International Univ., Miami, FL (United States); Munroe, Norman [Florida International Univ., Miami, FL (United States); Subramaniam, Shankar [Iowa State Univ., Ames, IA (United States)

    2014-12-31

    This study presents a new drag model, based on cohesive inter-particle forces, implemented in the MFIX code. The new drag model combines an existing standard model in MFIX with a particle-based drag model through a switching principle: switches between the models occur in regions of the computational domain where a strong particle-to-particle cohesion potential is detected. Three versions of the new model were obtained by using a different standard drag model in each version. The performance of each version was then compared against available experimental data for a fluidized bed, published in the literature and used extensively by other researchers for validation purposes. In our analysis of the results, we first observed that the standard models used in this research were incapable of producing closely matching results. We then showed, for a simple case, that a threshold needs to be set on the solids volume fraction; this modification avoids non-physical clustering predictions when the governing equation of the solids granular temperature is solved. Using our hybrid technique, we observed that our approach improves the numerical results significantly; however, the improvement depends on the threshold of the cohesive index used in the switching procedure. Our results showed that small values of the cohesive-index threshold can significantly reduce the computational error for all versions of the proposed drag model. In addition, we redesigned an existing circulating fluidized bed (CFB) test facility in order to create validation cases for the clustering regime of Geldart A type particles.
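    The switching principle described above (a standard drag correlation by default, a cluster-corrected value where the cohesive index exceeds a threshold, and a solids volume-fraction floor guarding against non-physical clustering in nearly empty cells) can be sketched per cell. All names, the toy cluster correction, and the default thresholds below are hypothetical illustrations, not MFIX's actual interface; only the Wen-Yu correlation is a standard published form.

    ```python
    import numpy as np

    def wen_yu_beta(eps_g, rho_g, mu_g, d_p, slip):
        """Standard Wen-Yu-type gas-solids drag coefficient per unit volume."""
        re = np.maximum(eps_g * rho_g * slip * d_p / mu_g, 1e-12)
        cd = np.where(re < 1000.0, 24.0 / re * (1.0 + 0.15 * re**0.687), 0.44)
        eps_s = 1.0 - eps_g
        return 0.75 * cd * eps_g * eps_s * rho_g * slip / d_p * eps_g**-2.65

    def clustered_beta(beta_std, cohesive_index, reduction=0.3):
        """Toy cluster correction: particles inside clusters see reduced drag."""
        return beta_std * (1.0 - reduction * np.tanh(cohesive_index))

    def hybrid_drag(eps_g, rho_g, mu_g, d_p, slip, cohesive_index,
                    coh_threshold=0.5, eps_s_floor=1e-4):
        beta = wen_yu_beta(eps_g, rho_g, mu_g, d_p, slip)
        # Switch only where cohesion is strong AND the cell holds enough solids;
        # the volume-fraction floor suppresses non-physical cluster predictions.
        active = ((1.0 - eps_g) > eps_s_floor) & (cohesive_index > coh_threshold)
        return np.where(active, clustered_beta(beta, cohesive_index), beta)

    # Example: three cells; only the middle one passes both switching tests.
    eps_g = np.array([0.99999, 0.6, 0.6])   # first cell is nearly empty
    coh = np.array([0.9, 0.9, 0.1])         # third cell has weak cohesion
    beta = hybrid_drag(eps_g, rho_g=1.2, mu_g=1.8e-5, d_p=75e-6, slip=0.3,
                       cohesive_index=coh)
    print(beta)
    ```

    Lowering `coh_threshold` widens the region treated as clustered, which mirrors the record's observation that the cohesive-index threshold controls how much the hybrid model departs from the standard one.
    
    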

  7. The daily spiritual experiences scale and well-being: demographic comparisons and scale validation with older jewish adults and a diverse internet sample.

    Science.gov (United States)

    Kalkstein, Solomon; Tower, Roni Beth

    2009-12-01

    A substantive literature connects spirituality to positive physical, social, and mental health. In this study, the Daily Spiritual Experiences Scale (DSES) was administered to 410 subjects who participated in a community study and to 87 residents of the Hebrew Home for the Aged at Riverdale (HHAR), the latter sample consisting primarily of older Jewish respondents. Internal consistency of the DSES in both samples was high, and exploratory factor analyses revealed one dominant factor and a second factor, which included 14 and 2 items, respectively, consistent with the scale's original validation (Underwood and Teresi 2002). Demographic comparison among religious groups revealed significantly fewer daily spiritual experiences among Jews, and the lowest scores among respondents endorsing no religious affiliation. Women exhibited more frequent daily spiritual experience than men, and attainment of higher levels of education was associated with less frequent daily spiritual experience. All but one of the outcome measures of physical and psychological well-being were positively associated with the DSES, so that more frequent daily spiritual experience correlated with less psychopathology, more close friendships, and better self-rated health. Directions for future research, study interpretation and limitations, and clinical implications for use of the DSES are discussed.

  8. Post-test simulation and analysis of the second full scale CHAN 28-element experiment (validations of CHAN-II (MOD 6) against experiments)

    Energy Technology Data Exchange (ETDEWEB)

    Bayoumi, M H; Muir, W C [Ontario Hydro, Toronto, ON (Canada)

    1996-12-31

    An experimental program, the CHAN Thermal Chemical Experimental Program, has been set up at WNRE under COG/CANDEV to assess and verify the physical and mathematical models of the CHAN codes. The program has progressed from studying separate effects in single-element experiments to a fully integrated mode in a CANDU 28-element bundle geometry. The CHAN-II series codes are used in the licensing analysis of CANDU reactors. The basic code provides an efficient tool to predict the thermal response of a fuel channel during postulated loss-of-coolant accidents (LOCA) with and without a loss of emergency coolant injection (LOECI), in which the transport of heat by convection is greatly reduced. The code models the progression of the event, including fuel channel geometry deformation due to severe overheating. The main objective of this paper is to discuss further verification of the CHAN-II (MOD 6) computer code against the second full-scale 28-element experiment performed at WNRE under COG/CANDEV, designed to represent a Pickering-type bundle geometry. The main models and assumptions used in the code are briefly described. The objective of the experiments is to provide data for the assessment of the physical and mathematical models of the CHAN codes and to produce data for code verification under integrated conditions with significant hydrogen production and flow rates similar to the LOCA/LOECI scenario. The issue of whether the Zr/steam reaction is sustainable in a full bundle geometry at elevated temperatures is also examined. A comparison between the predictions of CHAN-II (MOD 6) and the experimental results is discussed. (author). 12 refs., 17 figs.

  9. Post-test simulation and analysis of the second full scale CHAN 28-element experiment (validations of CHAN-II (MOD 6) against experiments)

    International Nuclear Information System (INIS)

    Bayoumi, M.H.; Muir, W.C.

    1995-01-01

    An experimental program, the CHAN Thermal Chemical Experimental Program, has been set up at WNRE under COG/CANDEV to assess and verify the physical and mathematical models of the CHAN codes. The program has progressed from studying separate effects in single-element experiments to a fully integrated mode in a CANDU 28-element bundle geometry. The CHAN-II series codes are used in the licensing analysis of CANDU reactors. The basic code provides an efficient tool to predict the thermal response of a fuel channel during postulated loss-of-coolant accidents (LOCA) with and without a loss of emergency coolant injection (LOECI), in which the transport of heat by convection is greatly reduced. The code models the progression of the event, including fuel channel geometry deformation due to severe overheating. The main objective of this paper is to discuss further verification of the CHAN-II (MOD 6) computer code against the second full-scale 28-element experiment performed at WNRE under COG/CANDEV, designed to represent a Pickering-type bundle geometry. The main models and assumptions used in the code are briefly described. The objective of the experiments is to provide data for the assessment of the physical and mathematical models of the CHAN codes and to produce data for code verification under integrated conditions with significant hydrogen production and flow rates similar to the LOCA/LOECI scenario. The issue of whether the Zr/steam reaction is sustainable in a full bundle geometry at elevated temperatures is also examined. A comparison between the predictions of CHAN-II (MOD 6) and the experimental results is discussed. (author). 12 refs., 17 figs.

  10. Stratification issues in the primary system. Review of available validation experiments and State-of-the-Art in modelling capabilities (StratRev)

    International Nuclear Information System (INIS)

    Westin, J.; Henriksson, M.; Paettikangas, T.; Toppila, T.; Raemae, T.; Kudinov, P.; Anglart, H.

    2009-08-01

    The objective of the present report is to review available validation experiments and the state of the art in modelling of stratification and mixing in the primary system of light water reactors. A topical workshop was arranged in Älvkarleby in June 2008 within the framework of BWR-OG, and the presentations from various utilities showed that stratification issues are not unusual and can cause costly production stops. It is desirable to take actions to reduce the probability of stratification occurring, and to develop well-validated and accepted tools and procedures for analyzing upcoming stratification events. A research plan covering the main questions is outlined, and a few suggestions regarding more limited research activities are given. Since many of the stratification events result in thermal loads that are localized in time and space, CFD is a suitable tool. However, the often very large and complex geometry poses a great challenge to CFD, and it is important to perform a step-by-step increase in complexity with intermediate validation against relevant experimental data. The ultimate goal is to establish Best Practice Guidelines that can be followed both by utilities and authorities in case of an event involving stratification and thermal loads. An extension of the existing Best Practice Guidelines for CFD in nuclear safety applications developed by OECD/NEA is thus suggested as a relevant target for a continuation project. (au)

  11. Stratification issues in the primary system. Review of available validation experiments and State-of-the-Art in modelling capabilities (StratRev)

    Energy Technology Data Exchange (ETDEWEB)

    Westin, J.; Henriksson, M. (Vattenfall Research and Development AB (Sweden)); Paettikangas, T. (VTT (Finland)); Toppila, T.; Raemae, T. (Fortum Nuclear Services Ltd (Finland)); Kudinov, P. (KTH Nuclear Power Safety (Sweden)); Anglart, H. (KTH Nuclear Reactor Technology (Sweden))

    2009-08-15

    The objective of the present report is to review available validation experiments and the state of the art in modelling of stratification and mixing in the primary system of light water reactors. A topical workshop was arranged in Älvkarleby in June 2008 within the framework of BWR-OG, and the presentations from various utilities showed that stratification issues are not unusual and can cause costly production stops. It is desirable to take actions to reduce the probability of stratification occurring, and to develop well-validated and accepted tools and procedures for analyzing upcoming stratification events. A research plan covering the main questions is outlined, and a few suggestions regarding more limited research activities are given. Since many of the stratification events result in thermal loads that are localized in time and space, CFD is a suitable tool. However, the often very large and complex geometry poses a great challenge to CFD, and it is important to perform a step-by-step increase in complexity with intermediate validation against relevant experimental data. The ultimate goal is to establish Best Practice Guidelines that can be followed both by utilities and authorities in case of an event involving stratification and thermal loads. An extension of the existing Best Practice Guidelines for CFD in nuclear safety applications developed by OECD/NEA is thus suggested as a relevant target for a continuation project. (au)

  12. Validation Study for an Atmospheric Dispersion Model, Using Effective Source Heights Determined from Wind Tunnel Experiments in Nuclear Safety Analysis

    Directory of Open Access Journals (Sweden)

    Masamichi Oura

    2018-03-01

    For more than fifty years, atmospheric dispersion predictions based on the joint use of a Gaussian plume model and wind tunnel experiments have been applied in both Japan and the U.K. for the evaluation of public radiation exposure in nuclear safety analysis. The effective source height used in the Gaussian model is determined from ground-level concentration data obtained in a wind tunnel experiment using a scaled terrain and site model. In the present paper, the concentrations calculated by this method are compared with data observed over complex terrain in the field under a number of meteorological conditions. Good agreement was confirmed under near-neutral and unstable stability conditions. However, it was found necessary to reduce the effective source height by 50% in order to achieve a conservative estimate of the field observations in a stable atmosphere.
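
    The Gaussian plume ground-level concentration depends exponentially on the effective source height, which is why halving that height in stable conditions yields a conservative (higher) estimate. A minimal sketch with illustrative, assumed parameter values (not the paper's site data):

```python
import math

def glc(q, u, sigma_y, sigma_z, y, he):
    """Ground-level concentration of a Gaussian plume with effective
    source height `he`, including reflection at the ground."""
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-he**2 / (2 * sigma_z**2)))

# Illustrative numbers (assumed): unit release Q = 1, wind speed
# u = 2 m/s, dispersion parameters sigma_y = 60 m, sigma_z = 30 m
# at the receptor distance, effective source height he = 100 m.
c_full = glc(1.0, 2.0, 60.0, 30.0, 0.0, 100.0)
c_half = glc(1.0, 2.0, 60.0, 30.0, 0.0, 50.0)   # he reduced by 50 %

# The ratio is well above 1: the reduced height is the conservative choice.
print(f"{c_half / c_full:.1f}")
```

    Because the height enters as exp(-he^2 / 2 sigma_z^2), even a modest reduction in the effective source height can raise the predicted axial ground-level concentration by an order of magnitude or more.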

  13. Intersystem crossing and dynamics in O(3P) + C2H4 multichannel reaction: Experiment validates theory

    Science.gov (United States)

    Fu, Bina; Han, Yong-Chang; Bowman, Joel M.; Angelucci, Luca; Balucani, Nadia; Leonori, Francesca; Casavecchia, Piergiorgio

    2012-01-01

    The O(3P) + C2H4 reaction, of importance in combustion and atmospheric chemistry, stands out as a paradigm reaction involving triplet- and singlet-state potential energy surfaces (PESs) interconnected by intersystem crossing (ISC). This reaction poses challenges for theory and experiments owing to the ruggedness and high dimensionality of these potentials, as well as the long lifetimes of the collision complexes. Primary products from five competing channels (H + CH2CHO, H + CH3CO, H2 + CH2CO, CH3 + HCO, CH2 + CH2O) and branching ratios (BRs) are determined in crossed molecular beam experiments with soft electron-ionization mass-spectrometric detection at a collision energy of 8.4 kcal/mol. As some of the observed products can only be formed via ISC from triplet to singlet PESs, the extent of ISC is inferred from the product BRs. A new full-dimensional PES for the triplet state as well as spin-orbit coupling to the singlet PES are reported, and roughly half a million surface hopping trajectories are run on the coupled singlet-triplet PESs to compare with the experimental BRs and differential cross-sections. Both theory and experiment find almost equal contributions from the two PESs to the reaction, posing the question of how important it is to consider ISC as a nonadiabatic effect for this and similar systems involved in combustion chemistry. Detailed comparisons at the level of angular and translational energy distributions between theory and experiment are presented for the two primary channel products, CH3 + HCO and H + CH2CHO. The agreement between experimental and theoretical functions is excellent, implying that theory has reached the capability of describing complex multichannel nonadiabatic reactions. PMID:22665777
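
    Once channel-specific yields are in hand, inferring the extent of ISC is simple bookkeeping: the branching ratios of channels reachable only on the singlet surface are summed. The counts and surface assignments below are hypothetical illustrations of that inference, not the measured branching ratios from the experiment:

```python
# Hypothetical channel counts (e.g. trajectory or signal tallies); the
# singlet-only assignments follow the usual mechanistic picture for
# O(3P) + C2H4 but are stated here as assumptions.
counts = {
    "H + CH2CHO": 33,   # accessible on the triplet surface
    "CH2 + CH2O": 10,   # accessible on the triplet surface
    "CH3 + HCO":  34,   # singlet surface only (requires ISC)
    "H2 + CH2CO": 13,   # singlet surface only (requires ISC)
    "H + CH3CO":  10,   # singlet surface only (requires ISC)
}
singlet_only = {"CH3 + HCO", "H2 + CH2CO", "H + CH3CO"}

total = sum(counts.values())
branching = {ch: n / total for ch, n in counts.items()}

# Fraction of reactive events that must have undergone ISC:
isc_fraction = sum(branching[ch] for ch in singlet_only)
print(f"ISC fraction ~ {isc_fraction:.2f}")
```

    With these toy numbers the inferred ISC fraction comes out near one half, mirroring the paper's finding of almost equal contributions from the two surfaces.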

  14. Validation of the code ETOBOX/BOXER for UO2 LWR lattices based on the experiments TRX, BAPL-UO2 and other critical experiments

    International Nuclear Information System (INIS)

    Paratte, J.M.

    1985-07-01

    The EIR code system for LWR lattices is based on cross sections derived from ENDF/B-4 and ENDF/B-5 by the code ETOBOX. The lattice calculation method (code BOXER) and the cross sections were applied to the CSEWG benchmark experiments TRX-1 to 4 and BAPL-UO2-1 to 3. The results are compared to the measured values and to calculations by other institutions. This demonstrates that the deviations of the parameters calculated by BOXER are typical for the cross sections used. A large number of critical experiments were calculated using the measured material bucklings in order to bring to light possible trends in the calculation of the multiplication factor k_eff. It first emerged that the error bounds of B_m^2 evaluated in the measurements are often optimistic. Two-dimensional calculations improved the results of the cell calculations. With a mean scatter of 4 to 5 mk in the normal lattices, the multiplication factors calculated by BOXER are satisfactory. However, one has to take into account a slight trend of k_eff to grow with the moderator-to-fuel ratio and the enrichment. (author)
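
    The mk unit used above is simply a multiplication-factor deviation of 0.001. A minimal sketch of how benchmark deviations and their scatter are expressed; the k_eff values are made-up placeholders, not BOXER results:

```python
# Deviations of calculated multiplication factors from the reference
# value k = 1, expressed in mk (1 mk = 0.001).  Placeholder numbers.
k_eff = {"TRX-1": 0.9978, "TRX-2": 1.0031, "BAPL-UO2-1": 0.9956}

def to_mk(k: float, k_ref: float = 1.0) -> float:
    """Deviation of k from the reference, in mk."""
    return (k - k_ref) * 1000.0

for case, k in k_eff.items():
    print(f"{case:10s}: {to_mk(k):+6.1f} mk")

# Mean scatter (population standard deviation) of the deviations,
# the quantity quoted as "4 to 5 mk" in the abstract:
devs = [to_mk(k) for k in k_eff.values()]
mean = sum(devs) / len(devs)
scatter = (sum((d - mean) ** 2 for d in devs) / len(devs)) ** 0.5
print(f"scatter = {scatter:.1f} mk")
```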

  15. Steady-state CFD simulations of an EPR™ reactor pressure vessel: A validation study based on the JULIETTE experiments

    International Nuclear Information System (INIS)

    Puragliesi, R.; Zhou, L.; Zerkak, O.; Pautz, A.

    2016-01-01

    Highlights: • CFD validation of a k–ε RANS model of the EPR RPV. • A flat inlet velocity profile is not sufficient to correctly predict the pressure drops. • Swirl is responsible for asymmetric loads at the core barrel. • Parametric study of the turbulent Schmidt number for better predictions of passive-scalar transport. • The optimal turbulent Schmidt number was found to be one order of magnitude smaller than the standard value. - Abstract: Validating computational fluid dynamics (CFD) models against experimental measurements is a fundamental step towards a broader acceptance of CFD as a tool for reactor safety analysis when best-estimate one-dimensional thermal-hydraulic codes present strong modelling limitations. In the present paper, numerical results of steady-state RANS analyses are compared to pressure, volumetric flow rate and concentration distribution measurements at different locations of an Areva EPR™ reactor pressure vessel (RPV) mock-up named JULIETTE. Several flow configurations are considered: three different total volumetric flow rates, a cold-leg velocity field with or without swirl, and three or four reactor coolant pumps functioning. Investigations of the influence of two types of inlet boundary profiles (i.e. flat or 1/7th power-law) and of the turbulent Schmidt number have shown that the former appreciably affects the pressure loads at the core barrel, whereas the latter strongly affects the transport and mixing of the tracer (passive scalar) and consequently its distribution at the core inlet. Furthermore, the introduction of an integral parameter, the swirl number, has helped to decrease the large epistemic uncertainty associated with the swirling device. The swirl is found to be the cause of asymmetric loads on the walls of the core barrel, and asymmetries are also enhanced for the tracer concentration distribution at the core inlet. The k–ϵ CFD model developed with the commercial code STAR-CCM+ proves to be able to predict
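
    The sensitivity to the turbulent Schmidt number reported above follows directly from the gradient-diffusion closure, in which the eddy diffusivity of a passive scalar is D_t = nu_t / Sc_t. A minimal sketch, with an assumed illustrative eddy viscosity rather than a value from the JULIETTE model:

```python
# Gradient-diffusion closure for passive-scalar transport in RANS:
#   D_t = nu_t / Sc_t
# Lowering the turbulent Schmidt number by one order of magnitude, as
# the study's parametric search suggests, increases scalar mixing
# tenfold for the same eddy viscosity.
nu_t = 1.0e-2          # m^2/s, assumed illustrative eddy viscosity

def eddy_diffusivity(nu_t: float, sc_t: float) -> float:
    """Turbulent diffusivity of a passive scalar."""
    return nu_t / sc_t

d_standard = eddy_diffusivity(nu_t, 0.9)    # a common default Sc_t
d_reduced  = eddy_diffusivity(nu_t, 0.09)   # one order of magnitude lower
print(round(d_reduced / d_standard))        # -> 10
```

    Since nu_t cancels in the ratio, the tenfold increase in mixing depends only on the Sc_t choice, which is why the tracer distribution at the core inlet is so sensitive to this single closure parameter.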

  16. Design and Validation of an Open-Source, Partial Task Trainer for Endonasal Neuro-Endoscopic Skills Development: Indian Experience.

    Science.gov (United States)

    Singh, Ramandeep; Baby, Britty; Damodaran, Natesan; Srivastav, Vinkle; Suri, Ashish; Banerjee, Subhashis; Kumar, Subodh; Kalra, Prem; Prasad, Sanjiva; Paul, Kolin; Anand, Sneh; Kumar, Sanjeev; Dhiman, Varun; Ben-Israel, David; Kapoor, Kulwant Singh

    2016-02-01

    Box trainers are ideal simulators, given that they are inexpensive, accessible, and of appropriate fidelity. We describe the development and validation of an open-source, partial task simulator that teaches the fundamental skills necessary for endonasal skull-base neuro-endoscopic surgery. We defined the Neuro-Endo-Trainer (NET) SkullBase-Task-GraspPickPlace with an activity area by analyzing the computed tomography scans of 15 adult patients with sellar, suprasellar, or parasellar tumors. Four groups of participants (Group E, n = 4: expert neuroendoscopists; Group N, n = 19: novice neurosurgeons; Group R, n = 11: neurosurgery residents with multiple iterations; and Group T, n = 27: neurosurgery residents with a single iteration) performed grasp, pick, and place tasks using the NET and were graded on task completion time and skills assessment scale score. Group E had lower task completion times and greater skills assessment scale scores than both Groups N and R (P ≤ 0.03, 0.001). The performance of Groups N and R was found to be equivalent; in self-assessing neuro-endoscopic skill, the participants in these groups were found to have equally low pretraining scores (4/10), with significant improvement shown after NET simulation (6 and 7, respectively). Angled scopes resulted in decreased scores with tilted plates compared with straight plates (30°, P ≤ 0.04; 45°, P ≤ 0.001). With tilted plates, decreased scores were observed when we compared the 0° with the 45° endoscope (right, P ≤ 0.008; left, P ≤ 0.002). The NET, a face- and construct-valid open-source partial task neuroendoscopic trainer, was designed. Presimulation, novice neurosurgeons and neurosurgical residents were described as having insufficient skills and preparation to practice neuro-endoscopy. Plate tilt and endoscope angle were shown to be important factors in participant performance. The NET was found to be a useful partial-task trainer for skill building in neuro-endoscopy. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Steady-state CFD simulations of an EPR™ reactor pressure vessel: A validation study based on the JULIETTE experiments

    Energy Technology Data Exchange (ETDEWEB)

    Puragliesi, R., E-mail: riccardo.puragliesi@psi.ch [Laboratory for Reactor Physics and Systems Behaviour, PSI, 5232 Villigen (Switzerland); Zhou, L. [Science and Technology on Reactor System Design Technology Laboratory, NPIC, Chengdu (China); Zerkak, O.; Pautz, A. [Laboratory for Reactor Physics and Systems Behaviour, PSI, 5232 Villigen (Switzerland)

    2016-04-15

    Highlights: • CFD validation of a k–ε RANS model of the EPR RPV. • A flat inlet velocity profile is not sufficient to correctly predict the pressure drops. • Swirl is responsible for asymmetric loads at the core barrel. • Parametric study of the turbulent Schmidt number for better predictions of passive-scalar transport. • The optimal turbulent Schmidt number was found to be one order of magnitude smaller than the standard value. - Abstract: Validating computational fluid dynamics (CFD) models against experimental measurements is a fundamental step towards a broader acceptance of CFD as a tool for reactor safety analysis when best-estimate one-dimensional thermal-hydraulic codes present strong modelling limitations. In the present paper, numerical results of steady-state RANS analyses are compared to pressure, volumetric flow rate and concentration distribution measurements at different locations of an Areva EPR™ reactor pressure vessel (RPV) mock-up named JULIETTE. Several flow configurations are considered: three different total volumetric flow rates, a cold-leg velocity field with or without swirl, and three or four reactor coolant pumps functioning. Investigations of the influence of two types of inlet boundary profiles (i.e. flat or 1/7th power-law) and of the turbulent Schmidt number have shown that the former appreciably affects the pressure loads at the core barrel, whereas the latter strongly affects the transport and mixing of the tracer (passive scalar) and consequently its distribution at the core inlet. Furthermore, the introduction of an integral parameter, the swirl number, has helped to decrease the large epistemic uncertainty associated with the swirling device. The swirl is found to be the cause of asymmetric loads on the walls of the core barrel, and asymmetries are also enhanced for the tracer concentration distribution at the core inlet. The k–ϵ CFD model developed with the commercial code STAR-CCM+ proves to be able to predict

  18. Translation and cultural adaptation of the States of Consciousness Questionnaire (SOCQ) and statistical validation of the Mystical Experience Questionnaire (MEQ30) in Brazilian Portuguese

    Directory of Open Access Journals (Sweden)

    EDUARDO EKMAN SCHENBERG

    Background: The States of Consciousness Questionnaire (SOCQ) was developed to assess features of the change in consciousness induced by psilocybin, and includes the Mystical Experience Questionnaire (MEQ), developed to assess the occurrence of mystical experiences in altered states of consciousness. Objective: To translate the SOCQ to Brazilian Portuguese and validate the 30-item MEQ. Methods: The SOCQ was translated to Brazilian Portuguese and back-translated into English. The two English versions were compared and differences corrected, resulting in a Brazilian translation. Using an internet survey, 1504 Portuguese-speaking subjects answered the translated version of the SOCQ. The 4-factor version of the MEQ30 was analyzed using confirmatory factor analysis and reliability analysis. Results: A Brazilian Portuguese version of the SOCQ was made available. Goodness-of-fit indexes indicated that the data met the factorial structure proposed for the English MEQ30. Factors presented excellent to acceptable reliability according to Cronbach's alpha: mystical (0.95); positive mood (0.71); transcendence of time/space (0.83); and ineffability (0.81). Discussion: The Brazilian Portuguese version of the MEQ30 is validated, and it fits the factorial structure found for the original English version. The SOCQ is also available to the Brazilian Portuguese-speaking population, allowing studies in different languages to be conducted and compared systematically.
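
    Cronbach's alpha, used above to assess factor reliability, compares the sum of the item variances with the variance of the total score. A self-contained sketch on toy ratings (not the MEQ30 responses):

```python
# Cronbach's alpha for a set of scale items.
# rows = respondents, columns = items; toy 5-point ratings.
def cronbach_alpha(rows):
    k = len(rows[0])                      # number of items
    def var(xs):                          # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([r[j] for r in rows]) for j in range(k)]
    total_var = var([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

data = [
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(data), 2))   # -> 0.94: high internal consistency
```

    Values above about 0.7, like the factor reliabilities quoted in the abstract, are conventionally read as acceptable internal consistency.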

  19. Validation of the Persian version of the Daily Spiritual Experiences Scale (DSES) in Pregnant Women: A Proper Tool to Assess Spirituality Related to Mental Health.

    Science.gov (United States)

    Saffari, Mohsen; Amini, Hossein; Sheykh-Oliya, Zarindokht; Pakpour, Amir H; Koenig, Harold G

    2017-12-01

    Assessing spirituality in healthy pregnant women may lead to supportive interventions that will improve their care. A psychometrically valid measure such as the Daily Spiritual Experiences Scale (DSES) may be helpful in this regard. The current study sought to adapt a Persian version of the DSES for use in pregnancy. A total of 377 pregnant women were recruited from three general hospitals located in Tehran, Iran. Administered scales were the DSES, Duke University Religion Index, Santa Clara Strength of Religious Faith scale, and Depression Anxiety Stress Scale, as well as demographic measures. Reliability of the DSES was tested using Cronbach's alpha for internal consistency and the intraclass correlation coefficient (ICC) for test-retest stability. Scale validity was assessed by criterion-related tests, known-groups comparison, and exploratory factor analysis. Participants' mean age was 27.7 (4.1), and most were nulliparous (70%). The correlation coefficient between individual items on the scale and the total score was greater than 0.30 in most cases. Cronbach's alpha for the scale was 0.90. The ICC for 2-week test-retest reliability was high (0.86). Relationships between similar and dissimilar scales indicated acceptable convergent and divergent validity. The factor structure of the scale indicated a single factor that explained 59% of the variance. The DSES was found to be a reliable and valid measure of spirituality in pregnant Iranian women. This scale may be used to examine the relationship between spirituality and health outcomes, research that may lead to supportive interventions in this population.
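
    The 2-week test-retest stability reported above is an intraclass correlation; a common formulation for this design is ICC(2,1), i.e. a two-way random-effects model, absolute agreement, single measure. A sketch on toy scores, not the DSES data:

```python
def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure -- a common choice for test-retest reliability.
    `scores` is one row per subject, one column per session."""
    n = len(scores)            # subjects
    k = len(scores[0])         # sessions
    grand = sum(sum(r) for r in scores) / (n * k)
    row_means = [sum(r) / k for r in scores]
    col_means = [sum(r[j] for r in scores) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in scores for x in r)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Toy test-retest scores: large between-subject spread, small
# within-subject change, so the ICC comes out high.
retest = [[10, 11], [12, 12], [8, 9], [15, 14], [9, 10]]
print(round(icc_2_1(retest), 2))
```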

  20. A novel enterovirus and parechovirus multiplex one-step real-time PCR: validation and clinical experience.

    Science.gov (United States)

    Nielsen, Alex Christian Yde; Böttiger, Blenda; Midgley, Sofie Elisabeth; Nielsen, Lars Peter

    2013-11-01

    As the number of new enteroviruses and human parechoviruses seems ever-growing, the necessity for updated diagnostics is relevant. We have updated an enterovirus assay and combined it with a previously published assay for human parechovirus, resulting in a multiplex one-step RT-PCR assay. The multiplex assay was validated by analysing its sensitivity and specificity compared to the respective monoplex assays, and good concordance was found. Furthermore, the enterovirus assay was able to detect 42 reference strains from all 4 species, and an additional 9 genotypes during panel testing and routine usage. During 15 months of routine use, from October 2008 to December 2009, we analysed 2187 samples (stool samples, cerebrospinal fluids, blood samples, respiratory samples and autopsy samples) from 1546 patients and detected enteroviruses and parechoviruses in 171 (8%) and 66 (3%) of the samples, respectively. 180 of the positive samples could be genotyped by PCR and sequencing, and the most common genotypes found were human parechovirus type 3, echovirus 9, enterovirus 71, coxsackievirus A16, and echovirus 25. During 2009 in Denmark, both enterovirus and human parechovirus type 3 had a similar seasonal pattern, with a peak during the summer and autumn. Human parechovirus type 3 was almost invariably found in children less than 4 months of age. In conclusion, a multiplex assay was developed allowing simultaneous detection of 2 viruses, which can cause similar clinical symptoms. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Validation of MCCI models implemented in ASTEC MEDICIS on OECD CCI-2 and CCI-3 experiments and further consideration on reactor cases

    Energy Technology Data Exchange (ETDEWEB)

    Agethen, K.; Koch, M.K., E-mail: agethen@lee.rub.de, E-mail: koch@lee.rub.de [Ruhr-Universitat Bochum, Energy Systems and Energy Economics, Reactor Simulation and Safety Group, Bochum (Germany)

    2014-07-01

    During a severe accident in a light water reactor, a loss of coolant can result in core melting and vessel failure. Afterwards, molten core material may discharge into the containment cavity and interact with the concrete basemat. Concrete erosion releases gases, which lead to exothermic oxidation reactions with the metals in the corium and to the formation of combustible mixtures. In this work the MEDICIS module of the Accident Source Term Evaluation Code (ASTEC) is validated against experiments of the OECD CCI programme. The primary focus is on the CCI-2 experiment with limestone common sand (LCS) concrete, in which nearly homogeneous erosion appeared, and the CCI-3 experiment with siliceous concrete, in which increased lateral erosion occurred. These experiments enable the analysis of heat transfer depending on the axial and radial orientation from the interior of the melt to the surrounding surfaces, and of the impact of top flooding. For the simulation of both tests, two existing models in MEDICIS are used and analysed. Simulation results show good agreement of the ablation behaviour, layer temperature and energy balance with the experimental results. Furthermore, the energy balance was found to approach a quasi-steady state in the long term. Finally, the basic data are scaled up to a generic reactor scenario, which shows that this quasi-steady state occurs there as well. (author)
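
    The long-term quasi-steady state noted above can be pictured with a one-line energy balance: the heat flux delivered to the concrete interface is consumed by decomposing and melting concrete, so the ablation front advances at v = q / (rho * dH). All numbers below are order-of-magnitude assumptions for illustration, not MEDICIS input data:

```python
# Quasi-steady MCCI ablation sketch: decay heat reaching the melt/concrete
# interface is balanced by the enthalpy sink of concrete decomposition,
# giving a constant ablation velocity v = q / (rho * dH).
q_flux = 5.0e4        # W/m^2, heat flux from melt to concrete (assumed)
rho_c = 2300.0        # kg/m^3, concrete density (assumed)
dh_abl = 2.0e6        # J/kg, effective decomposition enthalpy (assumed)

v_abl = q_flux / (rho_c * dh_abl)          # ablation velocity, m/s
print(f"ablation rate ~ {v_abl * 3.6e6:.1f} mm/h")
```

    With these illustrative values the front advances a few centimetres per hour, the right order of magnitude for the erosion depths measured in the CCI tests; in the quasi-steady regime the rate changes only through the slow decay of q.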

  2. Evaluation of the validity of the Psychology Experiment Building Language tests of vigilance, auditory memory, and decision making

    Directory of Open Access Journals (Sweden)

    Brian Piper

    2016-03-01

    Background. The Psychology Experiment Building Language (PEBL) test battery (http://pebl.sourceforge.net/) is a popular application for neurobehavioral investigations. This study evaluated the correspondence between the PEBL and the non-PEBL versions of four executive function tests. Methods. In one cohort, young adults (N = 44) completed both the Conners' Continuous Performance Test (CCPT) and the PEBL CPT (PCPT), with the order counter-balanced. In a second cohort, participants (N = 47) completed a non-computerized (Wechsler) and a computerized (PEBL) Digit Span (WDS or PDS), both Forward and Backward. Participants also completed the Psychological Assessment Resources or the PEBL version of the Iowa Gambling Task (PARIGT or PEBLIGT). Results. The between-test correlations were moderately high (reaction time r = 0.78, omission errors r = 0.65, commission errors r = 0.66) on the CPT. DS Forward was significantly greater than DS Backward on the WDS (p < .0005) and the PDS (p < .0005). The total WDS score was moderately correlated with the PDS (r = 0.56). The PARIGT and the PEBLIGT showed a very similar pattern for response times across blocks, development of preference for Advantageous over Disadvantageous Decks, and Deck selections. However, the amount of money earned (score minus loan) was significantly higher in the PEBLIGT during the last Block. Conclusions. These findings are broadly supportive of the criterion validity of the PEBL measures of sustained attention, short-term memory, and decision making. Select differences between workalike versions of the same test highlight how detailed aspects of implementation may have more important consequences for computerized testing than has previously been acknowledged.
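
    The between-test correlations reported above are Pearson coefficients between paired scores on the two versions of a test. A minimal sketch with made-up reaction times, not the PEBL study data:

```python
# Pearson correlation between paired scores on two versions of a test,
# the statistic behind the criterion-validity figures in the abstract.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical mean reaction times (ms) for six participants:
ccpt = [412, 388, 455, 430, 401, 479]   # commercial version
pcpt = [405, 395, 462, 441, 399, 470]   # PEBL version
print(round(pearson_r(ccpt, pcpt), 2))
```

    A coefficient in the 0.6 to 0.8 range, as found for the CPT measures, indicates that the two implementations rank participants similarly without being interchangeable.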

  3. Three-Gorge Reservoir: A 'Controlled Experiment' for Calibration/Validation of Time-Variable Gravity Signals Detected from Space

    Science.gov (United States)

    Chao, Benjamin F.; Boy, J. P.

    2003-01-01

    With the advances of measurements, modern space geodesy has become a new type of remote sensing for Earth dynamics, especially for mass transports in the geophysical fluids on large spatial scales. A case in point is the space gravity mission GRACE (Gravity Recovery And Climate Experiment), which has been in orbit collecting gravity data since early 2002. The data promise to be able to detect changes of water mass equivalent to sub-cm thickness on spatial scales of several hundred km every month or so. China's Three-Gorge Reservoir has already started the process of water impoundment in phases. By 2009, 40 km3 of water will be stored behind one of the world's highest dams, spanning a section of the middle Yangtze River about 600 km in length. For the GRACE observations, the Three-Gorge Reservoir would represent a geophysical "controlled experiment", one that offers a unique opportunity to do detailed geophysical studies. Assuming a complete documentation of the water level and history of the water impoundment process, and aided by continual monitoring of the lithospheric loading response (such as area gravity and deformation), one has at hand basically a classical forward-inverse modeling problem of surface loading, where the input and certain outputs are known. The invisible portion of the impounded water, i.e. underground storage, poses either added value as an observable or a complication as an unknown to be modeled. Wang (2000) has studied the possible loading effects on a local scale; we here aim for larger spatial scales upwards from several hundred km, with emphasis on the time-variable gravity signals that can be detected by GRACE and follow-on missions. Results using the Green's function approach on the PREM elastic Earth model indicate geoid height variations reaching several millimeters on wavelengths of about a thousand kilometers. The corresponding vertical deformations have amplitudes of a few centimeters. In terms of long
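
    A quick order-of-magnitude check on the impoundment signal: treating the stored water as a point mass, the geoid perturbation is roughly the direct potential change divided by mean gravity, dN = G*M/(g*d). This ignores the elastic loading response computed with the Green's function approach in the paper, so it is only a far-field sketch, not the paper's calculation:

```python
# Point-mass estimate of the geoid height perturbation produced by the
# fully impounded reservoir (40 km^3 of water), at distance d from it.
G = 6.674e-11          # m^3 kg^-1 s^-2, gravitational constant
g = 9.81               # m/s^2, mean surface gravity
mass = 40e9 * 1000.0   # 40 km^3 of water -> kg

def geoid_bump_mm(dist_m: float) -> float:
    """Geoid perturbation dN = G*M/(g*d), returned in millimetres."""
    return G * mass / (g * dist_m) * 1000.0

for d_km in (100, 300, 1000):
    print(f"d = {d_km:4d} km: dN ~ {geoid_bump_mm(d_km * 1e3):.2f} mm")
```

    The estimate gives a few millimetres at a hundred kilometres, falling off as 1/d, consistent with the several-millimetre, thousand-kilometre-wavelength signal quoted from the full elastic-loading computation.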

  4. The Second Victim Experience and Support Tool: Validation of an Organizational Resource for Assessing Second Victim Effects and the Quality of Support Resources.

    Science.gov (United States)

    Burlison, Jonathan D; Scott, Susan D; Browne, Emily K; Thompson, Sierra G; Hoffman, James M

    2017-06-01

    Medical errors and unanticipated negative patient outcomes can damage the well-being of health care providers. These affected individuals, referred to as "second victims," can experience various psychological and physical symptoms. Support resources provided by health care organizations to prevent and reduce second victim-related harm are often inadequate. In this study, we present the development and psychometric evaluation of the Second Victim Experience and Support Tool (SVEST), a survey instrument that can assist health care organizations to implement and track the performance of second victim support resources. The SVEST (29 items representing 7 dimensions and 2 outcome variables) was completed by 303 health care providers involved in direct patient care. The survey collected responses on second victim-related psychological and physical symptoms and the quality of support resources. Desirability of possible support resources was also measured. The SVEST was assessed for content validity, internal consistency, and construct validity with confirmatory factor analysis. Confirmatory factor analysis results suggested good model fit for the survey. Cronbach α reliability scores for the survey dimensions ranged from 0.61 to 0.89. The most desired second victim support option was "A respected peer to discuss the details of what happened." The SVEST can be used by health care organizations to evaluate second victim experiences of their staff and the quality of existing support resources. It can also provide health care organization leaders with information on second victim-related support resources most preferred by their staff. The SVEST can be administered before and after implementing new second victim resources to measure perceptions of effectiveness.

  5. Validation of LWR calculation methods and JEF-1 based data libraries by TRX and BAPL critical experiments

    International Nuclear Information System (INIS)

    Pelloni, S.; Grimm, P.; Mathews, D.; Paratte, J.M.

    1989-06-01

    In this report the capability of various code systems widely used at PSI (such as WIMS-D, BOXER, and the AARE modules TRAMIX and MICROX-2 in connection with the one-dimensional transport code ONEDANT) and JEF-1 based nuclear data libraries to compute LWR lattices is analysed by comparing results from thermal reactor benchmarks TRX and BAPL with experiment and with previously published values. It is shown that with the JEF-1 evaluation eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and that all methods give reasonable results for the measured reaction rate within or not too far from the experimental uncertainty. This is consistent with previous similar studies. (author) 7 tabs., 36 refs

  6. Validating the Patient Experience with Treatment and Self-Management (PETS), a patient-reported measure of treatment burden, in people with diabetes

    Directory of Open Access Journals (Sweden)

    Rogers EA

    2017-11-01

    Elizabeth A Rogers,1,2 Kathleen J Yost,3 Jordan K Rosedahl,3 Mark Linzer,4 Deborah H Boehm,5 Azra Thakur,5 Sara Poplau,5 Roger T Anderson,6 David T Eton3 1Department of Medicine, University of Minnesota Medical School, Minneapolis, MN, USA; 2Department of Pediatrics, University of Minnesota Medical School, Minneapolis, MN, USA; 3Department of Health Services Research, Mayo Clinic, Rochester, MN, USA; 4Department of Medicine, Hennepin County Medical Center, Minneapolis, MN, USA; 5Minneapolis Medical Research Foundation, Minneapolis, MN, USA; 6University of Virginia School of Medicine, Charlottesville, VA, USA Aims: To validate a comprehensive general measure of treatment burden, the Patient Experience with Treatment and Self-Management (PETS), in people with diabetes. Methods: We conducted a secondary analysis of a cross-sectional survey study with 120 people diagnosed with type 1 or type 2 diabetes and at least one additional chronic illness. Surveys included established patient-reported outcome measures and a 48-item version of the PETS, a new measure comprised of multi-item scales assessing the burden of chronic illness treatment and self-care as it relates to nine domains: medical information, medications, medical appointments, monitoring health, interpersonal challenges, health care expenses, difficulty with health care services, role activity limitations, and physical/mental exhaustion from self-management. Internal reliability of PETS scales was determined using Cronbach's alpha. Construct validity was determined through correlation of PETS scores with established measures (chronic condition distress, medication satisfaction, self-efficacy, and global well-being), and known-groups validity through comparisons of PETS scores across clinically distinct groups. In an exploratory test of predictive validity, step-wise regressions were used to determine which PETS scales were most associated with outcomes of chronic condition

  7. Validation of the muon momentum resolution in view of the W mass measurement with the CMS experiment

    CERN Document Server

    Manca, Elisabetta

    2016-01-01

    In the framework of the Standard Model, the electroweak theory predicts relations among observables which can be measured. After the discovery of the Higgs boson, all the parameters of the Standard Model are known and it is thus possible to predict those observables with increasing precision in order to test the consistency of the model. A small deviation of measured values from those predictions would be an indirect hint of physics beyond the Standard Model. In particular, the mass of the W boson has a far smaller uncertainty in the theoretical prediction than in the measured value. For this reason, an accurate measurement of the W mass would provide such a test of validity of the Standard Model. In order to achieve the precision required for a fair comparison with the theory, it is necessary to control the distributions of the variables entering the measurement at the permil level or even better. The CMS experiment is planning to deliver a precise measurement of MW within the next years, analysing events of W decaying...

  8. An attempt to calibrate and validate a simple ductile failure model against axial-torsion experiments on Al 6061-T651

    Energy Technology Data Exchange (ETDEWEB)

    Reedlunn, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lu, Wei -Yang [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-01-01

    This report details a work in progress. We have attempted to calibrate and validate a von Mises plasticity model with the Johnson-Cook failure criterion (Johnson & Cook, 1985) against a set of experiments on various specimens of Al 6061-T651. As will be shown, the effort was not successful, despite considerable attention to detail. When the model was compared against axial-torsion experiments on tubes, it over-predicted failure by 3× in tension, and never predicted failure in torsion, even when the tube was twisted 4× further than in the experiment. While this result is unfortunate, it is not surprising: ductile failure is not well understood. In future work, we will explore whether more sophisticated material models of plasticity and failure will improve the predictions. Selecting the appropriate advanced material model and interpreting its results are not trivial exercises, so it is worthwhile to fully investigate the behavior of a simple plasticity model before moving on to an anisotropic yield surface or a similarly complicated model.

  9. A Systematic Review Comparing the Acceptability, Validity and Concordance of Discrete Choice Experiments and Best-Worst Scaling for Eliciting Preferences in Healthcare.

    Science.gov (United States)

    Whitty, Jennifer A; Oliveira Gonçalves, Ana Sofia

    2018-06-01

    The aim of this study was to compare the acceptability, validity and concordance of discrete choice experiment (DCE) and best-worst scaling (BWS) stated preference approaches in health. A systematic search of EMBASE, Medline, AMED, PubMed, CINAHL, Cochrane Library and EconLit databases was undertaken in October to December 2016 without date restriction. Studies were included if they were published in English, presented empirical data related to the administration or findings of traditional format DCE and object-, profile- or multiprofile-case BWS, and were related to health. Study quality was assessed using the PREFS checklist. Fourteen articles describing 12 studies were included, comparing DCE with profile-case BWS (9 studies), DCE and multiprofile-case BWS (1 study), and profile- and multiprofile-case BWS (2 studies). Although limited and inconsistent, the balance of evidence suggests that preferences derived from DCE and profile-case BWS may not be concordant, regardless of the decision context. Preferences estimated from DCE and multiprofile-case BWS may be concordant (single study). Profile- and multiprofile-case BWS appear more statistically efficient than DCE, but no evidence is available to suggest they have a greater response efficiency. Little evidence suggests superior validity for one format over another. Participant acceptability may favour DCE, which had a lower self-reported task difficulty and was preferred over profile-case BWS in a priority setting but not necessarily in other decision contexts. DCE and profile-case BWS may be of equal validity but give different preference estimates regardless of the health context; thus, they may be measuring different constructs. Therefore, choice between methods is likely to be based on normative considerations related to coherence with theoretical frameworks and on pragmatic considerations related to ease of data collection.

  10. The role of CFD combustion modelling in hydrogen safety management – VI: Validation for slow deflagration in homogeneous hydrogen-air-steam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cutrono Rakhimov, A., E-mail: cutrono@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Visser, D.C., E-mail: visser@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, T., E-mail: tadej.holler@ijs.si [Jožef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, E.M.J., E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2017-01-15

    Highlights: • Deflagration of hydrogen-air-steam homogeneous mixtures is modeled in a medium-scale containment. • Adaptive mesh refinement is applied on flame front positions. • Steam effect influence on combustion modeling capabilities is investigated. • Mean pressure rise is predicted with 18% under-prediction when steam is involved. • Peak pressure is evaluated with 5% accuracy when steam is involved. - Abstract: Large quantities of hydrogen can be generated during a severe accident in a water-cooled nuclear reactor. When released in the containment, the hydrogen can create a potential deflagration risk. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor. Therefore, accurate prediction of these pressure loads is an important safety issue. In previous papers, we validated a Computational Fluid Dynamics (CFD) based method to determine the pressure loads from a fast deflagration. The combustion model applied in the CFD method is based on the Turbulent Flame Speed Closure (TFC). In our last paper, we presented the extension of this combustion model, Extended Turbulent Flame Speed Closure (ETFC), and its validation against hydrogen deflagration experiments in the slow deflagration regime. During a severe accident, cooling water will enter the containment as steam. Therefore, the effect of steam on hydrogen deflagration is important to capture in a CFD model. The primary objectives of the present paper are to further validate the TFC and ETFC combustion models, and investigate their capability to predict the effect of steam. The peak pressures, the trends of the flame velocity, and the pressure rise with an increase in the initial steam dilution are captured reasonably well by both combustion models. In addition, the ETFC model appeared to be more robust to mesh resolution changes. The mean pressure rise is evaluated with 18% under-prediction and the peak pressure is evaluated with 5% accuracy.

  11. Development and validation of the Patient Experience with Treatment and Self-management (PETS): a patient-reported measure of treatment burden.

    Science.gov (United States)

    Eton, David T; Yost, Kathleen J; Lai, Jin-Shei; Ridgeway, Jennifer L; Egginton, Jason S; Rosedahl, Jordan K; Linzer, Mark; Boehm, Deborah H; Thakur, Azra; Poplau, Sara; Odell, Laura; Montori, Victor M; May, Carl R; Anderson, Roger T

    2017-02-01

    The purpose of this study was to develop and validate a new comprehensive patient-reported measure of treatment burden: the Patient Experience with Treatment and Self-management (PETS). A conceptual framework was used to derive the PETS, with items reviewed and cognitively tested with patients. A survey battery, including a pilot version of the PETS, was mailed to 838 multi-morbid patients from two healthcare institutions for validation. A total of 332 multi-morbid patients returned completed surveys. Diagnostics supported deletion and consolidation of some items and domains. Confirmatory factor analysis supported a domain model for scaling comprised of 9 factors: medical information, medications, medical appointments, monitoring health, interpersonal challenges, medical/healthcare expenses, difficulty with healthcare services, role/social activity limitations, and physical/mental exhaustion. Scales showed good internal consistency (α range 0.79-0.95). Higher PETS scores, indicative of greater treatment burden, were correlated with more distress, less satisfaction with medications, lower self-efficacy, worse physical and mental health, and lower convenience of healthcare. Patients with lower health literacy, less adherence to medications, and more financial difficulties reported higher PETS scores (Ps < 0.01). A comprehensive patient-reported measure of treatment burden can help to better characterize the impact of treatment and self-management burden on patient well-being and guide care toward minimally disruptive medicine.

  12. The X-Ray Pebble Recirculation Experiment (X-PREX): Facility Description, Preliminary Discrete Element Method Simulation Validation Studies, and Future Test Program

    International Nuclear Information System (INIS)

    Laufer, Michael R.; Bickel, Jeffrey E.; Buster, Grant C.; Krumwiede, David L.; Peterson, Per F.

    2014-01-01

    This paper presents a facility description, preliminary results, and future test program of the new X-Ray Pebble Recirculation Experiment (X-PREX), which is now operational and being used to collect data on the behavior of slow dense granular flows relevant to pebble bed reactor core designs. The X-PREX facility uses digital x-ray tomography methods to track both the translational and rotational motion of spherical pebbles, which provides unique experimental results that can be used to validate discrete element method (DEM) simulations of pebble motion. The validation effort supported by the X-PREX facility provides a means to build confidence in analysis of pebble bed configuration and residence time distributions that impact the neutronics, thermal hydraulics, and safety analysis of pebble bed reactor cores. Preliminary experimental and DEM simulation results are reported for silo drainage, a classical problem in the granular flow literature, at several hopper angles. These studies include conventional converging and novel diverging geometries that provide additional flexibility in the design of pebble bed reactor cores. Excellent agreement is found between the X-PREX experimental and DEM simulation results. Finally, this paper discusses additional studies in progress relevant to the design and analysis of pebble bed reactor cores including pebble recirculation in cylindrical core geometries and evaluation of forces on shut down blades inserted directly into a packed pebble bed. (author)

  13. Do qualitative methods validate choice experiment-results? A case study on the economic valuation of peatland restoration in Central Kalimantan, Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Schaafsma, M.; Van Beukering, P.J.H.; Davies, O.; Oskolokaite, I.

    2009-05-15

    This study explores the benefits of combining independent results of qualitative focus group discussions (FGD) with a quantitative choice experiment (CE) in a developing country context. The assessment addresses the compensation needed by local communities in Central Kalimantan to cooperate in peatland restoration programs, using a CE combined with a series of FGD to validate and explain the CE results. The main conclusion of this study is that a combination of qualitative and quantitative methods is necessary to assess the economic value of ecological services in monetary terms and to better understand the underlying attitudes and motives that drive these outcomes. The FGD not only cross-validate the results of the CE, but also help to interpret the differences in respondents' preferences arising from environmental awareness and ecosystem characteristics. The FGD confirm that the CE results provide accurate information for ecosystem valuation. In addition to the advantages of FGD listed in the literature, this study finds that FGD make it possible to identify the specific terms and conditions under which respondents will accept land-use change scenarios. The results also show that FGD may help to address the problems that neo-classical economic theory poses for the interpretation of economic valuation results, given the demands it places on the rationality of trade-offs and the required calculations, particularly with respect to the distribution of costs and benefits over time.

  14. The virtual lover: variable and easily guided 3D fish animations as an innovative tool in mate-choice experiments with sailfin mollies-II. Validation.

    Science.gov (United States)

    Gierszewski, Stefanie; Müller, Klaus; Smielik, Ievgen; Hütwohl, Jan-Marco; Kuhnert, Klaus-Dieter; Witte, Klaudia

    2017-02-01

    The use of computer animation in behavioral research is a state-of-the-art method for designing and presenting animated animals to live test animals. The major advantages of computer animations are: (1) the creation of animated animal stimuli with high variability of morphology and even behavior; (2) animated stimuli provide highly standardized, controlled and repeatable testing procedures; and (3) they allow a reduction in the number of live test animals regarding the 3Rs principle. But the use of animated animals should be accompanied by a thorough validation for each test species to verify that behavior measured with live animals toward virtual animals can also be expected with natural stimuli. Here we present results on the validation of a custom-made simulation for animated 3D sailfin mollies Poecilia latipinna and show that responses of live test females were as strong to an animated fish as to a video or a live male fish. Movement of an animated stimulus was important, but female response was stronger toward a swimming 3D fish stimulus than to a "swimming" box. Moreover, male test fish were able to discriminate between animated male and female stimuli; hence, rendering the animated 3D fish a useful tool in mate-choice experiments with sailfin mollies.

  15. Convergent validity between a discrete choice experiment and a direct, open-ended method: comparison of preferred attribute levels and willingness to pay estimates.

    Science.gov (United States)

    van der Pol, Marjon; Shiell, Alan; Au, Flora; Johnston, David; Tough, Suzanne

    2008-12-01

    The Discrete Choice Experiment (DCE) has become increasingly popular as a method for eliciting patient or population preferences. If DCE estimates are to inform health policy, it is crucial that the answers they provide are valid. Convergent validity is tested in this paper by comparing the results of a DCE exercise with the answers obtained from direct, open-ended questions. The two methods are compared in terms of preferred attribute levels and willingness to pay (WTP) values. Face-to-face interviews were held with 292 women in Calgary, Canada. Similar values were found between the two methods with respect to preferred levels for two out of three of the attributes examined. The DCE predicted less well for levels outside the range than for levels inside the range reaffirming the importance of extensive piloting to ensure appropriate level range in DCEs. The mean WTP derived from the open-ended question was substantially lower than the mean derived from the DCE. However, the two sets of willingness to pay estimates were consistent with each other in that individuals who were willing to pay more in the open-ended question were also willing to pay more in the DCE. The difference in mean WTP values between the two approaches (direct versus DCE) demonstrates the importance of continuing research into the different biases present across elicitation methods.
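The consistency claim above (individuals willing to pay more in the open-ended question were also willing to pay more in the DCE) is the kind of statement a rank correlation captures. The record does not state which statistic the authors used; the sketch below uses Spearman's coefficient on invented paired WTP values purely to illustrate the check:

```python
# Spearman rank correlation: Pearson correlation applied to ranks.
# Assumes no ties in either list (invented data below has none).
def spearman(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

open_ended = [10, 25, 40, 5, 60]   # hypothetical WTP ($), open-ended question
dce        = [35, 50, 80, 20, 90]  # hypothetical WTP ($), DCE

print(spearman(open_ended, dce))  # -> 1.0 (same ordering despite higher DCE levels)
```

A coefficient near 1 with systematically higher DCE values reproduces the paper's finding: the two methods agree on who values the service more, while disagreeing on the absolute amount.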

  16. Numerical experiment on different validation cases of water coolant flow in supercritical pressure test sections assisted by discriminated dimensional analysis part I: the dimensional analysis

    International Nuclear Information System (INIS)

    Kiss, A.; Aszodi, A.

    2011-01-01

    As recent studies show, the less well known discriminated dimensional analysis approach can provide a deeper insight into the physical problems involved, and much better results, than 'classical' dimensional analysis, whose application is widely described in heat transfer textbooks despite its poor results. As a first step of this ongoing research, discriminated dimensional analysis has been performed on supercritical-pressure water pipe flow heated through the pipe solid wall, in order to identify the dimensionless groups that play an independent role in the above-mentioned thermal-hydraulic phenomena. These groups serve as a theoretical basis for comparison between well-known supercritical-pressure water pipe heat transfer experiments and the results of their validated CFD simulations. (author)

  17. Trace metals in mussel shells and corresponding soft tissue samples: a validation experiment for the use of Perna perna shells in pollution monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Bellotto, V.R. [Vale do Itajai University (UNIVALI), CTTMAR (Center for Technology Earth and Ocean Science), Itajai (Brazil); Miekeley, N. [Pontifical Catholic University (PUC-Rio), Department of Chemistry, Rio de Janeiro (Brazil)

    2007-10-15

    The uptake of Cr, Mn, Ni, Cu, Zn, Cd and Pb in soft tissue of Perna perna mussels and their shells has been studied in aquarium experiments in which mussels were exposed for 30 or 60 days to seawater spiked with different concentrations of these contaminants (125 and 500 μg/L). Tissue samples were analyzed after acid digestion by conventional solution nebulization ICP-MS. Laser ablation ICP-MS was used for the quantitative determination of trace elements in different areas of the corresponding shells. With the exception of Mn and Zn, all other elements studied showed significant concentration enhancements in soft tissue, with the magnitude of the enhancement following the order: Cr > Ni > Cd > Cu > Pb. A corresponding increase in most contaminants, although less pronounced, was also observed in the newly formed growth rings of the mussel shells, contributing to the validation of the Perna perna mussel shell as a bioindicator of toxic elements. (orig.)

  18. Five year experience in management of perforated peptic ulcer and validation of common mortality risk prediction models - are existing models sufficient? A retrospective cohort study.

    Science.gov (United States)

    Anbalakan, K; Chua, D; Pandya, G J; Shelat, V G

    2015-02-01

    Emergency surgery for perforated peptic ulcer (PPU) is associated with significant morbidity and mortality, so accurate and early risk stratification is important. The primary aim of this study is to validate the various existing mortality risk prediction models (MRPMs); the secondary aim is to audit our experience of managing PPU. 332 patients who underwent emergency surgery for PPU at a single institution from January 2008 to December 2012 were studied. Clinical and operative details were collected. Four MRPMs were validated: the American Society of Anesthesiology (ASA) score, Boey's score, the Mannheim peritonitis index (MPI) and the Peptic ulcer perforation (PULP) score. Median age was 54.7 years (range 17-109 years) with male predominance (82.5%). 61.7% presented within 24 h of onset of abdominal pain. Median length of stay was 7 days (range 2-137 days). Intra-abdominal collection, leakage, re-operation and 30-day mortality rates were 8.1%, 2.1%, 1.2% and 7.2% respectively. All four MRPMs predicted intra-abdominal collection and mortality; however, only MPI predicted leak (p = 0.01) and re-operation (p = 0.02) rates. The area under the curve for predicting mortality was 75%, 72%, 77.2% and 75% for the ASA score, Boey's score, MPI and PULP score respectively. Emergency surgery for PPU has low morbidity and mortality in our experience. MPI is the only scoring system which predicts all four outcomes: intra-abdominal collection, leak, re-operation and mortality. All four MRPMs had similar, fair accuracy in predicting mortality; however, given geographic and demographic diversity and the inherent weaknesses of existing MRPMs, the quest for an ideal model should continue.
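The "area under curve" figures quoted for the four scores (72-77.2%) are ROC AUCs. In its rank form, the AUC is the probability that a randomly chosen patient who died had a higher risk score than a randomly chosen survivor (ties counting one half). A toy sketch with invented scores, not the study's data:

```python
# Mann-Whitney formulation of ROC AUC over all (positive, negative) pairs.
def auc(scores_pos, scores_neg):
    """Probability a deceased patient's score exceeds a survivor's (ties = 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

died     = [3, 2, 3, 1]       # hypothetical Boey-like scores, deceased patients
survived = [0, 1, 1, 2, 0]    # hypothetical scores, survivors

print(auc(died, survived))  # -> 0.875
```

An AUC of 0.5 means no discrimination; the 0.72-0.77 range reported above corresponds to the "fair accuracy" the authors describe.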

  19. Threats to validity in the design and conduct of preclinical efficacy studies: a systematic review of guidelines for in vivo animal experiments.

    Directory of Open Access Journals (Sweden)

    Valerie C Henderson

    The vast majority of medical interventions introduced into clinical development prove unsafe or ineffective. One prominent explanation for the dismal success rate is flawed preclinical research. We conducted a systematic review of preclinical research guidelines and organized recommendations according to the type of validity threat (internal, construct, or external) or programmatic research activity they primarily address. We searched MEDLINE, Google Scholar, Google, and the EQUATOR Network website for all preclinical guideline documents published up to April 9, 2013 that addressed the design and conduct of in vivo animal experiments aimed at supporting clinical translation. To be eligible, documents had to provide guidance on the design or execution of preclinical animal experiments and represent the aggregated consensus of four or more investigators. Data from included guidelines were independently extracted by two individuals for discrete recommendations on the design and implementation of preclinical efficacy studies. These recommendations were then organized according to the type of validity threat they addressed. A total of 2,029 citations were identified through our search strategy. From these, we identified 26 guidelines that met our eligibility criteria--most of which were directed at neurological or cerebrovascular drug development. Together, these guidelines offered 55 different recommendations. Some of the most common recommendations included performance of a power calculation to determine sample size, randomized treatment allocation, and characterization of disease phenotype in the animal model prior to experimentation. By identifying the most recurrent recommendations among preclinical guidelines, we provide a starting point for developing preclinical guidelines in other disease domains. We also provide a basis for the study and evaluation of preclinical research practice. Please see later in the article for the Editors' Summary.

  20. Threats to validity in the design and conduct of preclinical efficacy studies: a systematic review of guidelines for in vivo animal experiments.

    Science.gov (United States)

    Henderson, Valerie C; Kimmelman, Jonathan; Fergusson, Dean; Grimshaw, Jeremy M; Hackam, Dan G

    2013-01-01

    The vast majority of medical interventions introduced into clinical development prove unsafe or ineffective. One prominent explanation for the dismal success rate is flawed preclinical research. We conducted a systematic review of preclinical research guidelines and organized recommendations according to the type of validity threat (internal, construct, or external) or programmatic research activity they primarily address. We searched MEDLINE, Google Scholar, Google, and the EQUATOR Network website for all preclinical guideline documents published up to April 9, 2013 that addressed the design and conduct of in vivo animal experiments aimed at supporting clinical translation. To be eligible, documents had to provide guidance on the design or execution of preclinical animal experiments and represent the aggregated consensus of four or more investigators. Data from included guidelines were independently extracted by two individuals for discrete recommendations on the design and implementation of preclinical efficacy studies. These recommendations were then organized according to the type of validity threat they addressed. A total of 2,029 citations were identified through our search strategy. From these, we identified 26 guidelines that met our eligibility criteria--most of which were directed at neurological or cerebrovascular drug development. Together, these guidelines offered 55 different recommendations. Some of the most common recommendations included performance of a power calculation to determine sample size, randomized treatment allocation, and characterization of disease phenotype in the animal model prior to experimentation. By identifying the most recurrent recommendations among preclinical guidelines, we provide a starting point for developing preclinical guidelines in other disease domains. We also provide a basis for the study and evaluation of preclinical research practice. Please see later in the article for the Editors' Summary.

  1. Development and Validation of Stability-Indicating Method for Estimation of Chlorthalidone in Bulk and Tablets with the Use of Experimental Design in Forced Degradation Experiments

    Directory of Open Access Journals (Sweden)

    Sandeep Sonawane

    2016-01-01

    Chlorthalidone was subjected to various forced degradation conditions. Substantial degradation of chlorthalidone was obtained under acid, alkali, and oxidative conditions. A full factorial experimental design was then applied to the acid and alkali forced degradation conditions, in which the strength of acid/alkali, temperature, and time of heating were the independent variables (factors) and % degradation was the dependent variable (response). Factors responsible for acid and alkali degradation were statistically evaluated using Yates analysis and a Pareto chart. Furthermore, using the response surface curve, an optimized 10% degradation was obtained. All chromatographic separation was carried out on a Phenomenex HyperClone C18 column (250 × 4.6 mm, 5 μm), using a mobile phase comprising methanol:acetonitrile:phosphate buffer (20 mM, pH 3.0 adjusted with o-phosphoric acid), 30:10:60% v/v. The flow rate was kept constant at 1 mL/min and the eluent was detected at 241 nm. In calibration curve experiments, linearity was found in the range of 2-12 μg/mL. Validation experiments proved good accuracy and precision of the method, and there was no interference from excipients or degradation products at the retention time of chlorthalidone, indicating the specificity of the method.
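The linearity claim over the 2-12 μg/mL range rests on an ordinary least-squares calibration line relating concentration to detector response. A minimal sketch of that step; the peak areas below are invented, and back-calculating an unknown from its response is shown for illustration:

```python
# Ordinary least squares for a calibration line y = m*x + c.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx  # (slope, intercept)

conc = [2, 4, 6, 8, 10, 12]           # ug/mL calibration standards
area = [41, 81, 121, 161, 201, 241]   # hypothetical peak areas at 241 nm

m, c = fit_line(conc, area)
unknown = (150 - c) / m  # back-calculate a sample with peak area 150
print(m, c, round(unknown, 2))  # -> 20.0 1.0 7.45
```

In practice the correlation coefficient of this fit, and recoveries of back-calculated standards, are what the accuracy and precision claims are judged against.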

  2. Can 3D Gamified Simulations Be Valid Vocational Training Tools for Persons with Intellectual Disability? An Experiment Based on a Real-life Situation.

    Science.gov (United States)

    von Barnekow, Ariel; Bonet-Codina, Núria; Tost, Dani

    2017-03-23

    To investigate if 3D gamified simulations can be valid vocational training tools for persons with intellectual disability, a 3D gamified simulation comprising a set of training tasks for cleaning in hostelry was developed in collaboration with professionals of a real hostel and pedagogues of a special needs school. The learning objectives focus on the acquisition of vocabulary skills, work procedures, social abilities and risk prevention. Several accessibility features were developed to make the tasks easy to do from a technological point of view. A pilot experiment was conducted to test the pedagogical efficacy of this tool with intellectually disabled workers and students. User scores in the gamified simulation follow a curve of increasing progression. When confronted with reality, participants recognized the scenario and tried to reproduce what they had learned in the simulation. Finally, they were interested in the tool, showed a strong feeling of immersion and engagement, and reported having fun. On the basis of this experiment, we believe that 3D gamified simulations can be efficient tools to train the social and professional skills of persons with intellectual disabilities, thus contributing to fostering their social inclusion through work.

  3. Health Services OutPatient Experience questionnaire: factorial validity and reliability of a patient-centered outcome measure for outpatient settings in Italy

    Directory of Open Access Journals (Sweden)

    Coluccia A

    2014-09-01

    Anna Coluccia, Fabio Ferretti, Andrea Pozza; Department of Medical Sciences, Surgery and Neurosciences, Santa Maria alle Scotte University Hospital, University of Siena, Siena, Italy. Purpose: The patient-centered approach to health care does not seem to be sufficiently developed in the Italian context, which is still characterized by the biomedical model. In addition, there is a lack of validated outcome measures to assess outpatient experience as an aspect common to a variety of settings. The current study aimed to evaluate the factorial validity, reliability, and invariance across sex of the Health Services OutPatient Experience (HSOPE) questionnaire, a short ten-item measure of patient-centeredness for Italian adult outpatients. The rationale for unidimensionality of the measure was that it could cover global patient experience as a process common to patients with a variety of diseases and irrespective of the phase of the treatment course. Patients and methods: The HSOPE was compiled by 1,532 adult outpatients (51% females; mean age 59.22 years, standard deviation 16.26) receiving care in ten facilities at the Santa Maria alle Scotte University Hospital of Siena, Italy. The sample represented all the age cohorts: 12% were young adults, 57% were adults, and 32% were older adults. Exploratory and confirmatory factor analyses were conducted to evaluate factor structure. Reliability was evaluated as internal consistency using Cronbach's α. Factor invariance was assessed through multigroup analyses. Results: Both exploratory and confirmatory analyses suggested a clearly defined unidimensional structure of the measure, with all ten items having salient loadings on a single factor. Internal consistency was excellent (α=0.95). Indices of model fit supported a single-factor structure for both male and female outpatient groups. Young adult outpatients had significantly lower scores on perceived patient-centeredness relative to older adults. No

  4. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  5. Validation of single-fluid and two-fluid magnetohydrodynamic models of the helicity injected torus spheromak experiment with the NIMROD code

    International Nuclear Information System (INIS)

    Akcay, Cihan; Victor, Brian S.; Jarboe, Thomas R.; Kim, Charlson C.

    2013-01-01

    We present a comparison study of 3-D pressureless resistive MHD (rMHD) and 3-D pressureless two-fluid MHD models of the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI). HIT-SI is a current drive experiment that uses two geometrically asymmetric helicity injectors to generate and sustain toroidal plasmas. The comparable size of the collisionless ion skin depth d_i to the resistive skin depth predicates the importance of the Hall term for HIT-SI. The simulations are run with NIMROD, an initial-value, 3-D extended MHD code. The modeled plasma density and temperature are assumed uniform and constant. The helicity injectors are modeled as oscillating normal magnetic and parallel electric field boundary conditions. The simulations use parameters that closely match those of the experiment. The simulation output is compared to the formation time, plasma current, and internal and surface magnetic fields. Results of the study indicate that 2fl-MHD shows quantitative agreement with the experiment while rMHD only captures the qualitative features. The validity of each model is assessed based on how accurately it reproduces the global quantities as well as the temporal and spatial dependence of the measured magnetic fields. 2fl-MHD reproduces the current amplification (I_tor/I_inj) and formation time τ_f demonstrated by HIT-SI with similar internal magnetic fields. rMHD underestimates I_tor/I_inj and exhibits a much longer τ_f. Biorthogonal decomposition (BD), a powerful mathematical tool for reducing large data sets, is employed to quantify how well the simulations reproduce the measured surface magnetic fields without resorting to a probe-by-probe comparison. BD shows that 2fl-MHD captures the dominant surface magnetic structures and the temporal behavior of these features better than rMHD.
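
In practice, the biorthogonal decomposition of a time × probe data matrix is computed as a singular value decomposition: the left singular vectors are temporal ("chronos") modes, the rows of the right factor are spatial ("topos") modes, and the squared singular values rank each structure's share of the signal energy. A minimal sketch on synthetic probe signals (the mode shapes, frequency, and array size below are illustrative stand-ins, not HIT-SI data):

```python
import numpy as np

# Synthetic surface-probe data: n_t time samples x n_p probes, built from
# two known spatial structures plus noise (stand-ins for measured B-field signals).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 400)
theta = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)
data = (np.outer(np.sin(2 * np.pi * 14.5 * t), np.cos(theta))          # dominant mode
        + 0.3 * np.outer(np.cos(2 * np.pi * 14.5 * t), np.sin(2 * theta))
        + 0.01 * rng.normal(size=(400, 32)))                           # probe noise

# Biorthogonal decomposition = SVD: data = sum_k s_k * chrono_k (x) topo_k
chronos, s, topos = np.linalg.svd(data, full_matrices=False)
energy = s**2 / np.sum(s**2)          # fractional energy per structure
print(energy[:2].sum())               # fraction captured by the two leading modes
```

Comparing the leading chronos/topos pairs (and their energy fractions) between simulation and measurement is what allows a model-data comparison without going probe by probe.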

  6. Validation and evaluation of the Dutch translation of the Overall Assessment of the Speaker's Experience of Stuttering for School-age children (OASES-S-D).

    Science.gov (United States)

    Lankman, Romy S; Yaruss, J Scott; Franken, Marie-Christine

    2015-09-01

    Stuttering can have a negative impact on many aspects of a speaker's life. Comprehensive assessment must therefore examine a range of experiences in order to reflect the overall impact of the disorder. This study evaluated the Dutch translation of the Overall Assessment of the Speaker's Experience of Stuttering--School-age (OASES-S; Yaruss & Quesal, 2010), which examines the impact of stuttering on the lives of children ages 7-12. The OASES-S was translated into Dutch (OASES-S-D) using a forward/backward translation process. Participants were 101 Dutch-speaking children who stutter (ages 7-12) who were recruited by speech-language therapists throughout the Netherlands. All participants completed the OASES-S-D, the Children's Attitudes about Talking-Dutch, a self-assessment of severity, a clinical assessment of severity, and a speech satisfaction rating. A control group of 51 children who do not stutter also completed the OASES-S-D to determine whether the tool could differentiate between children who stutter and children who do not stutter. All sections of the OASES-S-D except Section I surpassed a Cronbach's alpha of 0.70, indicating good internal consistency and reliability. Comparisons between the OASES-S-D and other tools revealed moderate to high associations. The OASES-S-D was able to discriminate between children who stutter and children who do not stutter and between participants with different levels of stuttering severity. The OASES-S-D appears to be a reliable and valid measure that can be used to assess the impact of stuttering on 7- to 12-year-old Dutch-speaking children who stutter. The reader will be able to: (a) describe the purpose of the OASES-S-D measurement tool; (b) summarize the translation process used in creating the OASES-S-D; (c) summarize the aspects of stuttering measured in the different sections of the OASES-S-D; (d) describe the measurement tools used to investigate the validity of the OASES-S-D; and (e) describe the differences

  7. JENDL-3.1 iron validation on the PCA-REPLICA (H2O/Fe) shielding benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Pescarini, M.; Borgia, M. G. [ENEA, Centro Ricerche "Ezio Clementel", Bologna (Italy). Dipt. Energia]

    1997-03-01

    The PCA-REPLICA (H2O/Fe) neutron shielding benchmark experiment is analysed using the SN 2-D DOT 3.5-E code and the 3-D-equivalent flux synthesis method. This engineering benchmark reproduces the ex-core radial geometry of a PWR, including a mild steel reactor pressure vessel (RPV) simulator, and is designed to test the accuracy of the calculation of the in-vessel neutron exposure parameters. This accuracy is strongly dependent on the quality of the iron neutron cross sections used to describe the nuclear reactions within the RPV simulator. In particular, in this report, the cross sections based on the JENDL-3.1 iron data files are tested through a comparison of the calculated integral and spectral results with the corresponding experimental data. In addition, the present results are compared, on the same benchmark experiment, with those of a preceding ENEA-Bologna validation of the ENDF/B-VI iron cross sections. The integral result comparison indicates that, for all the threshold detectors considered (Rh-103(n,n')Rh-103m, In-115(n,n')In-115m and S-32(n,p)P-32), the JENDL-3.1 natural iron data produce satisfactory results similar to those obtained with the ENDF/B-VI iron data. On the contrary, when the JENDL-3.1 Fe-56 data file is used, strongly underestimated results are obtained for the lower energy threshold detectors, Rh-103 and In-115. This effect becomes more evident with increasing neutron penetration depth in the RPV simulator.

  8. Development and validation of a critical gradient energetic particle driven Alfven eigenmode transport model for DIII-D tilted neutral beam experiments

    Science.gov (United States)

    Waltz, R. E.; Bass, E. M.; Heidbrink, W. W.; VanZeeland, M. A.

    2015-11-01

    Recent experiments with the DIII-D tilted neutral beam injection (NBI) varying the beam energetic particle (EP) source profiles have provided strong evidence that unstable Alfven eigenmodes (AE) drive stiff EP transport at a critical EP density gradient [Heidbrink et al 2013 Nucl. Fusion 53 093006]. Here the critical gradient is identified by the local AE growth rate being equal to the local ITG/TEM growth rate at the same low toroidal mode number. The growth rates are taken from the gyrokinetic code GYRO. Simulations show that the slowing-down beam-like EP distribution has a slightly lower critical gradient than the Maxwellian. The ALPHA EP density transport code [Waltz and Bass 2014 Nucl. Fusion 54 104006], used to validate the model, combines the low-n stiff EP critical density gradient AE mid-core transport with the Angioni et al (2009 Nucl. Fusion 49 055013) energy-independent high-n ITG/TEM density transport model controlling the central-core EP density profile. For the on-axis NBI heated DIII-D shot 146102, while the net loss to the edge is small, about half the birth fast ions are transported from the central core r/a < 0.5 and the central density is about half the slowing-down density. These results are in good agreement with experimental fast ion pressure profiles inferred from MSE-constrained EFIT equilibria.

  9. Uranium and thorium loadings determined by chemical and nondestructive methods in HTGR fuel rods for the Fort St. Vrain Early Validation Irradiation Experiment

    International Nuclear Information System (INIS)

    Angelini, P.; Rushton, J.E.

    1979-01-01

    The Fort St. Vrain Early Validation Irradiation Experiment is an irradiation test of reference and improved High-Temperature Gas-Cooled Reactor fuels in the Fort St. Vrain Reactor. The irradiation test includes fuel rods fabricated at ORNL on an engineering-scale fuel rod molding machine. Fuel rods were nondestructively assayed for 235U content by a technique based on the detection of prompt fission neutrons induced by thermal-neutron interrogation, and were later chemically assayed using the modified Davies-Gray potentiometric titration method. The thorium content was determined chemically by a volumetric titration method. The chemical assay method for uranium was evaluated, and the results from the as-molded fuel rods agree with those from: (1) large samples of Triso-coated fissile particles, (2) physical mixtures of the three particle types, and (3) standard solutions to within 0.05%. Standard fuel rods were fabricated in order to evaluate and calibrate the nondestructive assay device. The agreement of the results from the calibration methods was within 0.6%. The precision of the nondestructive assay device was established as approximately 0.6% by repeated measurements of standard rods and was comparable to that estimated by Poisson statistics. A relative difference of 0.77 to 1.5% was found between the nondestructive and chemical determinations on the reactor-grade fuel rods.

  10. Explicating Validity

    Science.gov (United States)

    Kane, Michael T.

    2016-01-01

    How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…

  11. Ground water contamination with (238)U, (234)U, (235)U, (226)Ra and (210)Pb from past uranium mining: Cove Wash, Arizona.

    Science.gov (United States)

    Dias da Cunha, Kenya Moore; Henderson, Helenes; Thomson, Bruce M; Hecht, Adam A

    2014-06-01

    in the majority of the water samples, indicating more than one source of contamination could contribute to the sampled sources. The effective doses due to ingestion of the minimum uranium concentrations in water samples exceed the average dose considering inhalation and ingestion of regular diet for other populations around the world (1 μSv/year). The maximum doses due to ingestion of (238)U or (234)U were above the international limit for effective dose for members of the public (1 mSv/year), except for inhabitants of two chapters. The highest effective dose was estimated for inhabitants of Cove, and it was almost 20 times the international limit for members of the public. These results indicate that ingestion of water from some of the sampled sources poses health risks.
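
The effective doses quoted above follow from a simple chain: activity concentration × annual water intake × an ingestion dose coefficient. A sketch of the arithmetic (the concentration and intake rate below are hypothetical illustration values, not data from the study; the 238U dose coefficient is the ICRP Publication 72 adult ingestion value, cited here as an assumption):

```python
# Committed effective dose from drinking-water ingestion:
#   D [Sv/yr] = C [Bq/L] * I [L/day] * 365 [day/yr] * e_ing [Sv/Bq]
C_U238 = 1.2         # hypothetical activity concentration, Bq/L
INTAKE = 2.0         # assumed drinking-water intake, L/day
E_ING_U238 = 4.5e-8  # ingestion dose coefficient, Sv/Bq (ICRP 72, adults)

dose_sv_per_year = C_U238 * INTAKE * 365 * E_ING_U238
print(f"{dose_sv_per_year * 1e3:.3f} mSv/yr")  # compare with the 1 mSv/yr public limit
```

Summing such terms over each radionuclide present in a source, and comparing against the 1 mSv/yr limit for members of the public, is how the exceedances reported for the Cove chapter area are identified.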

  12. Study of atmospheric stratification influence on pollutant dispersion using a computational fluid dynamics model: Code-Saturne validation with the Prairie Grass experiment

    International Nuclear Information System (INIS)

    Coulon, Fanny

    2010-09-01

    A validation of Code-Saturne, a computational fluid dynamics model developed by EDF, is proposed for stable atmospheric conditions. The goal is to establish the performance of the model so that it can be used for impact studies. A comparison with data from the Prairie Grass field experiment and with two Gaussian plume models is presented.

  13. Air Traffic Management Technology Demonstration Phase 1 (ATD-1) Interval Management for Near-Term Operations Validation of Acceptability (IM-NOVA) Experiment

    Science.gov (United States)

    Kibler, Jennifer L.; Wilson, Sara R.; Hubbs, Clay E.; Smail, James W.

    2015-01-01

    The Interval Management for Near-term Operations Validation of Acceptability (IM-NOVA) experiment was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) in support of the NASA Airspace Systems Program's Air Traffic Management Technology Demonstration-1 (ATD-1). ATD-1 is intended to showcase an integrated set of technologies that provide an efficient arrival solution for managing aircraft using Next Generation Air Transportation System (NextGen) surveillance, navigation, procedures, and automation for both airborne and ground-based systems. The goal of the IM-NOVA experiment was to assess whether procedures outlined by the ATD-1 Concept of Operations were acceptable to and feasible for use by flight crews in a voice communications environment when used with a minimum set of Flight Deck-based Interval Management (FIM) equipment and a prototype crew interface. To investigate an integrated arrival solution using ground-based air traffic control tools and aircraft Automatic Dependent Surveillance-Broadcast (ADS-B) tools, the LaRC FIM system and the Traffic Management Advisor with Terminal Metering and Controller Managed Spacing tools developed at the NASA Ames Research Center (ARC) were integrated into LaRC's Air Traffic Operations Laboratory (ATOL). Data were collected from 10 crews of current 757/767 pilots asked to fly a high-fidelity, fixed-base simulator during scenarios conducted within an airspace environment modeled on the Dallas-Fort Worth (DFW) Terminal Radar Approach Control area. The aircraft simulator was equipped with the Airborne Spacing for Terminal Area Routes (ASTAR) algorithm and a FIM crew interface consisting of electronic flight bags and ADS-B guidance displays. Researchers used "pseudo-pilot" stations to control 24 simulated aircraft that provided multiple air traffic flows into the DFW International Airport, and recently retired DFW air traffic controllers served as confederate Center, Feeder, Final

  14. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    International Nuclear Information System (INIS)

    He, Xun

    2016-01-01

    The Molten Salt Reactor (MSR), confirmed as one of the six Generation IV reactor types by the GIF (Generation IV International Forum) in 2008, has recently drawn a great deal of attention around the world. Due to its use of liquid fuel, the MSR can be regarded as the most distinctive of the six GEN-IV reactor types. A unique advantage of using liquid nuclear fuel is that core melting accidents can be thoroughly eliminated. Besides, a molten salt reactor has several fuel options: the fuel can be based on the 235U, 232Th-233U, or 238U-239Pu cycle, or even on spent nuclear fuel (SNF), so the reactor can be operated as a breeder or as an actinides burner with a fast, thermal, or epithermal neutron spectrum; hence, it has excellent features for fuel sustainability and non-proliferation. Furthermore, the lower operating pressure not only means a lower risk of explosion and radioactive leakage, but also implies that the reactor vessel and its components can be lightweight, thus lowering equipment costs. So far no commercial MSR has been operated. However, the MSR concept and its technical validation date back to the 1960s and 1970s, when scientists and engineers at ORNL (Oak Ridge National Laboratory) in the United States built and ran the world's first civilian molten salt reactor, the MSRE (Molten Salt Reactor Experiment). The MSRE was an experimental liquid-fueled reactor with 10 MW thermal output using 7LiF-BeF2-ZrF4-UF4 as both the fuel and the coolant. The MSRE is usually taken as a very important reference case by current research efforts to validate their codes and simulations, and it likewise serves as a benchmark for this thesis. The current thesis consists of two main parts.
The first part is about the validation of the current code for the old MSRE concept, while the second one is about the demonstration of a new

  15. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    Energy Technology Data Exchange (ETDEWEB)

    He, Xun

    2016-06-14

    The Molten Salt Reactor (MSR), confirmed as one of the six Generation IV reactor types by the GIF (Generation IV International Forum) in 2008, has recently drawn a great deal of attention around the world. Due to its use of liquid fuel, the MSR can be regarded as the most distinctive of the six GEN-IV reactor types. A unique advantage of using liquid nuclear fuel is that core melting accidents can be thoroughly eliminated. Besides, a molten salt reactor has several fuel options: the fuel can be based on the 235U, 232Th-233U, or 238U-239Pu cycle, or even on spent nuclear fuel (SNF), so the reactor can be operated as a breeder or as an actinides burner with a fast, thermal, or epithermal neutron spectrum; hence, it has excellent features for fuel sustainability and non-proliferation. Furthermore, the lower operating pressure not only means a lower risk of explosion and radioactive leakage, but also implies that the reactor vessel and its components can be lightweight, thus lowering equipment costs. So far no commercial MSR has been operated. However, the MSR concept and its technical validation date back to the 1960s and 1970s, when scientists and engineers at ORNL (Oak Ridge National Laboratory) in the United States built and ran the world's first civilian molten salt reactor, the MSRE (Molten Salt Reactor Experiment). The MSRE was an experimental liquid-fueled reactor with 10 MW thermal output using 7LiF-BeF2-ZrF4-UF4 as both the fuel and the coolant. The MSRE is usually taken as a very important reference case by current research efforts to validate their codes and simulations, and it likewise serves as a benchmark for this thesis. The current thesis consists of two main parts. The first part is about the validation of the current code for the old MSRE concept, while the second

  16. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and in-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of PWR, BWR, CANDU and VVER reactors. It also provides an overview of ex-vessel corium retention (the core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA, and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories (as done for the phenomena). An experiment synopsis is provided for each test.
Along with a test description

  17. A Web-based Google-Earth Coincident Imaging Tool for Satellite Calibration and Validation

    Science.gov (United States)

    Killough, B. D.; Chander, G.; Gowda, S.

    2009-12-01

    The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS) to meet the needs of its nine "Societal Benefit Areas", of which the most demanding, in terms of accuracy, is climate. To accomplish this vision, satellite on-orbit and ground-based calibration and validation (Cal/Val) of Earth observation measurements are critical to our scientific understanding of the Earth system. Existing tools supporting space mission Cal/Val are often developed for specific campaigns or events, with little attention to broad application. This paper describes a web-based Google Earth tool for the calculation of coincident satellite observations, intended to support a diverse international group of satellite missions and to improve data continuity, interoperability, and data fusion. The Committee on Earth Observation Satellites (CEOS), which includes 28 space agencies and 20 other national and international organizations, is currently operating and planning over 240 Earth observation satellites over the next 15 years. The technology described here will better enable the use of multiple sensors to promote increased coordination toward a GEOSS. The CEOS Systems Engineering Office (SEO) and the Working Group on Calibration and Validation (WGCV) support the development of the CEOS Visualization Environment (COVE) tool to enhance international coordination of data exchange, mission planning, and Cal/Val events. The objective is to develop a simple and intuitive application that leverages the capabilities of Google Earth on the web to display satellite sensor coverage areas and to identify coincident scene locations, along with dynamic menus for flexibility and content display. Key features and capabilities include user-defined evaluation periods (start and end dates), regions of interest (rectangular areas), and multi-user collaboration. Users can select two or more CEOS missions from a
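
At its core, a coincident-observation search like COVE's reduces to finding overpasses of two sensors at the same site that fall within a time window inside a user-defined evaluation period. A toy sketch under that simplification (COVE itself propagates real orbits and swath geometries; the pass times, window, and function name below are illustrative assumptions):

```python
from datetime import datetime, timedelta

def coincident_overpasses(times_a, times_b, max_gap=timedelta(hours=1)):
    """Return (a, b) pairs of overpass times of a shared site within max_gap.

    A toy stand-in for the coincidence search COVE performs: real orbit
    propagation and swath-footprint intersection are replaced here by
    precomputed per-sensor pass times over one region of interest.
    """
    pairs = []
    for ta in times_a:
        for tb in times_b:
            if abs(ta - tb) <= max_gap:
                pairs.append((ta, tb))
    return pairs

# Hypothetical pass times for two sensors over the same region of interest:
a = [datetime(2009, 12, 1, 10, 30), datetime(2009, 12, 17, 10, 42)]
b = [datetime(2009, 12, 1, 10, 55), datetime(2009, 12, 9, 11, 10)]
print(len(coincident_overpasses(a, b)))
```

Only the 1 December passes fall within an hour of each other, so a single coincidence is reported; tightening `max_gap` is how a Cal/Val team trades match count against atmospheric-change error between acquisitions.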

  18. FACTAR validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Wadsworth, S.L.; Rock, R.C.; Sills, H.E.; Langman, V.J.

    1995-01-01

    A detailed strategy to validate fuel channel thermal mechanical behaviour codes for use in current power reactor safety analysis is presented. The strategy is derived from a validation process that has recently been adopted industry wide. The discussion focuses on the validation plan for the code FACTAR, for application in assessing fuel channel integrity safety concerns during a large-break loss-of-coolant accident (LOCA). (author)

  19. Validation of HNO3, ClONO2, and N2O5 from the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS)

    Directory of Open Access Journals (Sweden)

    P. Raspollini

    2008-07-01

    Full Text Available The Atmospheric Chemistry Experiment (ACE) satellite was launched on 12 August 2003. Its two instruments measure vertical profiles of over 30 atmospheric trace gases by analyzing solar occultation spectra in the ultraviolet/visible and infrared wavelength regions. The reservoir gases HNO3, ClONO2, and N2O5 are three of the key species provided by the primary instrument, the ACE Fourier Transform Spectrometer (ACE-FTS). This paper describes the ACE-FTS version 2.2 data products, including the N2O5 update, for the three species and presents validation comparisons with available observations. We have compared volume mixing ratio (VMR) profiles of HNO3, ClONO2, and N2O5 with measurements by other satellite instruments (SMR, MLS, MIPAS), aircraft measurements (ASUR), and single balloon flights (SPIRALE, FIRS-2). Partial columns of HNO3 and ClONO2 were also compared with measurements by ground-based Fourier Transform Infrared (FTIR) spectrometers. Overall the quality of the ACE-FTS v2.2 HNO3 VMR profiles is good from 18 to 35 km. For the statistical satellite comparisons, the mean absolute differences are generally within ±1 ppbv (±20%) from 18 to 35 km. For the MIPAS and MLS comparisons only, mean relative differences lie within ±10% between 10 and 36 km. ACE-FTS HNO3 partial columns (~15–30 km) show a slight negative bias of −1.3% relative to the ground-based FTIRs at latitudes ranging from 77.8° S to 76.5° N. Good agreement is seen between ACE-FTS ClONO2 and MIPAS using the Institut für Meteorologie und Klimaforschung and Instituto de Astrofísica de Andalucía (IMK-IAA) data processor. Mean absolute differences are typically within ±0.01 ppbv between 16 and 27 km and less than +0.09 ppbv between 27 and 34 km. The ClONO2 partial column comparisons show varying degrees of agreement, depending on the location and the quality of the FTIR measurements. Good agreement was found for the comparisons with the midlatitude Jungfraujoch partial columns for which

  20. Sua Pan surface bidirectional reflectance: a validation experiment of the Multi-angle Imaging SpectroRadiometer (MISR) during SAFARI 2000

    Science.gov (United States)

    Abdou, Wedad A.; Pilorz, Stuart H.; Helmlinger, Mark C.; Diner, David J.; Conel, James E.; Martonchik, John V.; Gatebe, Charles K.; King, Michael D.; Hobbs, Peter V.

    2004-01-01

    The Southern Africa Regional Science Initiative (SAFARI 2000) dry season campaign was carried out during August and September 2000, at the peak of biomass burning. The intensive ground-based and airborne measurements in this campaign provided a unique opportunity to validate space sensors, such as the Multi-angle Imaging SpectroRadiometer (MISR), onboard NASA's EOS Terra platform.

  1. Linking Recognition Practices and National Qualifications Frameworks: International Benchmarking of Experiences and Strategies on the Recognition, Validation and Accreditation (RVA) of Non-Formal and Informal Learning

    Science.gov (United States)

    Singh, Madhu, Ed.; Duvekot, Ruud, Ed.

    2013-01-01

    This publication is the outcome of the international conference organized by UNESCO Institute for Lifelong Learning (UIL), in collaboration with the Centre for Validation of Prior Learning at Inholland University of Applied Sciences, the Netherlands, and in partnership with the French National Commission for UNESCO that was held in Hamburg in…

  2. Analysis of the Reliability and Validity of the Course Experience Questionnaire (CEQ)

    Directory of Open Access Journals (Sweden)

    Carlos González

    2012-04-01

    Full Text Available The objective of this study is to analyze the validity of the Course Experience Questionnaire (CEQ), an instrument used to assess students' perceptions of the quality of learning in higher education. The questionnaire was translated and administered to 325 engineering students at a public university in the metropolitan region of Chile. Descriptive statistics were generated, and both the reliability levels and the validity analyses showed largely adequate results. The CEQ can be used to measure teaching quality in Latin American universities, and its use for research purposes is suggested. Further research should continue the validation process and incorporate other variables considered key by the Student Learning Research tradition to investigate the learning experience of university students.

  3. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pbex measurements

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.

    2012-01-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides, and more particularly caesium-137 (137Cs) and excess lead-210 (210Pbex), has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pbex measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape, and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed.

  4. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    Science.gov (United States)

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 ((137)Cs) and excess lead-210 ((210)Pb(ex)) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from (137)Cs and (210)Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Measuring the Pros and Cons of What It Means to Be a Black Man: Development and Validation of the Black Men's Experiences Scale (BMES)

    OpenAIRE

    Bowleg, Lisa; English, Devin; del Rio-Gonzalez, Ana Maria; Burkholder, Gary J.; Teti, Michelle; Tschann, Jeanne M.

    2016-01-01

    Although extensive research documents that Black people in the U.S. frequently experience social discrimination, most of this research aggregates these experiences primarily or exclusively by race. Consequently, empirical gaps exist about the psychosocial costs and benefits of Black men's experiences at the intersection of race and gender. Informed by intersectionality, a theoretical framework that highlights how multiple social identities intersect to reflect interlocking social-structural i...

  6. Validation of seismic soil-structure interaction analysis methods: EPRI [Electric Power Research Institute]/NRC [Nuclear Regulatory Commission] cooperation in Lotung, Taiwan, experiments

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.; Tang, Y.K.; Kassawara, R.P.

    1986-01-01

    The cooperative program between NRC/ANL and EPRI on the validation of soil-structure interaction analysis methods with actual seismic response data is described. A large-scale model of a containment building has been built by EPRI/Taipower in a highly seismic region of Taiwan. Vibration tests were performed, first on the basemat before the superstructure was built and then on the completed structure. Since its completion, the structure has experienced many earthquakes. The site and structural response to these earthquakes have been recorded with field (surface and downhole) and structural instrumentation. The validation program involves blind predictions of site and structural response during vibration tests and a selected seismic event, and subsequent comparison between the predictions and measurements. The predictive calculations are in progress. The results of the correlation are expected to lead to an evaluation of the conservatisms and sensitivities of the methods.

  7. Solar Tower Experiments for Radiometric Calibration and Validation of Infrared Imaging Assets and Analysis Tools for Entry Aero-Heating Measurements

    Science.gov (United States)

    Splinter, Scott C.; Daryabeigi, Kamran; Horvath, Thomas J.; Mercer, David C.; Ghanbari, Cheryl M.; Ross, Martin N.; Tietjen, Alan; Schwartz, Richard J.

    2008-01-01

    The NASA Engineering and Safety Center sponsored Hypersonic Thermodynamic Infrared Measurements assessment team has a task to perform radiometric calibration and validation of land-based and airborne infrared imaging assets and tools for remote thermographic imaging. The IR assets and tools will be used for thermographic imaging of the Space Shuttle Orbiter during entry aero-heating to provide flight boundary layer transition thermography data that could be utilized for calibration and validation of empirical and theoretical aero-heating tools. A series of tests at the Sandia National Laboratories National Solar Thermal Test Facility were designed for this task where reflected solar radiation from a field of heliostats was used to heat a 4 foot by 4 foot test panel consisting of LI 900 ceramic tiles located on top of the 200 foot tall Solar Tower. The test panel provided an Orbiter-like entry temperature for the purposes of radiometric calibration and validation. The Solar Tower provided an ideal test bed for this series of radiometric calibration and validation tests because it had the potential to rapidly heat the large test panel to spatially uniform and non-uniform elevated temperatures. Also, the unsheltered-open-air environment of the Solar Tower was conducive to obtaining unobstructed radiometric data by land-based and airborne IR imaging assets. Various thermocouples installed on the test panel and an infrared imager located in close proximity to the test panel were used to obtain surface temperature measurements for evaluation and calibration of the radiometric data from the infrared imaging assets. The overall test environment, test article, test approach, and typical test results are discussed.

  8. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Criticality Experiments

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008, the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine-Brazilian Agreement (COBEN), included, among many other projects, “Validation and Verification of Calculation Methods used for Research and Experimental Reactors”. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA and those implemented in MCNP by CNEA and IPEN. The necessary data for these validations would correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01 located in São Paulo, Brazil. On the Argentine side, the staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor, which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper results for critical configurations are shown. (author)

  9. Validation suite for MCNP

    International Nuclear Information System (INIS)

    Mosteller, Russell D.

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite have been constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, sets of validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments. Consequently, the results from the measurements are reliable and quantifiable, and calculated results can be compared with them in a meaningful way. Currently, validation suites exist for criticality and radiation-shielding applications.

  10. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    OpenAIRE

    Saurabh B. Ganorkar; Dinesh M. Dhumal; Atul A. Shirkhedkar

    2017-01-01

    A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Development of the method and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation was achieved on a LC – GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) by isocratic mode at ambie...

  11. DebrisInterMixing-2.3: a finite volume solver for three-dimensional debris-flow simulations with two calibration parameters – Part 2: Model validation with experiments

    Directory of Open Access Journals (Sweden)

    A. von Boetticher

    2017-11-01

    Full Text Available Here, we present validation tests of the fluid dynamic solver presented in von Boetticher et al. (2016), simulating both laboratory-scale and large-scale debris-flow experiments. The new solver combines a Coulomb viscoplastic rheological model with a Herschel–Bulkley model based on material properties and rheological characteristics of the analyzed debris flow. For the selected experiments in this study, all necessary material properties were known – the content of sand, clay (including its mineral composition) and gravel, as well as the water content and the angle of repose of the gravel. Given these properties, two model parameters are sufficient for calibration, and a range of experiments with different material compositions can be reproduced by the model without recalibration. One calibration parameter, the Herschel–Bulkley exponent, was kept constant for all simulations. The model validation focuses on different case studies illustrating the sensitivity of debris flows to water and clay content, channel curvature, channel roughness and the angle of repose. We characterize the accuracy of the model using experimental observations of flow head positions, front velocities, run-out patterns and basal pressures.
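    The Herschel–Bulkley rheology mentioned in this record relates shear stress to shear rate through a yield stress, a consistency index K and a flow exponent n. A minimal sketch of that constitutive law follows; the parameter values are illustrative and are not those calibrated in the study.

```python
def herschel_bulkley_stress(gamma_dot, tau0, K, n):
    """Herschel-Bulkley constitutive law: tau = tau0 + K * gamma_dot**n.
    Below the yield stress tau0 the material does not flow."""
    return tau0 + K * gamma_dot ** n

# Illustrative parameters: yield stress 50 Pa, consistency 10 Pa*s^n, exponent 0.4.
tau = herschel_bulkley_stress(gamma_dot=2.0, tau0=50.0, K=10.0, n=0.4)
print(round(tau, 2))  # 50 + 10 * 2**0.4
```

    An exponent n < 1 gives the shear-thinning behaviour typical of clay-rich debris-flow material; n = 1 recovers a Bingham fluid.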

  12. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    Directory of Open Access Journals (Sweden)

    Saurabh B. Ganorkar

    2017-02-01

    Full Text Available A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Development of the method and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation was achieved on a LC – GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) in isocratic mode at ambient temperature, employing a mobile phase of methanol and orthophosphoric acid (0.2%, v/v) in a ratio of 80:20 (v/v) at a flow rate of 1.0 mL min−1 and detection at 260 nm. ‘Design of Experiments’ (DOE) employing ‘Central Composite Design’ (CCD) and ‘Response Surface Methodology’ (RSM) were applied as an advancement over the traditional ‘One Variable at a Time’ (OVAT) approach to evaluate the effects of variations in selected factors (methanol content, flow rate, concentration of orthophosphoric acid); graphical interpretation for robustness and statistical interpretation were achieved with Multiple Linear Regression (MLR) and ANOVA. The method succeeded over the validation parameters: linearity, precision, accuracy, limit of detection, limit of quantitation, and robustness. The method was applied effectively for the analysis of in-house zileuton tablets.
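    The Central Composite Design cited in this record augments a two-level factorial with axial and centre points. A minimal sketch of how the coded-unit run list for the three robustness factors (methanol content, flow rate, acid concentration) could be generated; a face-centred design with alpha = 1 is assumed, since the study's actual levels and alpha are not given here.

```python
from itertools import product

def central_composite(k, alpha=1.0, n_center=3):
    """Coded-unit points of a central composite design for k factors:
    2^k factorial corners, 2k axial points at +/-alpha, n_center centre points."""
    factorial = [list(p) for p in product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

# Three factors: methanol content, flow rate, orthophosphoric acid concentration.
design = central_composite(3)
print(len(design))  # 8 factorial + 6 axial + 3 centre = 17 runs
```

    Each coded point is then mapped to real factor levels (e.g. -1/0/+1 around the nominal methanol content) before fitting the response-surface model.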

  13. Content validity and its estimation

    Directory of Open Access Journals (Sweden)

    Yaghmale F

    2003-04-01

    Full Text Available Background: Measuring the content validity of instruments is important. This type of validity can help to ensure construct validity and give confidence to readers and researchers about instruments. Content validity refers to the degree to which the instrument covers the content that it is supposed to measure. For content validity two judgments are necessary: the measurable extent of each item for defining the traits and the set of items that represents all aspects of the traits. Purpose: To develop a content-valid scale for assessing experience with computer usage. Methods: First, a review of 2 volumes of the International Journal of Nursing Studies was conducted; only 1 article out of 13 documented content validity, and it did so by a 4-point content validity index (CVI) and the judgment of 3 experts. Then a scale with 38 items was developed. The experts were asked to rate each item based on relevance, clarity, simplicity and ambiguity on the four-point scale. The Content Validity Index (CVI) for each item was determined. Result: Of 38 items, those with CVI over 0.75 remained and the rest were discarded, resulting in a 25-item scale. Conclusion: Although documenting the content validity of an instrument may seem expensive in terms of time and human resources, its importance warrants greater attention when a valid assessment instrument is to be developed. Keywords: Content Validity, Measuring Content Validity
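    The item-level CVI used in this record has a simple computation: each expert rates an item's relevance on a 4-point scale, and the CVI is the proportion of experts giving a rating of 3 or 4, with items retained when the CVI exceeds 0.75. A minimal sketch (the ratings below are invented for illustration):

```python
def item_cvi(ratings):
    """Item-level Content Validity Index: proportion of experts
    rating the item 3 or 4 on a 4-point relevance scale."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

# Three experts rate a 4-item draft scale (ratings 1-4).
ratings_by_item = [
    [4, 4, 3],  # CVI = 1.00 -> retained
    [4, 3, 2],  # CVI ~ 0.67 -> discarded (below the 0.75 cutoff)
    [3, 3, 4],  # CVI = 1.00 -> retained
    [2, 1, 3],  # CVI ~ 0.33 -> discarded
]
retained = [i for i, r in enumerate(ratings_by_item) if item_cvi(r) > 0.75]
print(retained)  # -> [0, 2]
```

    With only 3 experts the CVI takes coarse values (0, 1/3, 2/3, 1), which is why larger expert panels are often recommended for this index.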

  14. Optimized set of two-dimensional experiments for fast sequential assignment, secondary structure determination, and backbone fold validation of 13C/15N-labelled proteins

    International Nuclear Information System (INIS)

    Bersch, Beate; Rossy, Emmanuel; Coves, Jacques; Brutscher, Bernhard

    2003-01-01

    NMR experiments are presented which allow backbone resonance assignment, secondary structure identification, and in favorable cases also molecular fold topology determination from a series of two-dimensional 1H-15N HSQC-like spectra. The 1H-15N correlation peaks are frequency shifted by an amount ±ωX along the 15N dimension, where ωX is the Cα, Cβ, or Hα frequency of the same or the preceding residue. Because of the low dimensionality (2D) of the experiments, high-resolution spectra are obtained in a short overall experimental time. The whole series of seven experiments can be performed in typically less than one day. This approach significantly reduces experimental time when compared to the standard 3D-based methods. The methodology presented here is thus especially appealing in the context of high-throughput NMR studies of protein structure, dynamics or molecular interfaces.

  15. Calculation methodology validation. Pt. 2/01-R. Calculation of the multiplication factor for eight experiments with a critical set of nineteen VVER-440 fuel assemblies

    International Nuclear Information System (INIS)

    Kyncl, J.

    2001-04-01

    Comparison calculations were performed for 8 experiments accomplished in 2000 on the LR-0 reactor. The MCNP4a code was applied using effective cross section data in the continuous representation as per the ENDF/B-VI library. (P.A.)

  16. Medical application and clinical validation for reliable and trustworthy physiological monitoring using functional textiles: experience from the HeartCycle and MyHeart project.

    Science.gov (United States)

    Reiter, Harald; Muehlsteff, Jens; Sipilä, Auli

    2011-01-01

    Functional textiles are seen as a promising technology to enable healthcare services and medical care outside hospitals, due to their ability to integrate textile-based sensing and monitoring technologies into daily life. In the past, much effort has been spent on basic functional textile research, already showing that reliable monitoring solutions can be realized. The challenge remains to find and develop suitable medical applications and to fulfil the boundary conditions for medical endorsement and exploitation. The HeartCycle vest described in this abstract will serve as an example of a functional textile carefully developed according to the requirements of a specific medical application, its clinical validation, the related certification aspects and the next improvement steps towards exploitation.

  17. Intercenter validation of a knowledge based model for automated planning of volumetric modulated arc therapy for prostate cancer. The experience of the German RapidPlan Consortium.

    Directory of Open Access Journals (Sweden)

    Carolin Schubert

    Full Text Available To evaluate the performance of a model-based optimisation process for volumetric modulated arc therapy applied to prostate cancer in a multicentric cooperative group. The RapidPlan (RP) knowledge-based engine was tested for the planning of volumetric modulated arc therapy with RapidArc on prostate cancer patients. The study was conducted in the frame of the German RapidPlan Consortium (GRC). Forty-three patients from one institute of the GRC were used to build and train a RP model. This was further shared with all members of the GRC plus an external site from a different country to increase the heterogeneity of the patient sampling. An in silico multicentric validation of the model was performed at the planning level by comparing RP against reference plans optimized according to institutional procedures. A total of 60 patients from 7 institutes were used. On average, the automated RP-based plans were fully consistent with the manually optimised set, with a modest tendency to improvement in the medium-to-high dose region. A per-site stratification allowed the identification of different patterns of performance of the model, with some organs at risk better spared by the manual approach and some by the automated approach, but in all cases the RP data fulfilled the clinical acceptability requirements. Discrepancies in performance were due to different contouring protocols or to different emphasis put on the optimization of the manual cases. The multicentric validation demonstrated that it was possible to satisfactorily optimize patients from all participating centres with the knowledge-based model. In the presence of possibly significant differences in the contouring protocols, the automated plans, though acceptable and fulfilling the benchmark goals, might benefit from further fine-tuning of the constraints.
    The study demonstrates that, at least for the case of prostate cancer patients, it is possible to share models among different clinical institutes in a cooperative

  18. Validation of the Long-term Difficulties Inventory (LDI) and the List of Threatening Experiences (LTE) as measures of stress in epidemiological population-based cohort studies.

    Science.gov (United States)

    Rosmalen, J G M; Bos, E H; de Jonge, P

    2012-12-01

    Stress questionnaires are included in many epidemiological cohort studies but the psychometric characteristics of these questionnaires are largely unknown. The aim of this study was to describe these characteristics for two short questionnaires measuring the lifetime and past year occurrence of stress: the List of Threatening Events (LTE) as a measure of acute stress and the Long-term Difficulties Inventory (LDI) as a measure of chronic stress. This study was performed in a general population cohort consisting of 588 females (53.7%) and 506 males (46.3%), with a mean age of 53.5 years (s.d.=11.3 years). Respondents completed the LTE and the LDI for the past year, and for the age categories of 0-12, 13-18, 19-39, 40-60, and >60 years. They also completed questionnaires on perceived stress, psychological distress (the General Health Questionnaire, GHQ-12), anxiety and depression (the Symptom Checklist, SCL-8) and neuroticism (the Eysenck Personality Questionnaire - Revised Short Scale, EPQ-RSS-N). Approximately 2 years later, 976 respondents (89%) completed these questionnaires for a second time. The stability of the retrospective reporting of long-term difficulties and life events was satisfactory: 0.7 for the lifetime LDI and 0.6 for the lifetime LTE scores. The construct validity of these lists is indicated by their positive associations with psychological distress, mental health problems and neuroticism. This study in a large population-based sample shows that the LDI and LTE have sufficient validity and stability to include them in major epidemiological cohort studies.

  19. Re-interpretation of the ERMINE-V experiment validation of fission product integral cross section in the fast energy range

    Science.gov (United States)

    Ros, Paul; Leconte, Pierre; Blaise, Patrick; Naymeh, Laurent

    2017-09-01

    The current knowledge of nuclear data in the fast neutron energy range is not as good as in the thermal range, resulting in larger propagated uncertainties in integral quantities such as critical masses or reactivity effects. This situation makes it difficult to get the full benefit from recent advances in modeling and simulation. Zero power facilities such as the French ZPR MINERVE have already demonstrated that they can contribute to significantly reduce those uncertainties thanks to dedicated experiments. Historically, MINERVE has been mainly dedicated to thermal spectrum studies. However, experiments involving fast-thermal coupled cores were also performed in MINERVE as part of the ERMINE program, in order to improve nuclear data in fast spectra for the two French SFRs: PHENIX and SUPERPHENIX. Some of those experiments have been recently revisited. In particular, a full characterization of ZONA-1 and ZONA-3, two different cores loaded in the ERMINE V campaign, has been done, with much attention paid to possible sources of errors. It includes detailed geometric descriptions, energy profiles of the direct and adjoint fluxes and spectral indices obtained thanks to Monte Carlo calculations and compared to a reference fast core configuration. Sample oscillation experiments of separated fission products such as 103Rh or 99Tc, which were part of the ERMINE V program, have been simulated using recently-developed options in the TRIPOLI-4 code and compared to the experimental values. The present paper describes the corresponding results. The findings motivate in-depth studies for designing optimized coupled-core conditions in ZEPHYR, a new ZPR which will replace MINERVE and will provide integral data to meet the needs of Gen-III and Gen-IV reactors.

  20. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  1. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  2. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  3. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and provides risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care with those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated-risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care in the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  4. A CFD validation roadmap for hypersonic flows

    Science.gov (United States)

    Marvin, Joseph G.

    1993-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments would provide the needed validation data.

  5. Forward modeling of fluctuating dietary 13C signals to validate 13C turnover models of milk and milk components from a diet-switch experiment.

    Directory of Open Access Journals (Sweden)

    Alexander Braun

    Isotopic variation of foodstuffs propagates through trophic systems, but this variation is dampened in each trophic step due to buffering effects of metabolic and storage pools. Thus, understanding isotopic variation in trophic systems requires knowledge of isotopic turnover. In animals, turnover is usually quantified in diet-switch experiments under controlled conditions. Such experiments usually involve changes in diet chemical composition, which may affect turnover. Furthermore, it is uncertain whether diet-switch-based turnover models are applicable under conditions with randomly fluctuating dietary input signals. Here, we investigate whether turnover information derived from diet-switch experiments with dairy cows can predict the isotopic composition of metabolic products (milk, milk components and feces) under natural fluctuations of dietary isotope and chemical composition. First, a diet switch from a C3-grass/maize diet to a pure C3-grass diet was used to quantify carbon turnover in whole milk, lactose, casein, milk fat and feces. Data were analyzed with a compartmental mixed-effects model, which allowed for multiple pools and intra-population variability, and included a delay between feed ingestion and first tracer appearance in outputs. The delay for milk components and whole milk was ~12 h, and that of feces ~20 h. The half-life (t½) for carbon in the feces was 9 h, while lactose, casein and milk fat had a t½ of 10, 18 and 19 h. The 13C kinetics of whole milk revealed two pools: a fast pool with a t½ of 10 h (likely representing lactose) and a slower pool with a t½ of 21 h (likely including casein and milk fat). The diet-switch-based turnover information provided a precise prediction (RMSE ~0.2 ‰) of the natural 13C fluctuations in outputs during a 30-day period when cows ingested a pure C3 grass with naturally fluctuating isotope composition.
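The two-pool turnover kinetics reported above (t½ ≈ 10 h and 21 h, with a ~12 h ingestion delay) can be sketched as a delayed sum of exponentials. The diet end-member signatures and the equal pool weights below are hypothetical placeholders, not values from the study.

```python
import math

def milk_delta(t_h, d_old=-25.0, d_new=-13.0, delay_h=12.0,
               pools=((0.5, 10.0), (0.5, 21.0))):
    """delta13C (permil) of whole milk after a diet switch at t = 0.
    pools: (weight, half-life in hours). The delay and half-lives follow the
    abstract; the diet signatures and equal weights are placeholders."""
    if t_h <= delay_h:
        return d_old  # tracer has not yet appeared in the output
    resid = sum(w * math.exp(-math.log(2.0) / t_half * (t_h - delay_h))
                for w, t_half in pools)
    return d_new + (d_old - d_new) * resid

for t in (6.0, 24.0, 72.0):
    print(f"{t:5.1f} h: {milk_delta(t):.2f} permil")
```

Forward modeling against a fluctuating input, as in the paper, amounts to convolving the dietary signal with this impulse response instead of applying a single step change.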

  6. Validation of the finite element code DELFIN by means of the zero power experiments at the Atucha I nuclear power plant

    International Nuclear Information System (INIS)

    Grant, C.R.

    1996-01-01

    Code DELFIN, developed at CNEA, treats the spatial discretization using heterogeneous finite elements, allowing a correct treatment of the continuity of fluxes and currents among elements and a more realistic representation of the hexagonal lattice of the reactor. It can be used for fuel management calculations, xenon oscillations and spatial kinetics. Using the HUEMUL code for cell calculation (which uses a generalized two-dimensional collision probability theory and has the WIMS library incorporated in a data base), the zero power experiments performed in 1974 were calculated. (author). 8 refs., 9 figs., 3 tabs

  7. Freeze-thaw-induced embolism in Pinus contorta: centrifuge experiments validate the 'thaw-expansion hypothesis' but conflict with ultrasonic emission data.

    Science.gov (United States)

    Mayr, Stefan; Sperry, John S

    2010-03-01

    The 'thaw-expansion hypothesis' postulates that xylem embolism is caused by the formation of gas bubbles on freezing and their expansion on thawing. We evaluated the hypothesis using centrifuge experiments and ultrasonic emission monitoring in Pinus contorta. Stem samples were exposed to freeze-thaw cycles at varying xylem pressure (P) in a centrifuge before the percentage loss of hydraulic conductivity (PLC) was measured. Ultrasonic acoustic emissions were registered on samples exposed to freeze-thaw cycles in a temperature chamber. Freeze-thaw exposure of samples spun at -3 MPa induced a PLC of 32% (one frost cycle) and 50% (two cycles). An increase in P to -0.5 MPa during freezing had no effect on PLC, whereas increased P during thaw lowered PLC to 7%. Ultrasonic acoustic emissions were observed during freezing and thawing at -3 MPa, but not in air-dried or water-saturated samples. A decrease in minimum temperature caused additional ultrasonic acoustic emissions, but had no effect on PLC. The centrifuge experiments indicate that the 'thaw-expansion hypothesis' correctly describes the embolization process. Possible explanations for the increase in PLC on repeated frost cycles, and for the ultrasonic acoustic emissions observed during freezing and with decreasing ice temperature, are discussed.

  8. Reflectance conversion methods for the VIS/NIR imaging spectrometer aboard the Chang'E-3 lunar rover: based on ground validation experiment data

    International Nuclear Information System (INIS)

    Liu Bin; Liu Jian-Zhong; Zhang Guang-Liang; Zou Yong-Liao; Ling Zong-Cheng; Zhang Jiang; He Zhi-Ping; Yang Ben-Yong

    2013-01-01

    The second phase of the Chang'E Program (also named Chang'E-3) has the goal of landing and performing in-situ detection on the lunar surface. A VIS/NIR imaging spectrometer (VNIS) will be carried on the Chang'E-3 lunar rover to detect the distribution of lunar minerals and resources. VNIS is the first instrument to perform in-situ spectral measurement on the surface of the Moon; its reflectance data are fundamental for interpreting lunar composition, and their quality greatly affects the accuracy of lunar element and mineral determination. Until now, in-situ detection by imaging spectrometers has only been performed by rovers on Mars. We first review reflectance conversion methods for rovers on Mars (Viking landers, Pathfinder, Mars Exploration rovers, etc.). Second, we discuss whether these conversion methods used on Mars can be applied to lunar in-situ detection. We also applied data from a laboratory bidirectional reflectance distribution function (BRDF) experiment using simulated lunar soil to test the applicability of this method. Finally, we modify the reflectance conversion methods used on Mars by considering differences between the environments of the Moon and Mars, and apply the methods to experimental data obtained from the ground validation of VNIS. These results were obtained by comparing reflectance data from the VNIS measured in the laboratory with those from a standard spectrometer obtained at the same time and under the same observing conditions. The shape and amplitude of the spectra fit well, and the spectral uncertainty parameters for most samples are within 8%, except for the ilmenite sample, which has a low albedo. In conclusion, our reflectance conversion method is suitable for lunar in-situ detection.
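A common ratio-based reflectance conversion of the kind reviewed above divides target counts by those of a calibrated reference panel viewed under the same illumination and viewing geometry. This is a generic sketch, not the exact VNIS pipeline, and the numbers are invented.

```python
def reflectance(dn_sample, dn_panel, r_panel):
    """Ratio-based reflectance conversion: instrument counts of the target are
    divided by counts of a calibrated reference panel measured under the same
    illumination/viewing geometry, then scaled by the panel's known reflectance."""
    return dn_sample / dn_panel * r_panel

# Hypothetical counts: target 1800 DN, panel 3600 DN, panel reflectance 0.99
print(reflectance(1800.0, 3600.0, 0.99))  # -> 0.495
```

In practice the panel reflectance is wavelength dependent, so the ratio is applied per spectral band; dark-current subtraction and geometry (BRDF) corrections, which the paper adapts for the lunar case, are omitted here.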

  9. Real-time three-dimensional color Doppler evaluation of the flow convergence zone for quantification of mitral regurgitation: Validation by experimental animal study and initial clinical experience

    Science.gov (United States)

    Sitges, Marta; Jones, Michael; Shiota, Takahiro; Qin, Jian Xin; Tsujino, Hiroyuki; Bauer, Fabrice; Kim, Yong Jin; Agler, Deborah A.; Cardon, Lisa A.; Zetts, Arthur D.; et al.

    2003-01-01

    BACKGROUND: Pitfalls of the flow convergence (FC) method, including 2-dimensional imaging of the 3-dimensional (3D) geometry of the FC surface, can lead to erroneous quantification of mitral regurgitation (MR). This limitation may be mitigated by the use of real-time 3D color Doppler echocardiography (CE). Our objective was to validate a real-time 3D navigation method for MR quantification. METHODS: In 12 sheep with surgically induced chronic MR, 37 different hemodynamic conditions were studied with real-time 3DCE. Using real-time 3D navigation, the radius of the largest hemispherical FC zone was located and measured. MR volume was quantified according to the FC method after observing the shape of the FC in 3D space. Aortic and mitral electromagnetic flow probes and meters were balanced against each other to determine the reference MR volume. As an initial clinical application study, 22 patients with chronic MR were also studied with this real-time 3DCE-FC method. Left ventricular (LV) outflow tract automated cardiac flow measurement (Toshiba Corp, Tokyo, Japan) and real-time 3D LV stroke volume were used to quantify the reference MR volume (MR volume = 3D LV stroke volume - automated cardiac flow measurement). RESULTS: In the sheep model, a good correlation and agreement was seen between MR volume by real-time 3DCE and the electromagnetic reference (y = 0.77x + 1.48, r = 0.87, P < …). In patients, real-time 3DCE-derived MR volume also showed a good correlation and agreement with the reference method (y = 0.89x - 0.38, r = 0.93, P < …). CONCLUSIONS: Real-time 3DCE can capture the entire FC image, permitting recognition of the FC zone geometry and reliable MR quantification.
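The arithmetic behind the hemispherical flow-convergence method can be sketched as follows. The formula is the standard hemispheric (PISA) form, not necessarily the study's exact implementation, and the input values are hypothetical.

```python
import math

def mr_volume_pisa(r_cm, v_alias_cm_s, v_peak_cm_s, vti_cm):
    """Hemispheric flow-convergence (PISA) estimate of mitral regurgitation.
    r_cm: FC radius at the aliasing boundary (cm); v_alias: aliasing velocity;
    v_peak: peak MR jet velocity; vti: MR jet velocity-time integral (cm)."""
    q = 2.0 * math.pi * r_cm**2 * v_alias_cm_s   # regurgitant flow rate, mL/s
    eroa = q / v_peak_cm_s                       # effective orifice area, cm^2
    return eroa * vti_cm                         # regurgitant volume per beat, mL

# Hypothetical inputs: r = 0.9 cm, aliasing 40 cm/s, peak 500 cm/s, VTI 150 cm
vol = mr_volume_pisa(0.9, 40.0, 500.0, 150.0)
print(f"MR volume per beat: {vol:.1f} mL")
```

The 2πr² term assumes a hemispheric isovelocity surface; when the true FC surface is not hemispheric, the estimate degrades, which is precisely the limitation the 3D navigation approach above addresses.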

  10. German taxi drivers' experiences and expressions of driving anger: Are the driving anger scale and the driving anger expression inventory valid measures?

    Science.gov (United States)

    Brandenburg, Stefan; Oehl, Michael; Seigies, Kristin

    2017-11-17

    The objective of this article was twofold: first, we wanted to examine whether the original Driving Anger Scale (DAS) and the original Driving Anger Expression Inventory (DAX) apply to German professional taxi drivers, because these scales have previously been given to professional and particularly to nonprofessional drivers in different countries. Second, we wanted to examine possible differences in driving anger experience and expression between professional German taxi drivers and nonprofessional German drivers. We applied German versions of the DAS, the DAX, and the State-Trait Anger Expression Inventory (STAXI) to a sample of 138 professional German taxi drivers. We then compared their ratings to those of a sample of 1,136 nonprofessional German drivers (Oehl and Brandenburg, n.d.). Regarding our first objective, confirmatory factor analysis shows that the model fit of the DAS is better for nonprofessional drivers than for professional drivers. The DAX applies properly neither to professional nor to nonprofessional German drivers. Consequently, we suggest modified shorter versions of both scales for professional drivers. The STAXI applies to both professional and nonprofessional drivers. With respect to our second objective, we show that professional drivers experience significantly less driving anger than nonprofessional drivers, but they express more driving anger. We conclude that the STAXI can be applied to professional German taxi drivers, whereas for the DAS and the DAX we recommend the shorter versions for professional taxi drivers. Especially for the DAX, most statements were too strong for German drivers to agree to; they do not show the anger-expression behaviors described in the DAX. These problems with the original American DAX items are in line with several other studies in different countries. Future investigations should examine whether (professional) drivers from other countries express their anger

  11. Validation of a CATHENA fuel channel model for the post blowdown analysis of the high temperature thermal-chemical experiment CS28-1, I - Steady state

    International Nuclear Information System (INIS)

    Rhee, Bo Wook; Kim, Hyoung Tae; Park, Joo Hwan

    2008-01-01

    To form a licensing basis for the new fuel channel safety analysis code system for CANDU-6, a CATHENA model for the post-blowdown fuel channel analysis of a large break LOCA has been developed and tested against the steady state of the high-temperature thermal-chemical experiment CS28-1. The major concerns of the post-blowdown fuel channel analysis of the current CANDU-6 design are how much of the decay heat can be discharged to the moderator via radiation and convective heat transfer at the expected accident conditions, and how much of the zirconium sheath would be oxidized, generating H2, and at how high a fuel temperature. This study therefore focused on understanding these phenomena and their interrelations, and on maintaining good accuracy in the prediction of the fuel and pressure tube temperatures without losing the important physics of the phenomena involved throughout the post-blowdown phase of a LBLOCA. For a better prediction, the factors that may significantly contribute to the prediction accuracy of the steady state of the test bundles were sought. The results show that once the pressure tube temperature is predicted correctly by the CATHENA heat transfer model between the pressure tube and the calandria tube, through a gap thermal resistance adjustment, all the remaining temperatures of the inner-ring, middle-ring and outer-ring FES can be predicted quite satisfactorily, to within an accuracy range of 20-25 deg. C, which is comparable to the reported accuracy of the temperature measurement, ±2%. The analysis also shows that the choice of the emissivity of the solid structures (typically 0.80, 0.34, 0.34 for FES, PT, CT) and the thermal resistance across the CO2 annulus are factors that significantly affect the steady-state prediction accuracy. A question on the legitimacy of using the 'transparent' assumption for the CO2 gas annulus for the radiation heat transfer between the pressure tube and the calandria tube in CATHENA
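For gray, diffuse concentric cylinders with a transparent gas annulus, the pressure-tube-to-calandria-tube radiation exchange discussed above follows the standard two-surface relation. The sketch below uses the abstract's emissivity of 0.34 for PT and CT; the temperatures and radii are merely illustrative, CANDU-like values, not data from the experiment.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radial_rad_flux(t1_k, t2_k, eps1, eps2, r1_m, r2_m):
    """Net radiative flux (W per m^2 of inner surface) between long gray,
    diffuse concentric cylinders with a transparent annulus between them."""
    return (SIGMA * (t1_k**4 - t2_k**4) /
            (1.0 / eps1 + (r1_m / r2_m) * (1.0 / eps2 - 1.0)))

# Pressure tube assumed at 900 C, calandria tube at 80 C, emissivities 0.34
q = radial_rad_flux(1173.0, 353.0, 0.34, 0.34, 0.0562, 0.0645)
print(f"net PT->CT radiative flux: {q / 1000.0:.1f} kW/m^2")
```

The strong T^4 dependence is why the emissivity choices quoted in the abstract, and the transparency assumption for the CO2 annulus, matter so much to the steady-state prediction accuracy.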

  12. EOS Terra Validation Program

    Science.gov (United States)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS) and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include instrument calibration validation (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. 
The intensive validation activities planned for the first year of the Terra

  13. Validation of the ATHLET-code 2.1A by calculation of the ECTHOR experiment; Validierung des ATHLET-Codes 2.1A anhand des Einzeleffekt-Tests ECTHOR

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Andreas; Sarkadi, Peter; Schaffrath, Andreas [TUEV NORD SysTec GmbH und Co. KG, Hamburg (Germany)

    2010-05-15

    Before a numerical code (e.g. ATHLET) is used for the simulation of physical phenomena that are new or unknown to the code and/or the user, the user must ensure the applicability of the code, and his own experience in handling it, by means of a so-called validation. Parametric studies with the code are executed for that purpose, and the results have to be compared with verified experimental data. Corresponding reference values are available in the form of so-called single-effect tests (e.g. ECTHOR). In this work the system code ATHLET Mod. 2.1 Cycle A is validated by post-test calculation of the ECTHOR experiment with respect to the aspects named above. In the ECTHOR tests, the clearing of a water-filled model of a loop seal by means of an air stream was investigated, including momentum exchange at the phase interface, under adiabatic and atmospheric conditions. The post-test calculations show that the analytical results meet the experimental data within the reproducibility of the experiments. Further findings of the parametric studies are: - The experimental results obtained with the water-air system (ECTHOR) can be transferred to a water-steam system if the densities of the phases are equal in both cases. - The initial water level in the loop seal has no influence on the results as long as the gas mass flow is increased moderately. - The loop seal is appropriately nodalized if the mean length of the control volumes corresponds to approximately 1.5 times the hydraulic pipe diameter. (orig.)
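The nodalization guideline reported above (mean control-volume length of roughly 1.5 times the hydraulic diameter) translates directly into a node count. The pipe dimensions below are hypothetical, not those of the ECTHOR loop seal.

```python
import math

def n_control_volumes(pipe_length_m, d_hydraulic_m, factor=1.5):
    """Node count from the guideline: mean control-volume length ~ factor * D_h."""
    return max(1, math.ceil(pipe_length_m / (factor * d_hydraulic_m)))

# Hypothetical loop-seal pipe: 2.4 m long, hydraulic diameter 0.08 m
print(n_control_volumes(2.4, 0.08))  # -> 20
```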

  15. Dual-sided electrosurgery handpiece for simultaneous tissue cutting and coagulation: first report on a conceptual design validated by an animal experiment.

    Science.gov (United States)

    Tawfik, Hatem A; Fouad, Yousef A; Hafez, Rashad

    2015-01-01

    To introduce and evaluate the safety of a novel dual-sided electrosurgery handpiece design for simultaneous tissue cutting and coagulation. We designed a prototype double-sided handpiece allowing automatic switching between two electrodes with a simple handpiece flip. The concept of the system as a surgical instrument was assessed by an animal experiment. The skin of 15 Wistar albino white rats could be successfully incised and coagulated using both ends of the handpiece, thereby confirming the prospects and clinical applications of the system. The dual-sided electrosurgery handpiece is a simple and safe alternative to the traditional electrosurgery pencil, allowing the simultaneous use of two electrodes without the hassle of frequent electrode replacement.

  16. TOPFLOW-experiments, model development and validation for the qualification of CFD-codes for two-phase flows. Final report

    International Nuclear Information System (INIS)

    Lucas, D.; Beyer, M.; Banowski, M.; Seidel, T.; Krepper, E.; Liao, Y.; Apanasevich, P.; Gauss, F.; Ma, T.

    2016-12-01

    This report summarizes the main results obtained in the frame of the project. The aim of the project was the qualification of CFD methods for two-phase flows with phase transfer relevant to nuclear safety research. To reach this aim, CFD-grade experimental data are required. Such data can be obtained at the TOPFLOW facility because of the combination of experiments at scales and parameters relevant for nuclear safety research with innovative measuring techniques. The experimental part of this project comprises investigations on flows in vertical pipes using ultrafast X-ray tomography, on flows with and without phase transfer in a special test basin, and on counter-current flow limitation in a model of a PWR hot leg. These experiments are only briefly presented in this report, since detailed documentation is given in separate reports for each of these three experimental series. One important result of the activities devoted to CFD qualification is the establishment of the baseline model concept and the definition of the baseline model for poly-disperse bubbly flows. This is an important contribution to improving the predictive capabilities of CFD models based on the two- or multi-fluid approach. On the other hand, the innovative Generalized Two-Phase Flow (GENTOP) concept aims at extending the range of applicability of CFD methods. In many relevant flow situations, different morphologies of the phases or different flow patterns occur simultaneously in one flow domain, and transitions between these morphologies may occur. With the GENTOP concept, for the first time a framework was established which allows the simulation of such flow situations in a consistent manner. Other activities of the project aimed at special model developments to improve the simulation capabilities for flows with phase transfer.

  17. Physical Validation of GPM Retrieval Algorithms Over Land: An Overview of the Mid-Latitude Continental Convective Clouds Experiment (MC3E)

    Science.gov (United States)

    Petersen, Walter A.; Jensen, Michael P.

    2011-01-01

    The joint NASA Global Precipitation Measurement (GPM) -- DOE Atmospheric Radiation Measurement (ARM) Midlatitude Continental Convective Clouds Experiment (MC3E) was conducted from April 22-June 6, 2011, centered on the DOE-ARM Southern Great Plains Central Facility site in northern Oklahoma. GPM field campaign objectives focused on the collection of airborne and ground-based measurements of warm-season continental precipitation processes to support refinement of GPM retrieval algorithm physics over land, and to improve the fidelity of coupled cloud resolving and land-surface satellite simulator models. DOE ARM objectives were synergistically focused on relating observations of cloud microphysics and the surrounding environment to feedbacks on convective system dynamics, an effort driven by the need to better represent those interactions in numerical modeling frameworks. More specific topics addressed by MC3E include ice processes and ice characteristics as coupled to precipitation at the surface and radiometer signals measured in space, the correlation properties of rainfall and drop size distributions and impacts on dual-frequency radar retrieval algorithms, the transition of cloud water to rain water (e.g., autoconversion processes) and the vertical distribution of cloud water in precipitating clouds, and vertical draft structure statistics in cumulus convection. 
The MC3E observational strategy relied on NASA ER-2 high-altitude airborne multi-frequency radar (HIWRAP Ka-Ku band) and radiometer (AMPR, CoSMIR; 10-183 GHz) sampling (a GPM "proxy") over an atmospheric column being simultaneously profiled in situ by the University of North Dakota Citation microphysics aircraft, an array of ground-based multi-frequency scanning polarimetric radars (DOE Ka-W, X and C-band; NASA D3R Ka-Ku and NPOL S-bands) and wind-profilers (S/UHF bands), supported by a dense network of over 20 disdrometers and rain gauges, all nested in the coverage of a six-station mesoscale rawinsonde

  18. Development and validation of a model of bio-barriers for remediation of Cr(VI) contaminated aquifers using laboratory column experiments.

    Science.gov (United States)

    Shashidhar, T; Bhallamudi, S Murty; Philip, Ligy

    2007-07-16

    Bench scale transport and biotransformation experiments and mathematical model simulations were carried out to study the effectiveness of bio-barriers for the containment of hexavalent chromium in contaminated confined aquifers. Experimental results showed that a 10 cm thick bio-barrier with an initial biomass concentration of 0.205 mg/g of soil was able to completely contain a Cr(VI) plume of 25 mg/L concentration. It was also observed that pore water velocity and initial biomass concentration are the most significant parameters in the containment of Cr(VI). The mathematical model developed is based on one-dimensional advection-dispersion reaction equations for Cr(VI) and molasses in saturated, homogeneous porous medium. The transport of Cr(VI) and molasses is coupled with adsorption and Monod's inhibition kinetics for immobile bacteria. It was found that, in general, the model was able to simulate the experimental results satisfactorily. However, there was disparity between the numerically simulated and experimental breakthrough curves for Cr(VI) and molasses in cases where there was high clay content and high microbial activity. The mathematical model could contribute towards improved designs of future bio-barriers for the remediation of Cr(VI) contaminated aquifers.
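The coupled advection-dispersion-reaction model described above can be sketched with a simple explicit finite-difference scheme. All parameter values, the grid, and the simplified Monod sink below are illustrative assumptions, not the authors' calibrated model; adsorption and biomass growth are omitted.

```python
import numpy as np

# Hypothetical 1D column: Cr(VI) advected through a 10 cm bio-barrier where
# immobile biomass removes it with Monod-type kinetics. All values illustrative.
L, nx = 0.5, 50                    # column length (m), number of cells
dx = L / nx
v, D = 1.0e-5, 1.0e-7              # pore velocity (m/s), dispersion (m^2/s)
q_max, Ks, X = 2.0e-3, 5.0, 1.0    # max removal (mg/L/s at X = 1), half-sat (mg/L), biomass factor
c_in = 25.0                        # inlet Cr(VI) concentration (mg/L)

x = (np.arange(nx) + 0.5) * dx
barrier = (x > 0.2) & (x < 0.3)    # bio-barrier occupies 0.2-0.3 m

c = np.zeros(nx)
dt = 100.0                         # s; satisfies the advective and diffusive limits
for _ in range(20000):             # march to (near) steady state
    c_up = np.concatenate(([c_in], c[:-1]))        # upwind neighbour
    ghost = np.concatenate(([c_in], c, [c[-1]]))   # inlet / outflow ghost cells
    adv = -v * (c - c_up) / dx
    diff = D * (ghost[2:] - 2.0 * c + ghost[:-2]) / dx**2
    sink = np.where(barrier, q_max * X * c / (Ks + c), 0.0)
    c = c + dt * (adv + diff - sink)

print(f"outlet Cr(VI): {c[-1]:.1f} mg/L (inlet {c_in} mg/L)")
```

At steady state the upstream concentration stays near the inlet value while the barrier zone removes a substantial fraction of the plume; making the sink depend on a transported substrate (molasses) and on biomass dynamics would bring the sketch closer to the paper's model.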

  19. Validation of the Atmospheric Chemistry Experiment (ACE) version 2.2 temperature using ground-based and space-borne measurements

    Directory of Open Access Journals (Sweden)

    R. J. Sica

    2008-01-01

    An ensemble of space-borne and ground-based instruments has been used to evaluate the quality of the version 2.2 temperature retrievals from the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS). The agreement of ACE-FTS temperatures with other sensors is typically better than 2 K in the stratosphere and upper troposphere and 5 K in the lower mesosphere. There is evidence of a systematic high bias (roughly 3–6 K) in the ACE-FTS temperatures in the mesosphere, and a possible systematic low bias (roughly 2 K) in ACE-FTS temperatures near 23 km. Some ACE-FTS temperature profiles exhibit unphysical oscillations, a problem fixed in preliminary comparisons with temperatures derived using the next version of the ACE-FTS retrieval software. Though these relatively large oscillations in temperature can be on the order of 10 K in the mesosphere, retrieved volume mixing ratio profiles typically vary by less than a percent or so. Statistical comparisons suggest these oscillations occur in about 10% of the retrieved profiles. Analysis of a set of coincident lidar measurements suggests that the random error in ACE-FTS version 2.2 temperatures has a lower limit of about ±2 K.

  20. Dual-sided electrosurgery handpiece for simultaneous tissue cutting and coagulation: first report on a conceptual design validated by an animal experiment

    Directory of Open Access Journals (Sweden)

    Tawfik HA

    2015-08-01

    Hatem A Tawfik (Department of Ophthalmology, Oculoplastics Service, Ain Shams University), Yousef A Fouad (Faculty of Medicine, Ain Shams University), Rashad Hafez (Eye Subspecialty Centre, Cairo, Egypt). Objective: To introduce and evaluate the safety of a novel dual-sided electrosurgery handpiece design for simultaneous tissue cutting and coagulation. Methods: We designed a prototype double-sided handpiece allowing automatic switching between two electrodes with a simple handpiece flip. The concept of the system as a surgical instrument was assessed by an animal experiment. Results: The skin of 15 Wistar albino white rats could be successfully incised and coagulated using both ends of the handpiece, thereby confirming the prospects and clinical applications of the system. Conclusion: The dual-sided electrosurgery handpiece is a simple and safe alternative to the traditional electrosurgery pencil, allowing the simultaneous use of two electrodes without the hassle of frequent electrode replacement. Keywords: radiosurgery, ablative surgery, laser resurfacing, electrocautery, electrosurgery

  1. The Grimsel radionuclide migration experiment - a contribution to raising confidence in the validity of solute transport models used in performance assessment

    International Nuclear Information System (INIS)

    Frick, U.

    1995-01-01

    The safety assessment of radioactive waste repositories must provide confidence that the predictive models utilized are applicable to the specific repository systems. Nagra has carried out radionuclide migration experiments at the Grimsel underground test site (Switzerland) for testing currently used methodologies, data bases, conceptual approaches and codes for modeling radionuclide transport through fractured host rocks. Specific objectives included identifying the relevant transport processes, testing the extrapolation of laboratory sorption data to field conditions, and demonstrating the applicability of currently used methodology for conceptualizing and building realistic transport models. Field tests and transport modeling work are complemented by an extensive laboratory program. The field experimental activities focused predominantly on establishing appropriate conditions for identifying the relevant transport mechanisms on the scale of a few meters, aiming at full recovery of injected tracers, simple geometry and long-term stability of induced dipole flow fields. A relatively simple homogeneous, dual-porosity advection/diffusion model was built with input from a state-of-the-art petrographic characterisation of the water-conducting feature. It was possible to calibrate the model from conservative tracer breakthrough curves. (J.S.). 21 refs., 14 figs., 4 tabs
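For the conservative-tracer breakthrough curves used to calibrate such models, a single-porosity analytical solution (Ogata-Banks, continuous step injection) gives the basic shape; the dual-porosity matrix-diffusion term used in the Grimsel model is omitted here, and the parameter values in the usage loop are invented.

```python
import math

def breakthrough(x_m, t_s, v_m_s, d_m2_s):
    """Ogata-Banks solution: relative concentration C/C0 at distance x and time t
    for 1D advection-dispersion with a continuous inlet step (conservative tracer)."""
    s = 2.0 * math.sqrt(d_m2_s * t_s)
    term1 = math.erfc((x_m - v_m_s * t_s) / s)
    arg = v_m_s * x_m / d_m2_s
    # Guard the exp() against overflow at very high Peclet numbers, where the
    # second term is negligible anyway.
    term2 = math.exp(arg) * math.erfc((x_m + v_m_s * t_s) / s) if arg < 700.0 else 0.0
    return 0.5 * (term1 + term2)

# Hypothetical dipole-scale values: x = 5 m, v = 1e-4 m/s, D = 1e-5 m^2/s
for t in (1.0e4, 5.0e4, 2.0e5):
    print(f"t = {t:.0e} s: C/C0 = {breakthrough(5.0, t, 1.0e-4, 1.0e-5):.3f}")
```

Matrix diffusion in a dual-porosity medium retards and disperses the breakthrough relative to this curve, which is one of the signatures the Grimsel experiment was designed to resolve.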

  2. Validation of a continuous-energy Monte Carlo burn-up code MVP-BURN and its application to analysis of post irradiation experiment

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Mori, Takamasa; Nakagawa, Masayuki; Kaneko, Kunio

    2000-01-01

    In order to confirm the reliability of the continuous-energy Monte Carlo burn-up calculation code MVP-BURN, it was applied to burn-up benchmark problems for a high-conversion LWR lattice and a BWR lattice with burnable poison rods. The results of MVP-BURN showed good agreement with those of the deterministic code SRAC95 for burn-up changes of the infinite neutron multiplication factor, conversion ratio, power distribution, and number densities of major fuel nuclides. No serious propagation of statistical errors with burn-up was observed, even in a highly heterogeneous lattice. MVP-BURN was then applied, together with SRAC95 and SWAT, to the analysis of a post-irradiation experiment for a sample fuel irradiated up to 34.1 GWd/t. It was confirmed that the effect of the statistical errors of MVP-BURN on the burned fuel composition was sufficiently small, so that it could provide a reference solution for the other codes. In the analysis, the results of the three codes with JENDL-3.2 agreed with measured values within an error of 10% for most nuclides. However, a large underestimation of about 20% was observed for 238Pu, 242mAm and 244Cm. It is probable that these discrepancies are a common problem for most current nuclear data files. (author)
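    Code-versus-measurement comparisons of this kind are conventionally reported as calculated-to-experimental (C/E) ratios, with nuclides flagged when the ratio deviates from unity by more than a tolerance. A minimal sketch, with hypothetical inventory numbers chosen only to echo the pattern reported above:

```python
def c_over_e(calculated, measured):
    """Calculated-to-experimental (C/E) ratios per nuclide."""
    return {nuc: calculated[nuc] / measured[nuc] for nuc in measured}

def flag_discrepancies(ratios, tol=0.10):
    """Nuclides whose C/E deviates from unity by more than tol."""
    return sorted(nuc for nuc, r in ratios.items() if abs(r - 1.0) > tol)

# Hypothetical inventories (arbitrary units): most nuclides within 10%,
# Pu-238 underestimated by ~20% as in the abstract.
calc = {"U-235": 0.97, "Pu-239": 1.04, "Pu-238": 0.80}
meas = {"U-235": 1.00, "Pu-239": 1.00, "Pu-238": 1.00}
outliers = flag_discrepancies(c_over_e(calc, meas))
```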

  3. Development of tsunami fragility evaluation methods by large scale experiments. Part 2. Validation of the applicability of evaluation methods of impact force due to tsunami floating debris

    International Nuclear Information System (INIS)

    Takabatake, Daisuke; Kihara, Naoto; Kaida, Hideki; Miyagawa, Yoshinori; Ikeno, Masaaki; Shibayama, Atsushi

    2015-01-01

    In order to examine the applicability of existing equations for estimating the impact force of tsunami floating debris, collision tests were carried out using logs and a full-scale light car. This report focuses on two types of existing equations: one based on the Young's modulus of the debris (Eq. A) and one based on the stiffness of the debris (Eq. B). The impact forces estimated using Eq. A with the logs' Young's modulus obtained from the material tests agree with the forces measured in the collision tests. However, Eq. A is not applicable to a car, because the Young's modulus of a car is not easy to determine. On the other hand, the impact forces estimated using Eq. B with the car's stiffness obtained from a static loading test agree with the measured forces. This indicates that Eq. B enables us to estimate the impact force of floating debris such as a car, provided the stiffness of the debris can be determined. (author)
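    A stiffness-based formulation like the Eq. B described above is commonly written in tsunami design guidance as F = u√(km), with impact speed u, effective debris stiffness k and debris mass m. The exact form and coefficients used in this report are not given here, so the following is only an illustrative sketch with hypothetical car parameters:

```python
from math import sqrt

def debris_impact_force(u, k, m):
    """Stiffness-based estimate of debris impact force, F = u*sqrt(k*m):
    impact speed u (m/s), effective stiffness k (N/m), mass m (kg)."""
    return u * sqrt(k * m)

# Hypothetical light car: 1000 kg, 1 MN/m frontal stiffness, 2 m/s impact.
force_newtons = debris_impact_force(2.0, 1.0e6, 1000.0)
```

Note how the force scales with the square root of stiffness, which is why a soft-fronted car and a stiff log of equal mass and speed give very different impact loads.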

  4. The effect of dielectric constants on noble metal/semiconductor SERS enhancement: FDTD simulation and experiment validation of Ag/Ge and Ag/Si substrates.

    Science.gov (United States)

    Wang, Tao; Zhang, Zhaoshun; Liao, Fan; Cai, Qian; Li, Yanqing; Lee, Shuit-Tong; Shao, Mingwang

    2014-02-11

    The finite-difference time-domain (FDTD) method was employed to simulate the electric field distribution for noble metal (Au or Ag)/semiconductor (Ge or Si) substrates. The simulation showed that noble metal/Ge gives stronger SERS enhancement than noble metal/Si, which was mainly attributed to the different dielectric constants of the semiconductors. To verify the simulation, Ag nanoparticles with a diameter of ca. 40 nm were grown on Ge or Si wafers (Ag/Ge or Ag/Si) and employed as surface-enhanced Raman scattering substrates to detect analytes in solution. The experiment demonstrated that both substrates exhibited excellent performance in the detection of Rhodamine 6G at low concentration. Moreover, the enhancement factor (1.3 × 10^9) and relative standard deviation (less than 11%) of the Ag/Ge substrate were both better than those of Ag/Si (2.9 × 10^7 and less than 15%, respectively), which was consistent with the FDTD simulation. In addition, the Ag nanoparticles were grown in situ on the Ge substrate, which kept them from aggregating during detection. To date, Ag/Ge substrates have shown the best performance in sensitivity and uniformity among noble metal/semiconductor substrates.
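    The two figures of merit quoted above are conventionally computed as an enhancement factor (intensity per probed molecule on the SERS substrate relative to a reference measurement) and a relative standard deviation across repeated spots. A minimal sketch; the intensities and molecule counts in the example are hypothetical:

```python
from statistics import mean, stdev

def enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    """SERS enhancement factor, EF = (I_SERS/N_SERS) / (I_ref/N_ref),
    where I are Raman intensities and N the numbers of probed molecules."""
    return (i_sers / n_sers) / (i_ref / n_ref)

def rsd_percent(intensities):
    """Relative standard deviation (%) across repeated spot measurements,
    a common uniformity metric for SERS substrates."""
    return 100.0 * stdev(intensities) / mean(intensities)

ef = enhancement_factor(1.0e4, 1.0e3, 1.0e2, 1.0e10)  # hypothetical inputs
```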

  5. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Science.gov (United States)

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In the USA and Europe, medicines agencies are driving the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, since trial-related blood loss is strictly limited in children. The widely used HPLC-MS/MS technique can cope with small volumes but is susceptible to matrix effects, which hamper precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification using solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances made, and experience gained with solid-phase extraction are presented using the example of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972
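    Matrix-effect and recovery percentages of the kind reported above are conventionally derived from peak-area ratios between samples spiked before extraction, spiked after extraction, and neat standards (the usual post-extraction spike scheme). A minimal sketch with illustrative areas:

```python
def matrix_effect_pct(area_post_extraction_spike, area_neat_standard):
    """Absolute matrix effect (%): analyte response in blank matrix
    spiked after extraction relative to a neat standard (100% = no
    suppression or enhancement)."""
    return 100.0 * area_post_extraction_spike / area_neat_standard

def recovery_pct(area_spiked_before, area_spiked_after):
    """Extraction recovery (%): response when spiked before extraction
    relative to the response when spiked after extraction."""
    return 100.0 * area_spiked_before / area_spiked_after
```

With these definitions, a matrix effect well below 100% signals ion suppression, exactly the phenomenon the solid-phase extraction step is meant to control.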

  6. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

    Full Text Available In the USA and Europe, medicines agencies are driving the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, since trial-related blood loss is strictly limited in children. The widely used HPLC-MS/MS technique can cope with small volumes but is susceptible to matrix effects, which hamper precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification using solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances made, and experience gained with solid-phase extraction are presented using the example of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.

  7. Dimensional assessment of schizotypal, psychotic, and other psychiatric traits in children and their parents: development and validation of the Childhood Oxford-Liverpool Inventory of Feelings and Experiences on a representative US sample.

    Science.gov (United States)

    Evans, David W; Lusk, Laina G; Slane, Mylissa M; Michael, Andrew M; Myers, Scott M; Uljarević, Mirko; Mason, Oliver; Claridge, Gordon; Frazier, Thomas

    2018-05-01

    Healthy functioning relies on a variety of perceptual, cognitive, emotional, and behavioral abilities that are distributed throughout the normal population. Variation in these traits defines the wide range of neurodevelopmental (NDD) and neuropsychiatric (NPD) disorders. Here, we introduce a new measure for assessing these traits in typically developing children and children at risk for NDD and NPD from 2 to 18 years of age. The Childhood Oxford-Liverpool Inventory of Feelings and Experiences (CO-LIFE) was created as a dimensional, parent-report measure of schizotypal and psychotic traits in the general population. Parents of 2,786 children also self-reported on an adapted version of the Oxford-Liverpool Inventory of Feelings and Experiences (O-LIFE-US). The CO-LIFE yielded continuous distributions for the total score and for each of three factor-analytically derived subscales. Item response theory (IRT) analyses indicated strong reliability across the score range for both the O-LIFE-US and the CO-LIFE. Internal consistency and test-retest reliability were high across all scales. Parent-child intraclass correlations were consistent with high heritability. The scales discriminated participants who reported a lifetime psychiatric diagnosis from those who reported no diagnosis. The O-LIFE-US and CO-LIFE scores correlated positively with the Social Responsiveness Scale 2 (SRS-2), indicating good convergent validity. Like the original O-LIFE, the O-LIFE-US and the CO-LIFE are valid and reliable tools that reflect the spectrum of psychiatric and schizotypal traits in the general population. Such scales are necessary for conducting family studies that aim to examine a range of psychological and behavioral traits in both children and adults, and they are well suited to the Research Domain Criteria (RDoC) initiative of the NIMH. © 2017 Association for Child and Adolescent Mental Health.
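    The parent-child intraclass correlations mentioned above can, for paired scores, be computed with a one-way random-effects ICC built from the between- and within-pair mean squares. This is a generic sketch, not necessarily the authors' exact estimator:

```python
def icc_pairs(pairs):
    """One-way random-effects ICC for paired ratings (e.g. parent and
    child scores): (MSB - MSW) / (MSB + MSW) for groups of size 2."""
    n = len(pairs)
    grand = sum(x + y for x, y in pairs) / (2 * n)
    # Between-pair mean square: variance of pair means, scaled by group size 2.
    ms_between = 2.0 * sum(((x + y) / 2 - grand) ** 2 for x, y in pairs) / (n - 1)
    # Within-pair mean square: half the squared parent-child difference, averaged.
    ms_within = sum((x - y) ** 2 / 2.0 for x, y in pairs) / n
    return (ms_between - ms_within) / (ms_between + ms_within)
```

Perfectly concordant parent-child pairs give an ICC of 1, while pairs that disagree as much as they vary give values near or below zero.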

  8. Study of the violation of the T and CP symmetries in the reactions Λb0 → Λ0 + a vector meson. Validation of the Front-end electronics for the PreShower detector of the LHCb experiment

    International Nuclear Information System (INIS)

    Conte, E.

    2007-11-01

    This thesis probes beauty baryon physics in the framework of the LHCb experiment. The present study deals with the Λb0 → Λ0V decays, where V is a vector meson such as J/Ψ(μ+μ-), φ(K+K-), ω(π+π-π0) or the ρ0-ω0(π+π-) mixing. These processes allow independent tests of the CP symmetry, whose violation has not yet been observed in the baryonic sector, and of the T symmetry, for which experimental evidence is limited. Among the possible perspectives, a precise measurement of the Λb0 lifetime could contribute to resolving the outstanding discrepancy between theory and experiment. A phenomenological model of the Λb0 → Λ0V decays has been developed, from which branching ratios and angular distributions have been estimated. A detailed study of the reconstruction and selection of these reactions by the LHCb apparatus shows that Λb0 → Λ0J/Ψ is the dominant channel in terms of both statistics and purity. The Λb0 lifetime measurement is the most imminent result; constraints on the asymmetries due to CP and T violation will require several years of data taking. In addition, instrumental work was carried out on the read-out (Front-End) electronics of the experiment's PreShower detector. This contribution covered the validation of the prototype boards and the development of the tools required to qualify the 100 production boards. (author)

  9. Patients' Experience of Myositis and Further Validation of a Myositis-specific Patient Reported Outcome Measure - Establishing Core Domains and Expanding Patient Input on Clinical Assessment in Myositis. Report from OMERACT 12.

    Science.gov (United States)

    Regardt, Malin; Basharat, Pari; Christopher-Stine, Lisa; Sarver, Catherine; Björn, Anita; Lundberg, Ingrid E; Wook Song, Yeong; Bingham, Clifton O; Alexanderson, Helene

    2015-12-01

    The Outcome Measures in Rheumatology (OMERACT) myositis working group was established to examine patient-reported outcomes (PRO) as well as to validate patient-reported outcome measures (PROM) in myositis. Qualitative studies using focus group interviews and cognitive debriefing of the myositis-specific Myositis Activities Profile (MAP) were used to explore the experience of adults living with polymyositis (PM) and dermatomyositis (DM). Preliminary results underscore the importance of patient input in the development of PROM to ensure content validity. Results from multicenter focus groups indicate the range of symptoms experienced including pain, fatigue, and impaired cognitive function, which are not currently assessed in myositis. Preliminary cognitive debriefing of the MAP indicated that while content was deemed relevant and important, several activities were not included; and that questionnaire construction and wording may benefit from revision. A research agenda was developed to continue work toward optimizing PRO assessment in myositis with 2 work streams. The first would continue to conduct and analyze focus groups until saturation in the thematic analysis was achieved to develop a framework that encompassed the patient-relevant aspects of myositis. The second would continue cognitive debriefing of the MAP to identify potential areas for revision. There was agreement that further work would be needed for inclusion body myositis and juvenile dermatomyositis, and that the inclusion of additional contributors such as caregivers and individuals from the pharmaceutical/regulatory spheres would be desirable. The currently used PROM do not assess symptoms or the effects of disease that are most important to patients; this emphasizes the necessity of patient involvement. Our work provides concrete examples for PRO identification.

  10. Meaning in life experience at the end of life: validation of the Hindi version of the Schedule for Meaning in Life Evaluation and a cross-cultural comparison between Indian and German palliative care patients.

    Science.gov (United States)

    Kudla, Dorothea; Kujur, Julius; Tigga, Sumanti; Tirkey, Prakash; Rai, Punita; Fegg, Martin Johannes

    2015-01-01

    The experience of Meaning in Life (MiL) is a major protective factor against feelings of hopelessness and wishes for hastened death in palliative care (PC) patients. However, most instruments for MiL assessment have been developed only in Western countries so far. Little is known about MiL experience in Asian PC patients. This study aimed to provide a Hindi version of the Schedule for Meaning in Life Evaluation (SMiLE), test its feasibility and validity in Indian PC patients, and compare the results with previous studies in Germany. Indian PC patients in a hospice for the destitute were eligible to participate in this cross-sectional study. In the SMiLE instrument, respondents individually listed MiL-giving areas before rating their satisfaction with and importance of these areas. Overall indices of satisfaction (IoS, range 0-100), weighting (IoW, range 0-100), and weighted satisfaction (IoWS, range 0-100) were calculated. A Hindi forward-backward translation of the SMiLE was made. Two hundred fifty-eight Indian PC patients took part in the study (response rate 93.5%). Convergent validity of the SMiLE was found with the World Health Organization Quality of Life-Brief version (r = 0.17; P = 0.008) and the Idler Index of Religiosity (public religiousness: r = 0.25, P < 0.001 and private religiousness: r = 0.29, P < 0.001). Indian PC patients' IoW was 65.8 ± 22.1, IoS 68.6 ± 17.4, and IoWS 70.2 ± 17.0. In multivariate analyses of covariance, they differed significantly from German PC patients only in IoW (IoW: 84.8 ± 11.5, P < 0.001; IoS: 70.2 ± 19.7; IoWS: 72.0 ± 19.4). Compared with Germans, Indians more often listed spirituality (P < 0.001) and social commitment (P < 0.001) and less often social relations (P = 0.008). Preliminary results indicate good feasibility and validity of the Hindi version of the SMiLE. MiL experience also seems to be a coping resource for Indian PC patients. Copyright © 2015 American Academy of Hospice
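    The SMiLE indices above combine individually listed areas' importance weights and satisfaction ratings. The following is a plausible sketch of a weighted satisfaction index on 0-100 scales; the published SMiLE scoring applies its own rating transformations, so this is illustrative only:

```python
def weighted_satisfaction_index(importance, satisfaction):
    """Index of weighted satisfaction (IoWS): satisfaction with each
    listed MiL area, weighted by that area's importance (both taken
    here on 0-100 scales)."""
    total_weight = sum(importance)
    return sum(w * s for w, s in zip(importance, satisfaction)) / total_weight

# Hypothetical respondent with three areas (e.g. family, spirituality, work).
iows = weighted_satisfaction_index([90.0, 80.0, 30.0], [75.0, 90.0, 40.0])
```

Areas rated unimportant contribute little even when satisfaction with them is low, which is the point of weighting satisfaction by importance.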

  11. Contribution to the experimental validation of the coupling between a particle accelerator and a subcritical core: Muse-3 and Muse-4 experiments; Contribution a la validation experimentale du couplage entre un accelerateur et un massif sous-critique: experience muse 3. et muse 4

    Energy Technology Data Exchange (ETDEWEB)

    Bompas, C A

    2000-12-01

    As part of the research on hybrid systems, and more specifically on the physical phenomena involved in a sub-critical core coupled with an external source, it is necessary to qualify several neutronic parameters. These parameters characterize, on the one hand, the external source supplying the core with neutrons (importance, amplification) and, on the other hand, the sub-critical core itself (spatial distribution of flux, power emitted from the core, reactivity, influence of a spectrum degraded by the presence of buffers such as lead). The MUSE program consists of parametric studies of configurations with different compositions at different sub-critical levels, supplied by different types of external source. The first part of this work concerns the first analyses of the static results obtained during the third phase of this experimental program (the MUSE-III experiment) and the preparation of the fourth phase (the MUSE-IV experiment). This study notably demonstrated the superiority of a lead transition zone over a sodium zone in terms of neutronic potential (because of the (n,2n) reaction) and source importance. The second part of this work concerns the interpretation of the dynamic results obtained during the MUSE-III experiment and calculations for the MUSE-IV configurations. This study showed the strong impact of hydrogenous materials on the external source in the MUSE-III dynamic results. It also established the applicability of the pulsed neutron source reactivity measurement technique and optimized the position of the monitors for the future MUSE-IV experiment. (authors)
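    The pulsed neutron source reactivity measurement mentioned above is commonly implemented via the Sjöstrand area-ratio method, in which the subcritical reactivity in dollars is the negative ratio of the prompt-neutron area to the delayed-neutron area of the detector response over a pulse period. A minimal sketch with hypothetical count areas:

```python
def reactivity_dollars(prompt_area, delayed_area):
    """Sjostrand (area-ratio) method: for a repetitively pulsed neutron
    source, subcritical reactivity in dollars is minus the ratio of the
    prompt-neutron area to the delayed-neutron (background) area of the
    time-integrated detector response."""
    return -prompt_area / delayed_area

# Hypothetical integrated count areas from a pulse-response histogram.
rho_dollars = reactivity_dollars(prompt_area=3.0e5, delayed_area=1.0e5)
```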

  12. Experimental validation of UTDefect

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, A.S. [ABB Tekniska Roentgencentralen AB, Taeby (Sweden); Bostroem, A.; Wirdelius, H. [Chalmers Univ. of Technology, Goeteborg (Sweden). Div. of Mechanics

    1997-01-01

    This study reports on experiments and computer simulations of ultrasonic nondestructive testing (NDT). Experiments and simulations are compared with the purpose of validating the simulation program UTDefect, which simulates ultrasonic NDT of cracks and some other defects in isotropic and homogeneous materials. Simulations of the detection of surface-breaking cracks are compared with pulse-echo experiments on surface-breaking cracks in carbon steel plates, and the echo dynamics are plotted and compared with the simulations. The experiments are performed on a plate of thickness 36 mm, with crack depths of 7.2 mm and 18 mm. L- and T-probes with frequencies of 1, 2 and 4 MHz and angles of 45, 60 and 70 deg are used. In most cases the probe and the crack are on opposite sides of the plate, but in some cases they are on the same side. Several cracks are scanned from two directions. In total, 53 experiments are reported for 33 different combinations. Generally the simulations agree well with the experiments, and UTDefect is shown to be able, within certain limits, to perform simulations that are close to experiments. It may be concluded that: For corner echoes, the eight 45 deg cases and the eight 60 deg cases show good agreement between experiments and UTDefect, especially for the 7.2 mm crack. The amplitudes differ more for some cases where the defect is close to the probe and for the corner of the 18 mm crack. For the two 70 deg cases there are too few experimental values to compare the curve shapes, but the amplitudes do not differ too much. The tip diffraction echoes also agree well in general. For some cases where the defect is close to the probe, the amplitudes differ by more than 10-15 dB, but for all but two cases the difference in amplitude is less than 7 dB. 6 refs.
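    The amplitude differences quoted above in dB follow the standard decibel convention for amplitude (not power) ratios, i.e. 20·log10 of the ratio:

```python
from math import log10

def amplitude_difference_db(a_sim, a_exp):
    """Difference between simulated and measured echo amplitudes in
    decibels: 20 * log10 of the amplitude ratio."""
    return 20.0 * log10(a_sim / a_exp)
```

For example, a simulated amplitude twice the measured one corresponds to a difference of about 6 dB, so the reported "less than 7 dB" agreement means the amplitudes differ by less than roughly a factor of two.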

  13. Development of multi-dimensional thermal-hydraulic modeling using mixing factors for wire wrapped fuel pin bundles in fast reactors. Validation through a sodium experiment of 169-pin fuel subassembly

    International Nuclear Information System (INIS)

    Nishimura, M.; Kamide, H.; Miyake, Y.

    1997-04-01

    Temperature distributions in the fuel subassemblies of fast reactors interactively affect heat transfer from the center to the outer region of the core (inter-subassembly heat transfer) and the cooling capability of the inter-wrapper flow, as well as the maximum cladding temperature. The prediction of the temperature distribution in a subassembly is therefore one of the important issues for reactor safety assessment. Mixing factors were applied to the multi-dimensional thermal-hydraulic code AQUA to enhance its capability of predicting the maximum cladding temperature in fuel subassemblies. In previous studies, this analytical method had been validated through calculations of sodium experiments using the driver subassembly test rig PLANDTL-DHX with a 37-pin bundle and the blanket subassembly test rig CCTL-CFR with a 61-pin bundle. The errors of the analyses were comparable to the instrumentation errors; the modeling was thus capable of predicting the thermal-hydraulic field in middle-scale subassemblies. Before application to large-scale real subassemblies with more than 217 pins, however, the accuracy of the analytical method had to be checked through calculations of sodium tests in a large-scale pin bundle. Therefore, computations were performed on sodium experiments in a relatively large 169-pin subassembly which had heater pins placed sparsely within the bundle. The analysis succeeded in predicting the experimental temperature distributions. The errors in the temperature rise from the inlet to the maximum values were halved by using mixing factors, compared to analyses without mixing factors. The modeling is thus capable of predicting large-scale real subassemblies. (author)

  14. The online Prescriptive Index platform for the assessment of managerial competencies and coaching needs: development and initial validation of the experience sampling Mood Wheel and the Manager-Rational and Irrational Beliefs Scale

    Directory of Open Access Journals (Sweden)

    David, O.A.

    2013-12-01

    Full Text Available The Prescriptive Index platform is dedicated to the appraisal and development of managerial competencies, and it comprises measures such as the multi-rater Freeman-Gavita Prescriptive Executive Coaching (PEC) Assessment for assessing core managerial skills, and the multi-rater Managerial Coaching Assessment System (MCAS) for the evaluation of coaching competencies in managers. The aim of this research was to present the development and psychometric properties of new tools within the Prescriptive Index platform for the assessment of managerial emotional competencies: the web- and mobile-based Mood Wheel measure, using experience-sampling procedures, for the assessment of current/previous distress and positive emotions; and the self-report Manager Rational and Irrational Beliefs Scale (M-RIBS) for the assessment of managerial attitudes involved in emotion-regulation processes. The results show that both instruments integrated in the Prescriptive Index platform have adequate initial psychometric support and predictive validity. Practical implications of our findings are discussed in the light of the importance of enabling organizations to accurately identify managerial competencies and coaching needs.

  15. King Cove, Alaska Coastal Digital Elevation Model

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA's National Geophysical Data Center (NGDC) is building high-resolution digital elevation models (DEMs) for select U.S. coastal regions. These integrated...

  16. Arena Cove, California Coastal Digital Elevation Model

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA's National Geophysical Data Center (NGDC) is building high-resolution digital elevation models (DEMs) for select U.S. coastal regions. These integrated...

  17. Construct Validity and Case Validity in Assessment

    Science.gov (United States)

    Teglasi, Hedwig; Nebbergall, Allison Joan; Newman, Daniel

    2012-01-01

    Clinical assessment relies on both "construct validity", which focuses on the accuracy of conclusions about a psychological phenomenon drawn from responses to a measure, and "case validity", which focuses on the synthesis of the full range of psychological phenomena pertaining to the concern or question at hand. Whereas construct validity is…

  18. Validation of the Critical Heat Flux model of COBRA-TF against Post-Dryout experiments performed by the Royal Institute of Technology (KTH); Validacion del Modelo de Critical Heat Flux de COBRA-TF frente a los Experimentos de Post-Dryout realizados por el Royal Institute of Technology (KTH)

    Energy Technology Data Exchange (ETDEWEB)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.

    2014-07-01

    This work presents a validation of the results obtained with the different existing correlations in the CTF code for predicting the value and location of CHF, applying them to the Post-Dryout experiments conducted by the Royal Institute of Technology (KTH) in Stockholm, Sweden. (Author)

  19. Further Validation of the Coach Identity Prominence Scale

    Science.gov (United States)

    Pope, J. Paige; Hall, Craig R.

    2014-01-01

    This study was designed to examine select psychometric properties of the Coach Identity Prominence Scale (CIPS), including the reliability, factorial validity, convergent validity, discriminant validity, and predictive validity. Coaches (N = 338) who averaged 37 (SD = 12.27) years of age, had a mean of 13 (SD = 9.90) years of coaching experience,…

  20. Validação de métodos cromatográficos de análise: um experimento de fácil aplicação utilizando cromatografia líquida de alta eficiência (CLAE) e os princípios da "Química Verde" na determinação de metilxantinas em bebidas Validation of chromatographic methods: an experiment using HPLC and Green Chemistry in methylxanthines determination

    Directory of Open Access Journals (Sweden)

    Nádia Machado de Aragão

    2009-01-01

    Full Text Available The validation of analytical methods is an important step in quality control. The main objective of this study is to propose an HPLC experiment for verifying the validation parameters of chromatographic methods, based on green chemistry principles, that can be used in laboratory courses in chemistry and related areas.

  1. Validation: an overview of definitions

    International Nuclear Information System (INIS)

    Pescatore, C.

    1995-01-01

    The term validation features prominently in the literature on high-level radioactive waste disposal and is generally understood to relate to model testing against experiments. In a first class of definitions, validation is linked to the goal of predicting the physical world as faithfully as possible; this goal, however, is unattainable and unsuitable for setting goals for safety analyses. In a second class, validation is associated with split-sampling or blind-test predictions. In a third class, validation focuses on the quality of the decision-making process. Most prominent in the present review is the observed lack of use of the term validation in the field of low-level radioactive waste disposal. The continued informal use of the term in the field of high-level waste disposal can become a cause of misperception and endless speculation. The paper proposes either abandoning the use of this term or agreeing on a definition common to all. (J.S.). 29 refs

  2. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available; the codes can then be ranked in order of merit. Such a method is described. (Author)
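    One simple quantitative merit measure of the kind this abstract calls for (not necessarily the author's own method) is the RMS relative deviation between prediction and experiment, which lets competing codes be ranked in order of merit. All numbers below are hypothetical:

```python
from math import sqrt

def rms_relative_error(predicted, measured):
    """Root-mean-square relative deviation between code predictions
    and the corresponding experimental values."""
    terms = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
    return sqrt(sum(terms) / len(terms))

def rank_codes(predictions, measured):
    """Rank competing codes best-first (smallest RMS relative error).
    predictions maps code name -> list of predicted values."""
    return sorted(predictions,
                  key=lambda code: rms_relative_error(predictions[code], measured))

measured = [1.0, 2.0, 4.0]
ranking = rank_codes({"codeA": [1.1, 2.2, 4.4],   # 10% high everywhere
                      "codeB": [1.2, 2.4, 4.8]},  # 20% high everywhere
                     measured)
```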

  3. GPM GROUND VALIDATION CITATION VIDEOS IPHEX V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GPM Ground Validation Citation Videos IPHEx data were collected during the Integrated Precipitation and Hydrology Experiment (IPHEx) in the Southern...

  4. GPM Ground Validation Southern Appalachian Rain Gauge IPHEx V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GPM Ground Validation Southern Appalachian Rain Gauge IPHEx dataset was collected during the Integrated Precipitation and Hydrology Experiment (IPHEx) field...

  5. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  6. Validation of an improved anaplasma antibody cELISA kit for detection of anaplasma ovis antibody in domestic sheep at the U.S. Sheep Experiment Station in Dubois, ID

    Science.gov (United States)

    An accurate and simple-to-perform new version of a competitive ELISA (cELISA) kit that became commercially available in 2015 for testing of cattle for antibody to Anaplasma marginale was validated for detection of Anaplasma ovis antibody in domestic sheep. True positives and negatives were identifie...

  7. 78 FR 26359 - Community of Elfin Cove, DBA Elfin Cove Utility Commission; Notice of Preliminary Permit...

    Science.gov (United States)

    2013-05-06

    ...-kilowatt (kW) power recovery turbine; (4) a 25-foot-long, 8-foot- wide, 3-foot-deep cobble-lined tailrace... 150-foot-long, 8- foot-wide, 3-foot-deep cobble-lined tailrace discharging flows into Port Althorp... electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site...

  8. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction, classification, time-series forecasting, and modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allows insight......) is critically necessary for the inclusion of the sampling errors incurred in all 'future' situations in which the validated model must perform. Logically, therefore, all one-data-set re-sampling approaches for validation, especially cross-validation and leverage-corrected validation, should be terminated...

  9. THX Experiment Overview

    Science.gov (United States)

    Wernet, Mark; Wroblewski, Adam; Locke, Randy; Georgiadis, Nick

    2016-01-01

    This presentation provides an overview of experiments conducted at NASA GRC to provide turbulent flow measurements needed for new turbulence model development and validation. The experiments include particle image velocimetry (PIV) and hot-wire measurements of mean flow velocity and temperature fields, as well as fluctuating components.

  10. Assessment of juveniles testimonies’ validity

    Directory of Open Access Journals (Sweden)

    Dozortseva E.G.

    2015-12-01

    Full Text Available The article presents a review of English-language publications concerning the history and current state of differential psychological assessment of the validity of testimony produced by child and adolescent victims of crime. The problem is highly topical in Russia because Russian specialists tend to use methodological instruments developed abroad for forensic assessments of witness testimony veracity. A system of Statement Validity Analysis (SVA) by means of Criteria-Based Content Analysis (CBCA) and a Validity Checklist is described. The results of laboratory and field studies of the validity of CBCA criteria for child and adult witnesses are discussed. The data show that the method has good differentiating capacity, but also a high probability of error. The researchers recommend applying SVA in the criminal investigation process, but not in forensic assessment. New promising developments in methods for distinguishing witness statements based on real experience from fictional ones are noted. The conclusion is drawn that empirical studies and dedicated work to adapt and develop new approaches should precede their implementation in Russian criminal investigation and forensic assessment practice

  11. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model, and it can also guide the design of subsequent experiments. Three steps can be clearly differentiated: (1) Sensitivity analysis, which can be made with DSA (differential sensitivity analysis) and MCSA (Monte-Carlo sensitivity analysis). (2) Searching the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find these domains. (3) Residual analysis, performed in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings is presented, studying the behaviour of building components in a Test Cell of the LECE of CIEMAT (Spain). (Author) 17 refs
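
    The MCSA step can be sketched as follows; the toy model, parameter names, and ranges below are hypothetical stand-ins for a detailed building simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(u_wall, u_window, solar_gain):
    # Toy thermal balance: indoor temperature for a test cell
    # (an illustrative stand-in for a detailed simulation model).
    return 20.0 + solar_gain / (u_wall * 8.0 + u_window * 2.0)

# Monte-Carlo sensitivity analysis: sample the input parameters,
# run the model, and rank inputs by |correlation| with the output.
n = 2000
inputs = {
    "u_wall": rng.uniform(0.2, 0.6, n),         # W/m2K
    "u_window": rng.uniform(1.0, 3.0, n),       # W/m2K
    "solar_gain": rng.uniform(50.0, 300.0, n),  # W
}
output = model(inputs["u_wall"], inputs["u_window"], inputs["solar_gain"])

sensitivity = {
    name: abs(np.corrcoef(values, output)[0, 1])
    for name, values in inputs.items()
}
for name, s in sorted(sensitivity.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.2f}")
```

    Parameters whose correlation magnitude is small are candidates for fixing at nominal values before the residual-analysis step.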

  12. Experimenting with a design experiment

    Directory of Open Access Journals (Sweden)

    Bakker, Judith

    2012-12-01

    Full Text Available The design experiment is an experimental research method that aims to help design and further develop new (policy) instruments. We are experimenting with this method in developing a set of guidelines for the facilitation of citizens' initiatives by local governments. It offers good opportunities for modeling interventions by testing their instrumental validity, i.e. their usefulness for the intended practical purposes. At the same time, design experiments are also useful for evaluating the empirical validity of theoretical arguments and for developing those arguments further in the light of empirical evidence (e.g. by using the technique of pattern matching). We describe how we have applied this methodology in two cases and discuss our research approach. We encountered some unexpected difficulties, especially in the cooperation with professionals and citizens. These difficulties complicate the valid attribution of causal effects to the use of the new instrument. However, our preliminary conclusion is that design experiments are useful in our field of study

    The design experiment is an experimental research method that aims to design and subsequently develop new (policy) instruments. In this article we experiment with this method to develop a set of guidelines that allow local governments to facilitate citizens' initiatives. The method offers the opportunity to model interventions by testing their instrumental validity (their usefulness for the practical purpose they pursue). At the same time, design experiments are also useful for evaluating the empirical validity of theoretical arguments and for developing those arguments further in the light of empirical evidence (using, for example, pattern-matching techniques). We describe how we have applied this method to two cases and discuss our research approach

  13. Validation of models with multivariate output

    International Nuclear Information System (INIS)

    Rebba, Ramesh; Mahadevan, Sankaran

    2006-01-01

    This paper develops metrics for validating computational models with experimental data, considering uncertainties in both. A computational model may generate multiple response quantities and the validation experiment might yield corresponding measured values. Alternatively, a single response quantity may be predicted and observed at different spatial and temporal points. Model validation in such cases involves comparison of multiple correlated quantities. Multiple univariate comparisons may give conflicting inferences. Therefore, aggregate validation metrics are developed in this paper. Both classical and Bayesian hypothesis testing are investigated for this purpose, using multivariate analysis. Since commonly used statistical significance tests are based on normality assumptions, appropriate transformations are investigated in the case of non-normal data. The methodology is implemented to validate an empirical model for energy dissipation in lap joints under dynamic loading
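
    An aggregate metric of this kind can be illustrated, under a normality assumption and a known measurement covariance, by a Mahalanobis-distance test of the prediction-measurement difference; all numbers below are hypothetical:

```python
import numpy as np

# Aggregate validation metric for multivariate model output: the squared
# Mahalanobis distance between predicted and measured response vectors,
# compared with a chi-square critical value (classical hypothesis test).
prediction = np.array([1.02, 0.48, 2.10])   # model responses (hypothetical)
measurement = np.array([0.98, 0.52, 2.05])  # corresponding measurements

# Covariance of the measurement uncertainty (assumed known; diagonal here,
# but a full matrix would account for correlation between the quantities).
cov = np.diag([0.05, 0.04, 0.08]) ** 2

d = prediction - measurement
d2 = d @ np.linalg.inv(cov) @ d  # squared Mahalanobis distance

chi2_crit_3dof_95 = 7.815  # chi-square 95th percentile, 3 degrees of freedom
print(f"D^2 = {d2:.2f}, accept model: {d2 < chi2_crit_3dof_95}")
```

    A single aggregate test like this avoids the conflicting inferences that three separate univariate comparisons could produce.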

  14. Validity in Qualitative Evaluation

    OpenAIRE

    Vasco Lub

    2015-01-01

    This article provides a discussion on the question of validity in qualitative evaluation. Although validity in qualitative inquiry has been widely reflected upon in the methodological literature (and is still often subject of debate), the link with evaluation research is underexplored. Elaborating on epistemological and theoretical conceptualizations by Guba and Lincoln and Creswell and Miller, the article explores aspects of validity of qualitative research with the explicit objective of con...

  15. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid

  16. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop a methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.
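
    The roll-up idea, relating discrepancies observed across a validation hierarchy to the application's conditions through a meta-model, can be sketched minimally; the linear form, the conditions, and the discrepancy values below are illustrative assumptions, not the report's actual meta-model:

```python
import numpy as np

# Sketch of rolling up validation results: fit a simple meta-model of the
# model-vs-experiment discrepancy across the hierarchy's test conditions,
# then extrapolate the expected discrepancy to the application condition.
cond = np.array([300.0, 400.0, 500.0, 600.0])  # experiment temperatures (K)
bias = np.array([0.8, 1.1, 1.5, 1.7])          # predicted - measured response

# Linear meta-model bias(T) ~ a*T + b, fit by least squares.
a, b = np.polyfit(cond, bias, 1)

T_app = 700.0  # application condition, outside the tested range
bias_app = a * T_app + b
print(f"expected bias at {T_app:.0f} K: {bias_app:.2f}")
```

    The design question the paper raises shows up directly here: how far T_app sits from the tested conditions (incomplete coverage) governs how much trust the extrapolated discrepancy deserves.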

  17. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  18. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  19. Intercomparison of liquid metal fast reactor seismic analysis codes. V.1: Validation of seismic analysis codes using reactor core experiments. Proceedings of a research co-ordination meeting held in Vienna, 16-17 November 1993

    International Nuclear Information System (INIS)

    1995-05-01

    The Research Co-ordination Meeting held in Vienna, 16-17 November 1993, was attended by participants from France, India, Italy, Japan and the Russian Federation. The meeting was held to discuss and compare the results obtained by various organizations for the analysis of Italian tests on PEC mock-up. The background paper by A. Martelli, et al., Italy, entitled Fluid-Structure Interaction Experiments of PEC Core Mock-ups and Numerical Analysis Performed by ENEA presented details on the Italian PEC (Prova Elementi di Combustibile, i.e. Fuel Element Test Facility) test data for the benchmark. Several papers were presented on the analytical investigations of the PEC reactor core experiments. The paper by M. Morishita, Japan, entitled Seismic Response Analysis of PEC Reactor Core Mock-up, gives a brief review of the Japanese data on the Monju mock-up core experiment which had been distributed to the participating countries through the IAEA. Refs, figs and tabs

  20. Construction and validation of Experiences Questionnaire on Violence in Couple and Family Relations in University Students [Desarrollo del Cuestionario de Experiencias de Violencia en las Relaciones de Pareja y Familia en Estudiantes Universitarios

    Directory of Open Access Journals (Sweden)

    Angel A. Villafañe Santiago

    2012-03-01

    Full Text Available This study describes the process of developing the Experiences of Violence in Couple and Family Relationships in University Students Questionnaire, its psychometric properties, and the results of the pilot study. The research design was non-experimental, cross-sectional, and correlational. The nonrandomized sample consisted of 267 students. The final version of the questionnaire consisted of 41 items and four subscales, which measured experiences of violence in a relationship as an aggressor and as a victim, violence observed between the parents, and violence in the parent-child relationship as a victim. The total scale and the subscales obtained adequate reliability indexes. On average, the sample reported ten experiences of violence in different contexts. The results of this study contribute data on the prevalence of violence in college students' romantic and family relationships, which in turn provide valuable information for planning prevention and early intervention efforts with this population.
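
    Reliability indexes of the kind reported for such subscales are commonly Cronbach's alpha; a minimal sketch on synthetic item scores follows (the data, the four-item structure, and the scoring are hypothetical, not taken from the questionnaire itself):

```python
import numpy as np

def cronbach_alpha(items):
    # Internal-consistency reliability: alpha = k/(k-1) * (1 - sum of item
    # variances / variance of the total score), items shaped (respondents, k).
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Synthetic subscale: four items driven by one latent trait plus noise.
rng = np.random.default_rng(2)
latent = rng.normal(size=(100, 1))
scores = latent + 0.5 * rng.normal(size=(100, 4))

alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")
```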

  1. Site characterization and validation - Inflow to the validation drift

    International Nuclear Information System (INIS)

    Harding, W.G.C.; Black, J.H.

    1992-01-01

    Hydrogeological experiments have had an essential role in the characterization of the drift site on the Stripa project. This report focuses on the methods employed and the results obtained from inflow experiments performed on the excavated drift in stage 5 of the SCV programme. Inflows were collected in sumps on the floor, in plastic sheeting on the upper walls and ceiling, and measured by means of differential humidity of ventilated air at the bulkhead. Detailed evaporation experiments were also undertaken on uncovered areas of the excavated drift. The inflow distribution was determined on the basis of a system of roughly equal sized grid rectangles. The results have highlighted the overriding importance of fractures in the supply of water to the drift site. The validation drift experiment has revealed that in excess of 99% of inflow comes from a 5 m section corresponding to the 'H' zone, and that as much as 57% was observed coming from a single grid square (267). There was considerable heterogeneity even within the 'H' zone, with 38% of such samples areas yielding no flow at all. Model predictions in stage 4 underestimated the very substantial declines in inflow observed in the validation drift when compared to the SDE; this was especially so in the 'good' rock areas. Increased drawdowns in the drift have generated less flow and reduced head responses in nearby boreholes by a similar proportion. This behaviour has been the focus for considerable study in the latter part of the SCV project, and a number of potential processes have been proposed. These include 'transience', stress redistribution resulting from the creation of the drift, chemical precipitation, blast-induced dynamic unloading and related gas intrusion, and degassing. (au)

  2. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. 
A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
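
    The verification strategy described, measuring a numerical solution's accuracy on systematically refined grids, is commonly carried out with an observed-order-of-accuracy calculation and Richardson extrapolation; the three grid values below are hypothetical:

```python
import math

# Verification sketch: observed order of accuracy and Richardson
# extrapolation from solutions on three systematically refined grids
# with refinement ratio r = 2 (values are illustrative, not from the paper).
f_coarse, f_medium, f_fine = 0.9500, 0.9800, 0.9875
r = 2.0

# Observed order p from the ratio of successive solution changes.
p = math.log((f_medium - f_coarse) / (f_fine - f_medium)) / math.log(r)

# Richardson extrapolation to an estimated grid-converged value.
f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)

print(f"observed order p = {p:.2f}")
print(f"extrapolated value = {f_exact:.4f}")
```

    Agreement between the observed order p and the scheme's formal order is one of the numerical-accuracy checks verification is meant to provide.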

  3. Microgravity Flammability Experiments for Spacecraft Fire Safety

    DEFF Research Database (Denmark)

    Legros, Guillaume; Minster, Olivier; Tóth, Balazs

    2012-01-01

    As fire behaviour in manned spacecraft still remains poorly understood, an international topical team has been created to design a validation experiment that has an unprecedented large scale for a microgravity flammability experiment. While the validation experiment is being designed for a re-sup...

  4. Human Factors methods concerning integrated validation of nuclear power plant control rooms; Metodutveckling foer integrerad validering

    Energy Technology Data Exchange (ETDEWEB)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia (Swedish Defence Research Agency, Information Systems, Linkoeping (Sweden))

    2010-02-15

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which measurement methods should be used, and how the methods are affected by changes in the control room. The report raises a number of questions for discussion concerning the validation process. Supplementary measurement methods for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations for the responsibility of external participants in the validation process. The authors propose 12 measures to address the identified problems

  5. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  6. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  7. Validating presupposed versus focused text information.

    Science.gov (United States)

    Singer, Murray; Solar, Kevin G; Spear, Jackie

    2017-04-01

    There is extensive evidence that readers continually validate discourse accuracy and congruence, but that they may also overlook conspicuous text contradictions. Validation may be thwarted when the inaccurate ideas are embedded sentence presuppositions. In four experiments, we examined readers' validation of presupposed ("given") versus new text information. Throughout, a critical concept, such as a truck versus a bus, was introduced early in a narrative. Later, a character stated or thought something about the truck, which therefore matched or mismatched its antecedent. Furthermore, truck was presented as either given or new information. Mismatch target reading times uniformly exceeded the matching ones by similar magnitudes for given and new concepts. We obtained this outcome using different grammatical constructions and with different antecedent-target distances. In Experiment 4, we examined only given critical ideas, but varied both their matching and the main verb's factivity (e.g., factive know vs. nonfactive think). The Match × Factivity interaction closely resembled that previously observed for new target information (Singer, 2006). Thus, readers can successfully validate given target information. Although contemporary theories tend to emphasize either deficient or successful validation, both types of theory can accommodate the discourse and reader variables that may regulate validation.

  8. ASTEC validation on PANDA SETH

    International Nuclear Information System (INIS)

    Bentaib, Ahmed; Bleyer, Alexandre; Schwarz, Siegfried

    2009-01-01

    The ASTEC code, developed jointly by IRSN and GRS, is aimed at providing an integral code for simulating the whole course of severe accidents in light-water reactors. ASTEC is a complex system of codes for reactor safety assessment. In this validation, only the thermal-hydraulic module of the ASTEC code is used. ASTEC is a lumped-parameter code able to represent multi-compartment containments. It uses the following main elements: zones (compartments), junctions (liquid and atmospheric) and structures. The zones are connected by junctions and contain steam, water and non-condensable gases. They exchange heat with structures through different heat transfer regimes: convection, radiation and condensation. This paper presents the validation of ASTEC V1.3 on tests T9 and T9bis of the PANDA OECD/SETH experimental program, which investigated the impact of injection velocity and steam condensation on the plume shape and on the gas distribution. Dedicated meshes were developed to simulate the test facility with the two vessels DW1 and DW2 and the interconnection pipe. The numerical results obtained are analyzed and compared to the experiments. The comparison shows good agreement between experiments and calculations. (author)

  9. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  10. Shielding experiments for accelerator facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi; Tanaka, Susumu; Sakamoto, Yukio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment] [and others]

    2000-06-01

    A series of shielding experiments was carried out using the AVF cyclotron accelerator of TIARA at JAERI in order to validate shielding design methods for accelerator facilities in the intermediate-energy region. In this paper, a neutron transmission experiment through thick shields and a radiation streaming experiment through a labyrinth are reported. (author)

  11. Shielding experiments for accelerator facilities

    International Nuclear Information System (INIS)

    Nakashima, Hiroshi; Tanaka, Susumu; Sakamoto, Yukio

    2000-01-01

    A series of shielding experiments was carried out using the AVF cyclotron accelerator of TIARA at JAERI in order to validate shielding design methods for accelerator facilities in the intermediate-energy region. In this paper, a neutron transmission experiment through thick shields and a radiation streaming experiment through a labyrinth are reported. (author)

  12. Validity in Qualitative Evaluation

    Directory of Open Access Journals (Sweden)

    Vasco Lub

    2015-12-01

    Full Text Available This article provides a discussion on the question of validity in qualitative evaluation. Although validity in qualitative inquiry has been widely reflected upon in the methodological literature (and is still often subject of debate, the link with evaluation research is underexplored. Elaborating on epistemological and theoretical conceptualizations by Guba and Lincoln and Creswell and Miller, the article explores aspects of validity of qualitative research with the explicit objective of connecting them with aspects of evaluation in social policy. It argues that different purposes of qualitative evaluations can be linked with different scientific paradigms and perspectives, thus transcending unproductive paradigmatic divisions as well as providing a flexible yet rigorous validity framework for researchers and reviewers of qualitative evaluations.

  13. Cross validation in LULOO

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Nørgård, Peter Magnus; Hansen, Lars Kai

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. Linear unlearning of examples has recently been suggested as an approach to approximative cross-validation. Here we briefly review the linear unlearning scheme, dubbed LULOO, and we illustrate it on a system identification example. Further, we address the possibility of extracting confidence information (error bars) from the LULOO ensemble.
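
    For linear-in-parameters models, leave-one-out residuals are available in closed form with no retraining at all, which is the effect that linear unlearning approximates for neural networks; a minimal numpy sketch on synthetic data:

```python
import numpy as np

# For ordinary least squares, the leave-one-out residual is exactly
# e_loo_i = e_i / (1 - h_ii), where h_ii is the i-th diagonal entry of the
# hat matrix H = X (X^T X)^-1 X^T. No model needs to be refit.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.normal(size=30)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=30)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
H = X @ np.linalg.inv(X.T @ X) @ X.T
e = y - X @ beta
e_loo = e / (1.0 - np.diag(H))  # closed-form leave-one-out residuals

# Brute-force check for one held-out point: refit without point i and
# compare its prediction error to the closed-form residual.
i = 0
mask = np.arange(len(y)) != i
beta_i = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
print(abs((y[i] - X[i] @ beta_i) - e_loo[i]) < 1e-10)  # True
```

    The spread of the e_loo values is one source of the confidence information (error bars) the abstract mentions.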

  14. Transient FDTD simulation validation

    OpenAIRE

    Jauregui Tellería, Ricardo; Riu Costa, Pere Joan; Silva Martínez, Fernando

    2010-01-01

    In computational electromagnetic simulations, most validation methods have until now been developed for use in the frequency domain. However, frequency-domain EMC analysis is often not enough to evaluate the immunity of current communication devices. Based on several studies, in this paper we propose an alternative method for validating transients in the time domain, allowing a rapid and objective quantification of the simulation results.

  15. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model
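
    A comparison of model predictions with independent observations of the kind planned here is often summarized by predicted-to-observed ratios; the sketch below uses hypothetical values, not HEDR data:

```python
import numpy as np

# Predicted-to-observed (P/O) ratios summarized by their geometric mean
# (overall bias) and geometric standard deviation (scatter): a common way
# to condense a model-vs-measurement comparison into two numbers.
predicted = np.array([1.2, 0.8, 2.5, 1.6, 0.9])  # model estimates (hypothetical)
observed = np.array([1.0, 1.0, 2.0, 1.5, 1.1])   # field measurements (hypothetical)

log_ratio = np.log(predicted / observed)
geo_mean = np.exp(log_ratio.mean())      # 1.0 means no overall bias
geo_sd = np.exp(log_ratio.std(ddof=1))   # 1.0 means no scatter
print(f"geometric bias {geo_mean:.2f}, geometric scatter {geo_sd:.2f}")
```

    As the abstract notes for the test collection as a whole, no single ratio validates the model; the bias and scatter across many such comparisons build the level of confidence.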

  16. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods are weak in completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Then, complete testing scenarios are produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.
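
    A qualitative trend produced by positive inference on a signed directed graph can be sketched as follows; the graph, node names, and edge signs are hypothetical, not the paper's reactor model:

```python
# Minimal qualitative inference on a signed directed graph: propagate a
# disturbance's sign (+1 or -1) along signed edges to produce a qualitative
# trend scenario, which could then be compared with simulation output.
edges = {  # (source, target): edge sign
    ("feed_rate", "level"): +1,
    ("level", "outflow"): +1,
    ("cooling", "temperature"): -1,
    ("temperature", "pressure"): +1,
}

def propagate(disturbance, sign):
    trends = {disturbance: sign}
    frontier = [disturbance]
    while frontier:
        node = frontier.pop()
        for (src, dst), s in edges.items():
            if src == node and dst not in trends:
                trends[dst] = trends[node] * s  # sign multiplies along the edge
                frontier.append(dst)
    return trends

print(propagate("cooling", +1))  # {'cooling': 1, 'temperature': -1, 'pressure': -1}
```

    Enumerating such scenarios for each possible disturbance yields the "complete testing scenarios" against which simulation outputs are checked.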