WorldWideScience

Sample records for criticality validation suite

  1. Validation suite for MCNP

    International Nuclear Information System (INIS)

    Mosteller, Russell D.

    2002-01-01

    Two validation suites, one for criticality and another for radiation shielding, have been defined and tested for the MCNP Monte Carlo code. All of the cases in the validation suites are based on experiments so that calculated and measured results can be compared in a meaningful way. The cases in the validation suites are described, and results from those cases are discussed. For several years, the distribution package for the MCNP Monte Carlo code has included an installation test suite to verify that MCNP has been installed correctly. However, the cases in that suite have been constructed primarily to test options within the code and to execute quickly. Consequently, they do not produce well-converged answers, and many of them are physically unrealistic. To remedy these deficiencies, sets of validation suites are being defined and tested for specific types of applications. All of the cases in the validation suites are based on benchmark experiments. Consequently, the results from the measurements are reliable and quantifiable, and calculated results can be compared with them in a meaningful way. Currently, validation suites exist for criticality and radiation-shielding applications.
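
    What such a suite enables, in practice, is a case-by-case comparison of calculated and measured k_eff. A minimal sketch of that bookkeeping follows; the case names and numbers are illustrative placeholders, not data from the MCNP suite.

```python
# Sketch: compare calculated k_eff against benchmark values, as a
# criticality validation suite does. Numbers are illustrative only.
cases = [
    # (name, k_calc, sigma_calc, k_bench, sigma_bench)
    ("heu-met-fast-001", 0.99972, 0.00010, 1.00000, 0.00100),
    ("leu-comp-therm-008", 1.00135, 0.00012, 1.00070, 0.00160),
]

for name, k_c, s_c, k_b, s_b in cases:
    bias = k_c - k_b                     # calculated minus experimental (C - E)
    sigma = (s_c**2 + s_b**2) ** 0.5     # combined 1-sigma uncertainty
    print(f"{name}: C-E = {bias:+.5f} ({bias / sigma:+.2f} sigma)")
```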

  2. Comparison of results from the MCNP criticality validation suite using ENDF/B-VI and preliminary ENDF/B-VII nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Mosteller, R. D. (Russell D.)

    2004-01-01

    The MCNP Criticality Validation Suite is a collection of 31 benchmarks taken from the International Handbook of Evaluated Criticality Safety Benchmark Experiments. MCNP5 calculations clearly demonstrate that, overall, nuclear data for a preliminary version of ENDF/B-VII produce better agreement with the benchmarks in the suite than do corresponding data from ENDF/B-VI. Additional calculations identify areas where improvements in the data still are needed. Based on results for the MCNP Criticality Validation Suite, the pre-ENDF/B-VII nuclear data produce substantially better overall results than do their ENDF/B-VI counterparts. The calculated values of k_eff for bare metal spheres and for an IEU cylinder reflected by normal uranium are in much better agreement with the benchmark values. In addition, the values of k_eff for the bare metal spheres are much more consistent with those for corresponding metal spheres reflected by normal uranium or water. In addition, a long-standing controversy about the need for an ad hoc adjustment to the 238U resonance integral for thermal systems may finally be resolved. On the other hand, improvements still are needed in a number of areas. Those areas include intermediate-energy cross sections for 235U, angular distributions for elastic scattering in deuterium, and fast cross sections for 237Np.

  3. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified, and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  4. Engineering Software Suite Validates System Design

    Science.gov (United States)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed, using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  5. OpenMP 4.5 Validation and Verification Suite

    Energy Technology Data Exchange (ETDEWEB)

    2017-12-15

    OpenMP, a directive-based programming API, introduces directives for accelerator devices that programmers are starting to use more frequently in production codes. To make sure OpenMP directives work correctly across architectures, it is critical to have a mechanism that tests an implementation's conformance to the OpenMP standard. This testing process can uncover ambiguities in the OpenMP specification, which helps compiler developers and users make better use of the standard. We fill this gap through our validation and verification test suite, which focuses on the offload directives available in OpenMP 4.5.
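
    As a rough illustration of what a conformance suite's driver has to do, the sketch below compiles and runs each offload test and tallies pass/fail results. The compiler command, flags, directory layout, and timeout are assumptions for illustration, not details taken from the suite itself.

```python
# Sketch of a conformance-harness loop: compile each OpenMP 4.5 offload
# test and record its status. Paths and compiler flags are assumed.
import pathlib
import subprocess

COMPILER = ["clang", "-fopenmp", "-fopenmp-targets=nvptx64-nvidia-cuda"]  # assumed
results = {}
for src in sorted(pathlib.Path("tests/target").glob("*.c")):  # assumed layout
    exe = src.with_suffix("")
    if subprocess.run([*COMPILER, str(src), "-o", str(exe)]).returncode != 0:
        results[src.name] = "compile FAIL"
        continue
    try:
        run = subprocess.run([str(exe)], timeout=60)
        results[src.name] = "PASS" if run.returncode == 0 else "runtime FAIL"
    except subprocess.TimeoutExpired:
        results[src.name] = "timeout"

for name, status in results.items():
    print(f"{name}: {status}")
```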

  6. A Test Suite for Safety-Critical Java using JML

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Søndergaard, Hans

    2013-01-01

    Development techniques are presented for a test suite for the draft specification of the Java profile for Safety-Critical Systems. Distinguishing features are: specification of conformance constraints in the Java Modeling Language, encoding of infrastructure concepts without implementation bias, and corresponding specifications of implicitly stated behavioral and real-time properties. The test programs are auto-generated from the specification, while concrete values for test parameters are selected manually. The suite is open source and publicly accessible.

  7. SUIT

    DEFF Research Database (Denmark)

    Algreen-Ussing, Gregers; Wedebrunn, Ola

    2003-01-01

    Leaflet on the SUIT project published by the European Commission. The publication briefly explains the results of the SUIT project: cultural values in environmental questions, and the assessment of projects and their impact on the environment.

  8. Designing a suite of measurements to understand the critical zone

    Science.gov (United States)

    Brantley, Susan L.; DiBiase, Roman A.; Russo, Tess A.; Shi, Yuning; Lin, Henry; Davis, Kenneth J.; Kaye, Margot; Hill, Lillian; Kaye, Jason; Eissenstat, David M.; Hoagland, Beth; Dere, Ashlee L.; Neal, Andrew L.; Brubaker, Kristen M.; Arthur, Dan K.

    2016-03-01

    Many scientists have begun to refer to the earth surface environment from the upper canopy to the depths of bedrock as the critical zone (CZ). Identification of the CZ as an integral object worthy of study implicitly posits that the study of the whole earth surface will provide benefits that do not arise when studying the individual parts. To study the CZ, however, requires prioritizing among the measurements that can be made - and we do not generally agree on the priorities. Currently, the Susquehanna Shale Hills Critical Zone Observatory (SSHCZO) is expanding from a small original focus area (0.08 km2, Shale Hills catchment), to a larger watershed (164 km2, Shavers Creek watershed) and is grappling with the prioritization. This effort is an expansion from a monolithologic first-order forested catchment to a watershed that encompasses several lithologies (shale, sandstone, limestone) and land use types (forest, agriculture). The goal of the project remains the same: to understand water, energy, gas, solute, and sediment (WEGSS) fluxes that are occurring today in the context of the record of those fluxes over geologic time as recorded in soil profiles, the sedimentary record, and landscape morphology. Given the small size of the Shale Hills catchment, the original design incorporated measurement of as many parameters as possible at high temporal and spatial density. In the larger Shavers Creek watershed, however, we must focus the measurements. We describe a strategy of data collection and modeling based on a geomorphological and land use framework that builds on the hillslope as the basic unit. Interpolation and extrapolation beyond specific sites relies on geophysical surveying, remote sensing, geomorphic analysis, the study of natural integrators such as streams, groundwaters or air, and application of a suite of CZ models. We hypothesize that measurements of a few important variables at strategic locations within a geomorphological framework will allow

  9. Validation of Yoon's Critical Thinking Disposition Instrument.

    Science.gov (United States)

    Shin, Hyunsook; Park, Chang Gi; Kim, Hyojin

    2015-12-01

    The lack of reliable and valid evaluation tools targeting Korean nursing students' critical thinking (CT) abilities has been reported as one of the barriers to instructing and evaluating students in undergraduate programs. Yoon's Critical Thinking Disposition (YCTD) instrument was developed for Korean nursing students, but few studies have assessed its validity. This study aimed to validate the YCTD. Specifically, the YCTD was assessed to identify its cross-sectional and longitudinal measurement invariance. This was a validation study in which a cross-sectional and longitudinal (prenursing and postnursing practicum) survey was used to validate the YCTD using 345 nursing students at three universities in Seoul, Korea. The participants' CT abilities were assessed using the YCTD before and after completing an established pediatric nursing practicum. The validity of the YCTD was estimated, and then a group invariance test using multigroup confirmatory factor analysis was performed to confirm the measurement compatibility of the multiple groups. A test of the seven-factor model showed that the YCTD demonstrated good construct validity. Multigroup confirmatory factor analysis findings for the measurement invariance suggested that this model structure demonstrated strong invariance between groups (i.e., configural, factor loading, and intercept combined) but weak invariance within a group (i.e., configural and factor loading combined). In general, traditional methods for assessing instrument validity have been less than thorough. In this study, multigroup confirmatory factor analysis using cross-sectional and longitudinal measurement data allowed validation of the YCTD. This study concluded that the YCTD can be used for evaluating Korean nursing students' CT abilities.

  10. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  11. Validation Of Critical Knowledge-Based Systems

    Science.gov (United States)

    Duke, Eugene L.

    1992-01-01

    Report discusses approach to verification and validation of knowledge-based systems. Also known as "expert systems". Concerned mainly with development of methodologies for verification of knowledge-based systems critical to flight-research systems; e.g., fault-tolerant control systems for advanced aircraft. Subject matter also has relevance to knowledge-based systems controlling medical life-support equipment or commuter railroad systems.

  12. Computerized Italian criticality guide, description and validation

    International Nuclear Information System (INIS)

    Carotenuto, M.; Landeyro, P.A.

    1988-10-01

    Our group is developing an 'expert system' for collecting engineering know-how on back-end nuclear plant design. An expert system is the most suitable software tool for this problem. During the analysis, the design process was divided into different branches, and to each branch of the design process the expert system relates a computerized design procedure. Each design procedure is composed of a set of design methods, together with their conditions of application and reliability limits. In the framework of this expert system, the nuclear criticality safety analysis procedure was developed in the form of a computerized criticality guide that attempts to reproduce the designer's normal 'reasoning' process. The criticality guide is composed of two parts: a computerized text, including theory, a description of accidents that have occurred in the past, and a description of the Italian design experience; and an interactive computer-aided calculation module, containing a graphical facility for critical-parameter curves. This report presents the criticality guide (computerized Italian Criticality Guide) and its validation test. (author)

  13. Computerized Italian criticality guide, description and validation

    Energy Technology Data Exchange (ETDEWEB)

    Carotenuto, M; Landeyro, P A [ENEA - Dipartimento Ciclo del Combustibile, Centro Ricerche Energia, Casaccia (Italy)

    1988-10-15

    Our group is developing an 'expert system' for collecting engineering know-how on back-end nuclear plant design. An expert system is the most suitable software tool for this problem. During the analysis, the design process was divided into different branches, and to each branch of the design process the expert system relates a computerized design procedure. Each design procedure is composed of a set of design methods, together with their conditions of application and reliability limits. In the framework of this expert system, the nuclear criticality safety analysis procedure was developed in the form of a computerized criticality guide that attempts to reproduce the designer's normal 'reasoning' process. The criticality guide is composed of two parts: a computerized text, including theory, a description of accidents that have occurred in the past, and a description of the Italian design experience; and an interactive computer-aided calculation module, containing a graphical facility for critical-parameter curves. This report presents the criticality guide (computerized Italian Criticality Guide) and its validation test. (author)

  14. Lagrangian Stochastic Dispersion Model IMS Model Suite and its Validation against Experimental Data

    International Nuclear Information System (INIS)

    Bartok, J.

    2010-01-01

    The dissertation presents the IMS Lagrangian Dispersion Model, a 'new generation' Slovak dispersion model of long-range transport developed by MicroStep-MIS. It solves the trajectory equation for a vast number of Lagrangian 'particles' and a stochastic equation that simulates the effects of turbulence. The model contains a simulation of radioactive decay (full decay chains of more than 300 nuclides) and of dry and wet deposition. The model was integrated into IMS Model Suite, a system in which several models and modules can run and cooperate, e.g., the limited-area model WRF preparing fine-resolution meteorological data for dispersion. The main theme of the work is validation of the dispersion model against the large-scale international campaigns CAPTEX and ETEX, which are two of the largest tracer experiments. Validation addressed the treatment of missing data and the interpolation of data into a comparable temporal and spatial representation. The best model results were observed for ETEX I, standard results for the CAPTEX releases, and the worst results for ETEX II, known in the modelling community for meteorological conditions that can hardly be resolved by models. The IMS Lagrangian Dispersion Model was identified as a capable long-range dispersion model for slowly reacting or nonreacting chemicals and radioactive matter. The influence of input data on simulation quality is discussed within the work. Additional modules were prepared according to practical requirements: (a) recalculation of concentrations of radioactive pollutants into effective doses from inhalation, immersion in the plume, and deposition; (b) dispersion of mineral dust, added and tested in a desert locality, where wind and soil moisture were first analysed and forecast by WRF. The result was qualitatively verified in a case study against satellite observations. (author)

  15. Validation testing of safety-critical software

    International Nuclear Information System (INIS)

    Kim, Hang Bae; Han, Jae Bok

    1995-01-01

    A software engineering process has been developed for the design of safety-critical software for the Wolsung 2/3/4 project to satisfy the requirements of the regulatory body. Within that process, this paper describes the detailed validation testing performed to ensure that the software with its hardware, developed by the design group, satisfies the requirements of the functional specification prepared by the independent functional group. To perform the tests, a test facility and test software were developed and the actual safety-system computer was connected. Three kinds of test cases, i.e., functional tests, performance tests, and self-check tests, were programmed and run to verify each functional specification. Test failures were fed back to the design group to revise the software, and test results were analyzed and documented in the report submitted to the regulatory body. The test methodology and procedure were very efficient and satisfactory for performing systematic and automatic testing. The test results were also acceptable and successful in verifying that the software acts as specified in the program functional specification. This methodology can be applied to the validation of other safety-critical software. 2 figs., 2 tabs., 14 refs. (Author)

  16. Validation of Vegetation Index Time Series from Suomi NPP Visible Infrared Imaging Radiometer Suite Using Tower Radiation Flux Measurements

    Science.gov (United States)

    Miura, T.; Kato, A.; Wang, J.; Vargas, M.; Lindquist, M.

    2015-12-01

    Satellite vegetation index (VI) time series data serve as an important means to monitor and characterize seasonal changes of terrestrial vegetation and their interannual variability. It is, therefore, critical to ensure the quality of such VI products, and one method of validating VI product quality is cross-comparison with in situ flux tower measurements. In this study, we evaluated the quality of VI time series derived from the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi National Polar-orbiting Partnership (NPP) spacecraft by cross-comparison with in situ radiation flux measurements at select flux tower sites over North America and Europe. VIIRS is a new polar-orbiting satellite sensor series, slated to replace the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer in the afternoon overpass and to continue the highly calibrated data streams initiated with the Moderate Resolution Imaging Spectroradiometer of the National Aeronautics and Space Administration's Earth Observing System. The selected sites covered a wide range of biomes, including croplands, grasslands, evergreen needle forest, woody savanna, and open shrublands. The two VIIRS indices of the top-of-atmosphere (TOA) Normalized Difference Vegetation Index (NDVI) and the atmospherically corrected, top-of-canopy (TOC) Enhanced Vegetation Index (EVI) (daily, 375 m spatial resolution) were compared against the TOC NDVI and a two-band version of EVI (EVI2) calculated from tower radiation flux measurements, respectively. VIIRS and tower VI time series showed comparable seasonal profiles across biomes, with statistically significant correlations (> 0.60), and derived seasonal transition dates were in close agreement (> 0.95), with mean differences of 2.3 days and 5.0 days for the NDVI and the EVI, respectively. These results indicate that VIIRS VI time series can capture the seasonal evolution of the vegetated land surface as well as in situ radiometric measurements do. Future studies that address biophysical or physiological interpretations
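
    For reference, the two index formulas compared in the study are simple functions of red and near-infrared reflectance. A minimal sketch follows; the reflectance values are placeholders, and EVI2 uses the published two-band formulation (Jiang et al., 2008).

```python
# Sketch of the NDVI and two-band EVI (EVI2) formulas; inputs are
# illustrative top-of-canopy reflectances, not tower data.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def evi2(nir, red):
    # Jiang et al. (2008) two-band EVI formulation
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

nir, red = 0.42, 0.08
print(f"NDVI = {ndvi(nir, red):.3f}, EVI2 = {evi2(nir, red):.3f}")
```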

  17. Assessment of the Value, Impact, and Validity of the Jobs and Economic Development Impacts (JEDI) Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    Billman, L.; Keyser, D.

    2013-08-01

    The Jobs and Economic Development Impacts (JEDI) models, developed by the National Renewable Energy Laboratory (NREL) for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), use input-output methodology to estimate gross (not net) jobs and economic impacts of building and operating selected types of renewable electricity generation and fuel plants. This analysis provides the DOE with an assessment of the value, impact, and validity of the JEDI suite of models. While the models produce estimates of jobs, earnings, and economic output, this analysis focuses only on jobs estimates. This validation report includes an introduction to the JEDI models, an analysis of the value and impact of the JEDI models, and an analysis of the validity of job estimates generated by the JEDI models through comparison to other modeled estimates and to empirical, observed jobs data as reported or estimated for a commercial project, a state, or a region.

  18. Isotopic and criticality validation for actinide-only burnup credit

    International Nuclear Information System (INIS)

    Fuentes, E.; Lancaster, D.; Rahimi, M.

    1997-01-01

    The techniques used for actinide-only burnup credit isotopic validation and criticality validation are presented and discussed. Trending analyses have been incorporated into both methodologies, requiring biases and uncertainties to be treated as a function of the trending parameters. The isotopic validation is demonstrated using the SAS2H module of SCALE 4.2, with the 27BURNUPLIB cross section library; correction factors are presented for each of the actinides in the burnup credit methodology. For the criticality validation, the demonstration is performed with the CSAS module of SCALE 4.2 and the 27BURNUPLIB, resulting in a validated upper safety limit

  19. Aeromedical Evacuation Enroute Critical Care Validation Study

    Science.gov (United States)

    2015-02-27

    Only fragments of the abstract are available: "… percentile TP, suggesting that TPs assumed complex postures to accomplish patient care tasks. The findings suggest that ergonomic specifications …"; "… bending has been associated with back pain (Guo, 2002). Enhanced medical treatment capabilities (e.g., enroute critical care nurses [ECCN] …"; "… heights, including ergonomic factors such as medic stance and stability and the medic's ability to maneuver into challenging work angles."

  20. Validation of JENDL-3.3 for the HTTR criticality

    International Nuclear Information System (INIS)

    Goto, Minoru; Nojiri, Naoki; Shimakawa, Satoshi

    2004-01-01

    Validation of JENDL-3.3 has been performed for the HTTR criticality using the MVP code with "lattice-cell" infinite models and "whole-core" finite models. It was found that the k_eff values calculated with JENDL-3.3 were about 0.2-0.4% Δk lower than those calculated with JENDL-3.2. The criticality prediction was close to the experimental data from the critical approach of the HTTR. (author)

  1. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for the JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy is illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k_eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should (1) be repeatable, (2) be demonstrated with defined confidence, and (3) identify the range of neutronic conditions (area of applicability) for which they are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with the principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
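
    Broad validation of this kind often includes a trending check: regressing calculated k_eff (or its bias) against a neutronic parameter and asking whether the slope differs from zero. A minimal sketch under assumed data follows; the H/X values and k_eff results are invented for illustration.

```python
# Sketch of a bias-trending fit: calculated k_eff versus an assumed
# moderation ratio (H/X). Data are illustrative, not WSRC results.
import numpy as np

h_x = np.array([0.0, 40.0, 120.0, 250.0, 400.0, 600.0])   # H/X ratio
keff = np.array([0.9991, 0.9988, 0.9979, 0.9983, 0.9971, 0.9969])

slope, intercept = np.polyfit(h_x, keff, 1)
print(f"k_eff(H/X) ~ {intercept:.5f} + {slope:.2e} * H/X")
# A slope significantly different from zero signals a bias trend that
# narrows the validated area of applicability.
```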

  2. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety

  3. Validation of a scenario-based assessment of critical thinking using an externally validated tool.

    Science.gov (United States)

    Buur, Jennifer L; Schmidt, Peggy; Smylie, Dean; Irizarry, Kris; Crocker, Carlos; Tyler, John; Barr, Margaret

    2012-01-01

    With medical education transitioning from knowledge-based curricula to competency-based curricula, critical thinking skills have emerged as a major competency. While there are validated external instruments for assessing critical thinking, many educators have created their own custom assessments of critical thinking. However, the face validity of these assessments has not been challenged. The purpose of this study was to compare results from a custom assessment of critical thinking with the results from a validated external instrument of critical thinking. Students from the College of Veterinary Medicine at Western University of Health Sciences were administered a custom assessment of critical thinking (ACT) examination and the externally validated instrument, the California Critical Thinking Skills Test (CCTST), in the spring of 2011. Total scores and sub-scores from each exam were analyzed for significant correlations using Pearson correlation coefficients. Significant correlations were demonstrated between the ACT Blooms 2 sub-score and deductive reasoning, and between the total ACT score and deductive reasoning, with correlation coefficients of 0.24 and 0.22, respectively. No other statistically significant correlations were found. The lack of significant correlation between the two examinations illustrates the need in medical education to externally validate internal custom assessments. Ultimately, the development and validation of custom assessments of non-knowledge-based competencies will produce higher quality medical professionals.
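
    The reported analysis reduces to computing Pearson coefficients between score vectors. A minimal sketch with invented scores (the values are placeholders, not study data):

```python
# Sketch of the correlation test described above; scores are made up.
from scipy.stats import pearsonr

act_total = [72, 65, 80, 58, 77, 69, 74, 61]        # custom ACT totals
cctst_deductive = [18, 14, 21, 12, 19, 15, 17, 13]  # CCTST sub-scores

r, p = pearsonr(act_total, cctst_deductive)
print(f"r = {r:.2f}, p = {p:.3f}")  # the study reported r of 0.22-0.24
```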

  4. Validation of KENO V.a: Comparison with critical experiments

    International Nuclear Information System (INIS)

    Jordan, W.C.; Landers, N.F.; Petrie, L.M.

    1986-12-01

    Section 1 of this report documents the validation of KENO V.a against 258 critical experiments. Experiments considered were primarily high- or low-enriched uranium systems. The results indicate that the KENO V.a Monte Carlo criticality program accurately calculates a broad range of critical experiments. A substantial number of the calculations showed a positive or negative bias in excess of 1.5% in k-effective (k_eff). Classes of critical experiments which show a bias include 3%-enriched green blocks, highly enriched uranyl fluoride slab arrays, and highly enriched uranyl nitrate arrays. If these biases are properly taken into account, the KENO V.a code can be used with confidence for the design and criticality safety analysis of uranium-containing systems. Section 2 of this report documents the results of an investigation into the cause of the bias observed in Sect. 1. The results of this study indicate that the bias seen in Sect. 1 is caused by code bias, cross-section bias, reporting bias, and modeling bias. There is evidence that many of the experiments used in this validation and in previous validations are not adequately documented. The uncertainty in the experimental parameters overshadows bias caused by the code and cross sections and prohibits code validation to better than about 1% in k_eff. 48 refs., 19 figs., 19 tabs
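
    The class-by-class bias screening described above can be sketched as a simple grouping of calculated k_eff results; the class labels and values below are illustrative stand-ins for the report's data, and the 1.5% threshold echoes the bias level quoted in the abstract.

```python
# Sketch: group calculated k_eff by experiment class and flag classes
# whose mean bias exceeds 1.5% in k_eff. Values are invented.
from collections import defaultdict

results = [
    ("green-block-3pct", 1.0182), ("green-block-3pct", 1.0169),
    ("uranyl-fluoride-slab", 0.9822), ("heu-sphere", 0.9996),
]

by_class = defaultdict(list)
for cls, k in results:
    by_class[cls].append(k - 1.0)   # bias relative to critical

for cls, biases in by_class.items():
    mean = sum(biases) / len(biases)
    flag = "  <-- bias > 1.5%" if abs(mean) > 0.015 else ""
    print(f"{cls}: mean bias = {mean:+.4f}{flag}")
```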

  5. Lecture Notes on Criticality Safety Validation Using MCNP & Whisper

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-03-11

    Lecture notes from training classes for nuclear criticality safety; part of the MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP & Whisper is given: best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies using the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection – c_k's, weights; extreme value theory – bias, bias uncertainty; MOS for nuclear data uncertainty – GLLS) and usage are discussed.
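
    One building block of Whisper's benchmark selection is the correlation coefficient c_k between an application and a benchmark, built from nuclear-data sensitivity vectors and a covariance matrix. A toy sketch of that formula follows; the matrix and vectors are invented three-component stand-ins, whereas Whisper operates on full sensitivity profiles from MCNP.

```python
# Sketch of the c_k similarity measure:
#   c_k = (S_a^T C S_b) / sqrt((S_a^T C S_a)(S_b^T C S_b))
import numpy as np

C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 2.0],
              [0.0, 2.0, 1.0]]) * 1e-6     # toy covariance data
s_app = np.array([0.30, -0.10, 0.05])      # application sensitivities
s_bench = np.array([0.28, -0.08, 0.02])    # benchmark sensitivities

num = s_app @ C @ s_bench
ck = num / np.sqrt((s_app @ C @ s_app) * (s_bench @ C @ s_bench))
print(f"c_k = {ck:.3f}")   # benchmarks with high c_k are most relevant
```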

  6. Validating analysis methodologies used in burnup credit criticality calculations

    International Nuclear Information System (INIS)

    Brady, M.C.; Napolitano, D.G.

    1992-01-01

    The concept of allowing reactivity credit for the depleted (or burned) state of pressurized water reactor fuel in the licensing of spent fuel facilities introduces a new challenge to members of the nuclear criticality community. The primary difference in this analysis approach is the technical ability to calculate spent fuel compositions (or inventories) and to predict their effect on the system multiplication factor. Isotopic prediction codes are used routinely for in-core physics calculations and the prediction of radiation source terms for both thermal and shielding analyses, but represent an innovation for criticality specialists. This paper discusses two methodologies currently being developed to specifically evaluate isotopic composition and reactivity for the burnup credit concept. A comprehensive approach to benchmarking and validating the methods is also presented. This approach involves the analysis of commercial reactor critical data, fuel storage critical experiments, chemical assay isotopic data, and numerical benchmark calculations

  7. Validation of a clinical critical thinking skills test in nursing

    Directory of Open Access Journals (Sweden)

    Sujin Shin

    2015-01-01

    Purpose: The purpose of this study was to develop a revised version of the clinical critical thinking skills test (CCTS) and to subsequently validate its performance. Methods: This study is a secondary analysis of the CCTS. Data were obtained from a convenience sample of 284 college students in June 2011. Thirty items were analyzed using item response theory, and test reliability was assessed. Test-retest reliability was measured using the results of 20 nursing college and graduate school students in July 2013. The content validity of the revised items was analyzed by calculating the degree of agreement between the instrument developer's intention in item development and the judgments of six experts. To analyze response process validity, qualitative data related to the response processes of nine nursing college students, obtained through cognitive interviews, were analyzed. Results: Out of the initial 30 items, 11 items were excluded after the analysis of the difficulty and discrimination parameters. When the 19 items of the revised version of the CCTS were analyzed, levels of item difficulty were found to be relatively low and levels of discrimination were found to be appropriate or high. The degree of agreement between item developer intention and expert judgments equaled or exceeded 50%. Conclusion: From the above results, evidence of response process validity was demonstrated, indicating that subjects responded as intended by the test developer. The revised 19-item CCTS was found to have sufficient reliability and validity and will therefore represent a more convenient measurement of critical thinking ability.
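
    The difficulty and discrimination parameters mentioned above come from an item response theory model; under the common two-parameter logistic (2PL) model, item screening can be sketched as below. The item parameters and the discrimination cutoff are assumptions for illustration, not values from the CCTS analysis.

```python
# Sketch of 2PL-based item screening; parameters and cutoff are assumed.
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta
    (a = discrimination, b = difficulty)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

items = {"item01": (0.35, -1.2), "item02": (1.10, -0.4), "item03": (1.60, 0.3)}

for name, (a, b) in items.items():
    keep = a >= 0.65   # assumed discrimination cutoff
    print(f"{name}: a={a:.2f}, b={b:+.2f}, "
          f"P(theta=0)={p_correct(0, a, b):.2f}, {'keep' if keep else 'drop'}")
```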

  8. Validation of a clinical critical thinking skills test in nursing.

    Science.gov (United States)

    Shin, Sujin; Jung, Dukyoo; Kim, Sungeun

    2015-01-27

    The purpose of this study was to develop a revised version of the clinical critical thinking skills test (CCTS) and to subsequently validate its performance. This study is a secondary analysis of the CCTS. Data were obtained from a convenience sample of 284 college students in June 2011. Thirty items were analyzed using item response theory, and test reliability was assessed. Test-retest reliability was measured using the results of 20 nursing college and graduate school students in July 2013. The content validity of the revised items was analyzed by calculating the degree of agreement between the instrument developer's intention in item development and the judgments of six experts. To analyze response process validity, qualitative data related to the response processes of nine nursing college students, obtained through cognitive interviews, were analyzed. Out of the initial 30 items, 11 items were excluded after the analysis of the difficulty and discrimination parameters. When the 19 items of the revised version of the CCTS were analyzed, levels of item difficulty were found to be relatively low and levels of discrimination were found to be appropriate or high. The degree of agreement between item developer intention and expert judgments equaled or exceeded 50%. From the above results, evidence of response process validity was demonstrated, indicating that subjects responded as intended by the test developer. The revised 19-item CCTS was found to have sufficient reliability and validity and will therefore represent a more convenient measurement of critical thinking ability.

  9. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009, a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of verification and validation (V&V) of flight-critical systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  10. Criticality safety validation of MCNP5 using continuous energy libraries

    International Nuclear Information System (INIS)

    Salome, Jean A.D.; Pereira, Claubia; Assuncao, Jonathan B.A.; Veloso, Maria Auxiliadora F.; Costa, Antonella L.; Silva, Clarysson A.M. da

    2013-01-01

    The study of subcritical systems is very important in the design, installation, and operation of various devices, mainly nuclear reactors and power plants. The information generated by these systems guides the decisions to be taken in the executive project, the economic viability, and the safety measures to be employed in a nuclear facility. By simulating some experiments from the International Handbook of Evaluated Criticality Safety Benchmark Experiments, the code MCNP5 was validated for nuclear criticality analysis. Its continuous-energy libraries were used. The average values and standard deviations (SD) were evaluated. The results obtained with the code are very similar to the values obtained by the benchmark experiments. (author)

  11. Validation of a clinical critical thinking skills test in nursing

    OpenAIRE

    Shin, Sujin; Jung, Dukyoo; Kim, Sungeun

    2015-01-01

    Purpose: The purpose of this study was to develop a revised version of the clinical critical thinking skills test (CCTS) and to subsequently validate its performance. Methods: This study is a secondary analysis of the CCTS. Data were obtained from a convenience sample of 284 college students in June 2011. Thirty items were analyzed using item response theory and test reliability was assessed. Test-retest reliability was measured using the results of 20 nursing college and graduate school stud...

  12. Validation of KENO-based criticality calculations at Rocky Flats

    International Nuclear Information System (INIS)

    Felsher, P.D.; McKamy, J.N.; Monahan, S.P.

    1992-01-01

    In the absence of experimental data, it is necessary to rely on computer-based computational methods in evaluating the criticality condition of a nuclear system. The validity of the computer codes is established in a two-part procedure as outlined in ANSI/ANS 8.1. The first step, usually the responsibility of the code developer, involves verification that the algorithmic structure of the code is performing the intended mathematical operations correctly. The second step involves an assessment of the code's ability to realistically portray the governing physical processes in question. This is accomplished by determining the code's bias, or systematic error, through a comparison of computational results to accepted values obtained experimentally. In this paper, the authors discuss the validation process for KENO and the Hansen-Roach cross sections in use at EG&G Rocky Flats. The validation process at Rocky Flats consists of both global and local techniques. The global validation resulted in a maximum k_eff limit of 0.95 for the limiting-accident scenarios of a criticality evaluation

  13. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e., range of moderation, variation of pellet size, and the fuel double-heterogeneity effect. The APOLLO/PIC method, developed to treat this latter effect, permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown based on three-group microscopic reaction rates. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover, the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism to account for 238U resonance mutual self-shielding in the pellet-fissile liquor interaction. The benchmark exercise has resolved a potentially dangerous inadequacy in dissolver calculations. (author)

  14. Validation issues for depletion and criticality analysis in burnup credit

    International Nuclear Information System (INIS)

    Parks, C.V.; Broadhead, B.L.; Dehart, M.D.; Gauld, I.C.

    2001-01-01

    This paper reviews validation issues associated with implementation of burnup credit in transport, dry storage, and disposal. The issues discussed are ones that have been identified by one or more constituents of the United States technical community (national laboratories, licensees, and regulators) that have been exploring the use of burnup credit. There is not necessarily agreement on the importance of the various issues, which sometimes is what creates the issue. The broad issues relate to the paucity of available experimental data (radiochemical assays and critical experiments) covering the full range and characteristics of spent nuclear fuel in away-from-reactor systems. The paper will also introduce recent efforts initiated at Oak Ridge National Laboratory (ORNL) to provide technical information that can help better assess the value of different experiments. The focus of the paper is on experience with validation issues related to use of burnup credit for transport and dry storage applications. (author)

  15. Critical Evaluation of Validation Rules Automated Extraction from Data

    Directory of Open Access Journals (Sweden)

    David Pejcoch

    2014-10-01

    The goal of this article is to critically evaluate the possibility of automatically extracting rules that could later be used within a Data Quality Management process to validate records newly arriving in an information system. For practical demonstration, the 4FT-Miner procedure implemented in the LISpMiner system was chosen. The motivation for this task is the potential simplification of projects focused on Data Quality Management. The article first critically evaluates the possibility of fully automated extraction, with the aim of identifying the strengths and weaknesses of this approach in comparison to the alternative in which at least some a priori knowledge is available. As a result of a practical implementation, the article provides the design of a recommended process that can serve as a guideline for future projects. The question of how to store and maintain extracted rules, and how to integrate them with existing tools supporting Data Quality Management, is also discussed.

  16. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. The spread of the results in the international calculations amounted to ± 12,000 pcm in the realistic fuel dissolver exercise no. 19 proposed by BNFL, and to ± 25,000 pcm in benchmark no. 20, in which fissile material in solid form is surrounded by fissile material in solution. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e., range of moderation, variation of pellet size, and the fuel double-heterogeneity effect. The APOLLO/PIC method, developed to treat this latter effect, permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown based on three-group microscopic reaction rates solicited from the participants. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover, the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism (NITAWL in the international SCALE package) to account for 238U resonance mutual self-shielding in the pellet-fissile liquor interaction. Improvements in the updated 1990 contributions, together with recent complementary reference calculations (MCNP, VIM, and an ultrafine slowing-down CGM calculation), confirm the need to use rigorous self-shielding methods in criticality design-oriented codes. 6 refs., 11 figs., 3 tabs

  17. Flight critical system design guidelines and validation methods

    Science.gov (United States)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  18. Verification and Validation of Flight-Critical Systems

    Science.gov (United States)

    Brat, Guillaume

    2010-01-01

    For the first time in many years, the NASA budget presented to Congress calls for a focused effort on the verification and validation (V&V) of complex systems. This is mostly motivated by the results of the VVFCS (V&V of Flight-Critical Systems) study, which should materialize as a concrete effort under the Aviation Safety program. This talk will present the results of the study, from requirements coming out of discussions with the FAA and the Joint Planning and Development Office (JPDO), to a technical plan addressing the issue, and its proposed current and future V&V research agenda, which will be addressed by NASA Ames, Langley, and Dryden as well as external partners through NASA Research Announcement (NRA) calls. This agenda calls for pushing V&V earlier in the life cycle and taking advantage of formal methods to increase safety and reduce the cost of V&V. I will present the ongoing research work (especially the four main technical areas: Safety Assurance, Distributed Systems, Authority and Autonomy, and Software-Intensive Systems), possible extensions, and how VVFCS plans on grounding the research in realistic examples, including an intended V&V test-bench based on an Integrated Modular Avionics (IMA) architecture and hosted by Dryden.

  19. Validation of the ABBN/CONSYST constants system. Part 1: Validation through the critical experiments on compact metallic cores

    International Nuclear Information System (INIS)

    Ivanova, T.T.; Manturov, G.N.; Nikolaev, M.N.; Rozhikhin, E.V.; Semenov, M.Yu.; Tsiboulia, A.M.

    1999-01-01

    The worldwide compilation of criticality safety benchmark experiments evaluated through the activity of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) opens new possibilities for validation of the ABBN-93.1 cross-section library for criticality safety analysis. Results of calculations of small assemblies with metal-fuelled cores are presented in this paper. It is concluded that ABBN-93.1 predicts the criticality of such systems with the required accuracy

  20. How to Measure Critical Health Competences: Development and Validation of the Critical Health Competence Test (CHC Test)

    Science.gov (United States)

    Steckelberg, Anke; Hulfenhaus, Christian; Kasper, Jurgen; Rost, Jurgen; Muhlhauser, Ingrid

    2009-01-01

    Consumers' autonomy regarding health increasingly requires competences to critically appraise health information. Critical health literacy refers to the concept of evidence-based medicine. Instruments to measure these competences in curriculum evaluation and surveys are lacking. We aimed to develop and validate an instrument to measure critical…

  1. SCALE system cross-section validation for criticality safety analysis

    International Nuclear Information System (INIS)

    Hathout, A.M.; Westfall, R.M.; Dodds, H.L. Jr.

    1980-01-01

    The purpose of this study is to test selected data from three cross-section libraries for use in the criticality safety analysis of UO2 fuel rod lattices. The libraries, which are distributed with the SCALE system, are used to analyze potential criticality problems which could arise in the industrial fuel cycle for PWR and BWR reactors. Fuel lattice criticality problems could occur in pool storage, dry storage with accidental moderation, shearing and dissolution of irradiated elements, and in fuel transport and storage due to inadequate packing and shipping cask design. The data were tested by using the SCALE system to analyze 25 recently performed critical experiments

  2. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    Science.gov (United States)

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
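
    The computation Ayre and Scally describe can be sketched directly: for a panel of N experts, find the smallest number of "essential" ratings whose one-sided probability under chance agreement (p = 0.5) falls at or below the significance level, then convert it to a CVR value via CVR = (n_e - N/2) / (N/2). This is a sketch of the general approach, not their exact tabulation.

```python
# Sketch: exact-binomial critical values for Lawshe's content validity
# ratio (CVR). For N = 8 panelists this yields 0.75, matching the
# published exact value.
from scipy.stats import binom

def critical_cvr(n_panel, alpha=0.05):
    for n_e in range(n_panel + 1):
        # P(X >= n_e) under chance agreement (p = 0.5)
        if binom.sf(n_e - 1, n_panel, 0.5) <= alpha:
            return (n_e - n_panel / 2) / (n_panel / 2)
    return 1.0

for n in (5, 8, 10, 15, 20):
    print(f"N = {n:2d}: critical CVR = {critical_cvr(n):.3f}")
```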

  3. Literature research concerning alternative methods for validation of criticality calculation systems

    International Nuclear Information System (INIS)

    Behler, Matthias

    2016-05-01

    Besides radiochemical analyses of irradiated fuel and critical experiments, which have become well-established bases for the validation of depletion codes and criticality codes, respectively, the results of oscillation experiments and the operating conditions of power and research reactors can also provide useful information for the validation of the above-mentioned codes. Based on a literature review, the potential of oscillation-experiment measurements for the validation of criticality codes is estimated. It is found that the reactivity measurements for actinides and fission products within the CERES program on the reactors DIMPLE (Winfrith, UK) and MINERVE (Cadarache, France) can be a valuable addition to the commonly used critical experiments for criticality code validation. However, there are approaches but as yet no generally satisfactory solution for integrating the reactivity measurements into a quantitative bias determination for the neutron multiplication factor of typical application cases, including irradiated spent fuel outside reactor cores, calculated using common criticality codes.

  4. Validation of the methodology for calculating the critical position of control rods at the critical facility IPEN/MB-01

    International Nuclear Information System (INIS)

    Lopez Aldama, D.; Rodriguez Gual, R.

    1998-01-01

    The present work intends to validate the models and programs used at the Nuclear Technology Center for calculating the critical position of control rods, by means of analysis of the measurements performed at the critical facility IPEN/MB-01. The lattice calculations were carried out with the WIMS/D4 code, and for the global calculations the diffusion code SNAP-3D was used

  5. Criticality safety validation: Simple geometry, single unit 233U systems

    International Nuclear Information System (INIS)

    Putman, V.L.

    1997-06-01

    Typically used LMITCO criticality safety computational methods are evaluated for suitability when applied to INEEL 233U systems which can reasonably be modeled as simple-geometry, single-unit systems. Sixty-seven critical experiments of uranium highly enriched in 233U, including 57 aqueous-solution, thermal-energy systems and 10 metal, fast-energy systems, were modeled. These experiments include 41 cylindrical and 26 spherical cores, and 41 reflected and 26 unreflected systems. No experiments were found for intermediate-neutron-energy ranges, or with interstitial non-hydrogenous materials typical of waste systems, mixed 233U and plutonium, or reflectors such as steel, lead, or concrete. No simple-geometry experiments were found with cubic or annular cores, or approximating infinite-sea systems. Calculations were performed with various tools and methodologies. Nine cross-section libraries, based on ENDF/B-IV, -V, or -VI.2, or on Hansen-Roach source data, were used with the cross-section processing methods of MCNP or SCALE. The k_eff calculations were performed with the neutral-particle transport and Monte Carlo methods of the criticality codes DANT, MCNP 4A, and KENO V.a

  6. Validation of Nuclear Criticality Safety Software and 27 energy group ENDF/B-IV cross sections

    International Nuclear Information System (INIS)

    Lee, B.L. Jr.

    1994-08-01

    The validation documented in this report is based on calculations that were executed during June through August 1992 and was completed in June 1993. The statistical analyses in Appendix C and Appendix D were completed in October 1993. This validation gives Portsmouth NCS personnel a basis for performing computerized KENO V.a calculations using the Martin Marietta Nuclear Criticality Safety Software (NCSS). The first portion of the document outlines basic information regarding the validation of NCSS using ENDF/B-IV 27-group cross sections on the IBM 3090 at ORNL. A basic discussion of the NCSS system is provided, along with some discussion of the validation database and of validation in general. A detailed description of the statistical analysis which was applied then follows. The results of this validation indicate that the NCSS software may be used with confidence for criticality calculations at the Portsmouth Gaseous Diffusion Plant. When the validation results are treated as a single group, there is 95% confidence that 99.9% of future calculations of similar critical systems will have a calculated k_eff > 0.9616. Based on this result, the Portsmouth Nuclear Criticality Safety Department has adopted the calculational acceptance criterion that a system with k_eff + 2σ ≤ 0.95 is safely subcritical. The validation of NCSS on the IBM 3090 at ORNL was extended to include NCSS on the IBM 3090 at K-25
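
    The "95% confidence that 99.9% of future calculations" statement is a one-sided lower tolerance limit on the population of calculated k_eff values. A sketch of that computation using the standard noncentral-t tolerance factor follows; the k_eff sample here is randomly generated, not the report's data.

```python
# Sketch: 95%/99.9% one-sided lower tolerance limit for calculated k_eff.
import numpy as np
from scipy.stats import nct, norm

def tolerance_factor(n, coverage=0.999, confidence=0.95):
    # K such that mean - K*s bounds `coverage` of the population with
    # `confidence`, via the noncentral t distribution.
    delta = np.sqrt(n) * norm.ppf(coverage)
    return nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)

keff = np.random.default_rng(1).normal(0.998, 0.006, size=50)  # fake sample
L = keff.mean() - tolerance_factor(len(keff)) * keff.std(ddof=1)
print(f"95%/99.9% lower tolerance limit: {L:.4f}")
```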

  7. Use of the MACES IVA Suit for EVA Mobility Evaluations

    Science.gov (United States)

    Watson, Richard D.

    2014-01-01

    The use of an Intra-Vehicular Activity (IVA) suit for a spacewalk or Extra-Vehicular Activity (EVA) was evaluated for mobility and usability in the Neutral Buoyancy Lab (NBL) environment. The Space Shuttle Advanced Crew Escape Suit (ACES) has been modified (MACES) to integrate with the Orion spacecraft. The first several missions of the Orion MPCV spacecraft will not have mass available to carry an EVA-specific suit, so any required EVA will have to be performed in the MACES. Since the MACES was not designed with EVA in mind, it was unknown what mobility the suit could provide for an EVA, or whether a person could perform useful tasks for an extended time inside the pressurized suit. The suit was evaluated in multiple NBL runs by a variety of subjects, including crewmembers with significant EVA experience. Functional mobility tasks performed included translation, body positioning, carrying tools, body stabilization, equipment handling, and use of tools. Hardware configurations included the suit with and without the TMG, the suit with IVA gloves, and the suit with EVA gloves. Most tasks were completed on ISS mockups with existing EVA tools; some limited tasks were completed with prototype tools on a simulated rocky surface. Major findings include demonstration of the ability to weigh out the suit, recognition of the need to have subjects perform multiple runs before collecting feedback, determination of critical sizing factors, and the need to adjust the suit work envelope. The early testing has demonstrated the feasibility of EVAs of limited duration and scope. Further testing with more flight-like tasks and constraints is required to validate these early results. If the suit is used for EVA, it will require mission-specific modifications for umbilical management or PLSS integration, safety tether attachment, and tool interfaces. These evaluations are continuing through calendar year 2014.

  8. A Turkish Version of the Critical-Care Pain Observation Tool: Reliability and Validity Assessment.

    Science.gov (United States)

    Aktaş, Yeşim Yaman; Karabulut, Neziha

    2017-08-01

    The study aim was to evaluate the validity and reliability of the Critical-Care Pain Observation Tool in critically ill patients. A repeated-measures design was used. A convenience sample of 66 patients who had undergone open-heart surgery in the cardiovascular surgery intensive care unit in Ordu, Turkey, was recruited. The patients were evaluated using the Critical-Care Pain Observation Tool at rest, during a nociceptive procedure (suctioning), and 20 minutes after the procedure, while they were conscious and intubated after surgery. The Turkish version of the Critical-Care Pain Observation Tool showed statistically acceptable levels of validity and reliability. Inter-rater reliability was supported by moderate-to-high weighted κ coefficients (0.55 to 1.00). For concurrent validity, significant associations were found between scores on the Critical-Care Pain Observation Tool and Behavioral Pain Scale scores. Discriminant validity was supported by higher scores during suctioning (a nociceptive procedure) than during non-nociceptive procedures. The internal consistency of the Critical-Care Pain Observation Tool was 0.72 during a nociceptive procedure and 0.71 during a non-nociceptive procedure. The validity and reliability of the Turkish version of the Critical-Care Pain Observation Tool were determined to be acceptable for pain assessment in critical care, especially for patients who cannot communicate verbally.
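
    For readers unfamiliar with the weighted κ statistic cited above, the sketch below computes a linearly weighted κ for two hypothetical raters scoring the same patients on an ordinal scale; the scores are invented and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical CPOT total scores (0-8) assigned to the same 10 patients
# by two independent raters; values are illustrative only.
rater_a = np.array([2, 4, 3, 0, 5, 6, 1, 3, 2, 4])
rater_b = np.array([2, 3, 3, 1, 5, 6, 1, 4, 2, 4])

# Linearly weighted kappa penalizes disagreements by their distance,
# which suits ordinal pain scores.
kappa_w = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"Weighted kappa: {kappa_w:.2f}")
```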

  9. Fundamentals of critical analysis: the concept of validity and analysis essentials

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-01-01

    Critical analysis of literature is an assessment process that allows the reader to form an idea of the potential for error in the results of a study, whether arising from bias or from confounding. Critical analysis attempts to establish whether the study meets the expected criteria or methodological conditions. Many checklists are available to guide this analysis, but filling out a checklist is not tantamount to critical appraisal. Internal validity is defined as the extent to which a research finding actually represents the true relationship between exposure and outcome, considering the unique conditions under which the study was carried out. Attention must be given to the inclusion and exclusion criteria, the sampling methods, and the baseline characteristics of the patients enrolled in the study. External validity refers to the possibility of generalizing conclusions beyond the study sample or the study population; it includes population validity and ecological validity. Lastly, the article covers potential threats to external validity that must be considered when analyzing a study.

  10. Space Suit Joint Torque Testing

    Science.gov (United States)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits, in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis, and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified, and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine whether further testing and modification are necessary before the method can be validated.

  11. Validation of Safety-Critical Systems for Aircraft Loss-of-Control Prevention and Recovery

    Science.gov (United States)

    Belcastro, Christine M.

    2012-01-01

    Validation of technologies developed for loss of control (LOC) prevention and recovery poses significant challenges. Aircraft LOC can result from a wide spectrum of hazards, often occurring in combination, which cannot be fully replicated during evaluation. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of hazardous and uncertain conditions, and the validation framework must provide some measure of assurance that the new vehicle safety technologies do no harm (i.e., that they themselves do not introduce new safety risks). This paper summarizes a proposed validation framework for safety-critical systems, provides an overview of validation methods and tools developed by NASA to date within the Vehicle Systems Safety Project, and develops a preliminary set of test scenarios for the validation of technologies for LOC prevention and recovery.

  12. Analysis and evaluation of critical experiments for validation of neutron transport calculations

    International Nuclear Information System (INIS)

    Bazzana, S.; Blaumann, H.; Marquez Damian, J.I.

    2009-01-01

    The calculation schemes, computational codes and nuclear data used in neutronic design require validation to obtain reliable results. In the nuclear criticality safety field, this reliability also translates into a higher level of safety in procedures involving fissile material. The International Criticality Safety Benchmark Evaluation Project is an OECD/NEA activity led by the United States, in which participants from over 20 countries evaluate and publish criticality safety benchmarks. The product of this project is a set of benchmark experiment evaluations that are published annually in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. With the recent participation of Argentina, this information is now available for use by the neutron calculation and criticality safety groups in the country. This work presents the methodology used for the evaluation of experimental data, some results obtained by the application of these methods, and some examples of the data available in the Handbook.

  13. Validation of the Dutch functional, communicative and critical health literacy scales

    NARCIS (Netherlands)

    van der Vaart, R.; Drossaert, Constance H.C.; Taal, Erik; ten Klooster, Peter M.; Hilderink-Koertshuis, Rianne T.E.; Klaase, Joost M.; van de Laar, Mart A F J

    2012-01-01

    Objective: While most existing health literacy (HL) measures focus primarily on reading comprehension, the functional, communicative and critical HL scales from Ishikawa et al. [19] aim to measure a broader HL spectrum. The objective of this study was to evaluate the validity of the Dutch versions of these scales.

  14. Validation of an e-Learning 3.0 Critical Success Factors Framework: A Qualitative Research

    Science.gov (United States)

    Miranda, Paula; Isaias, Pedro; Costa, Carlos J.; Pifano, Sara

    2017-01-01

    Aim/Purpose: As e-Learning 3.0 evolves from a theoretical construct into an actual solution for online learning, it becomes crucial to accompany this progress by scrutinising the elements that are at the origin of its success. Background: This paper outlines a framework of e-Learning 3.0's critical success factors and its empirical validation.…

  15. Validation of SCALE-4 criticality sequences using ENDF/B-V data

    International Nuclear Information System (INIS)

    Bowman, S.M.; Wright, R.Q.; DeHart, M.D.; Taniuchi, H.

    1993-01-01

    The SCALE code system developed at Oak Ridge National Laboratory contains criticality safety analysis sequences that include the KENO V.a Monte Carlo code for calculation of the effective multiplication factor. These sequences are widely used for criticality safety analyses performed both in the United States and abroad. The purpose of the current work is to validate the SCALE-4 criticality sequences with an ENDF/B-V cross-section library for future distribution with SCALE-4. The library used for this validation is a broad-group library (44 groups) collapsed from the 238-group SCALE library. Extensive data testing of both the 238-group and the 44-group libraries included 10 fast and 18 thermal CSEWG benchmarks and 5 other fast benchmarks. Both libraries contain approximately 300 nuclides and are, therefore, capable of modeling most systems, including those containing spent fuel or radioactive waste. The validation of the broad-group library used 93 critical experiments as benchmarks. The range of experiments included 60 light-water-reactor fuel rod lattices, 13 mixed-oxide fuel rod lattices, and 15 other low- and high-enriched uranium critical assemblies.

  16. Sensitivity and uncertainty analyses applied to criticality safety validation. Volume 2

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This report presents the application of the sensitivity and uncertainty (S/U) analysis methodologies developed in Volume 1 to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the existing S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently in use by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The methods for applying S/U and generalized linear-least-squares methodology (GLLSM) tools to criticality safety validation procedures were described in Volume 1 of this report. Volume 2 presents the application of these procedures to the validation of criticality safety analyses supporting uranium operations where enrichments are greater than 5 wt%. Specifically, the traditional k_eff trending analyses are compared with newly developed k_eff trending procedures utilizing the D and c_k coefficients described in Volume 1. These newly developed procedures are applied to a family of postulated systems involving U(11)O2 fuel, with H/X values ranging from 0 to 1,000. These analyses produced guidance and recommendations for the general usage of these various techniques. Recommendations for future work are also detailed.
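
    A minimal sketch of the traditional k_eff trending idea referred to above: regress the calculated bias against a system parameter such as H/X and read the trend at the application's value. All numbers are illustrative, and this is not the report's D/c_k methodology.

```python
import numpy as np

# Illustrative benchmark results: moderation ratio H/X and calculated k_eff
# (made-up numbers standing in for a family of low-enriched systems).
h_over_x = np.array([0., 50., 100., 200., 400., 600., 800., 1000.])
keff     = np.array([0.9968, 0.9975, 0.9981, 0.9979, 0.9964,
                     0.9958, 0.9949, 0.9941])

# Linear trend of the bias (k_eff - 1) versus H/X.
slope, intercept = np.polyfit(h_over_x, keff - 1.0, 1)

application_hx = 300.0  # hypothetical application parameter
trended_bias = slope * application_hx + intercept
print(f"Trended bias at H/X = {application_hx:.0f}: {trended_bias*1e5:.0f} pcm")
```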

  17. Accumulation of Content Validation Evidence for the Critical Thinking Self-Assessment Scale.

    Science.gov (United States)

    Nair, Girija Gopinathan; Hellsten, Laurie-Ann M; Stamler, Lynnette Leeseberg

    2017-04-01

    Critical thinking skills (CTS) are essential for nurses; assessing students' acquisition of these skills is a mandate of nursing curricula. This study aimed to develop a self-assessment instrument of critical thinking skills (the Critical Thinking Self-Assessment Scale [CTSAS]) for students' self-monitoring. An initial pool of 196 items across 6 core cognitive skills and 16 subskills was generated using the American Philosophical Association definition of CTS. Experts' content review of the items and their ratings provided evidence of content relevance using the item-level content validity index (I-CVI) and Aiken's content validity coefficient (VIk). 115 items were retained (range of I-CVI values = .70 to .94, range of VIk values = .69 to .95; significant at p < .05), supporting the use of the CTSAS for self-assessment purposes.
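
    Both content-relevance indices used above are simple functions of an expert rating matrix. The sketch below shows one common formulation of I-CVI (share of experts rating an item 3 or 4 on a 4-point relevance scale) and Aiken's V, with invented ratings.

```python
import numpy as np

# Hypothetical relevance ratings (1-4 scale) from 7 experts for 3 items;
# rows are items, columns are experts. Numbers are illustrative.
ratings = np.array([
    [4, 3, 4, 4, 3, 4, 4],
    [3, 3, 4, 2, 4, 3, 3],
    [4, 4, 4, 4, 4, 3, 4],
])

n_experts, lo, hi = ratings.shape[1], 1, 4

# Item-level content validity index: proportion of experts rating 3 or 4.
i_cvi = (ratings >= 3).mean(axis=1)

# Aiken's V: mean rating rescaled to the [0, 1] interval.
aiken_v = (ratings - lo).sum(axis=1) / (n_experts * (hi - lo))

for i, (cvi, v) in enumerate(zip(i_cvi, aiken_v), start=1):
    print(f"Item {i}: I-CVI = {cvi:.2f}, Aiken's V = {v:.2f}")
```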

  18. International report to validate criticality safety calculations for fissile material transport

    International Nuclear Information System (INIS)

    Whitesides, G.E.

    1984-01-01

    During the past three years, a Working Group established by the Organization for Economic Co-operation and Development's Nuclear Energy Agency (OECD-NEA) in Paris, France, has been studying the validity and applicability of a variety of criticality safety computer programs, and their associated nuclear data, for the computation of the neutron multiplication factor, k_eff, for various transport packages used in the fuel cycle. The principal objective of this work has been to provide an internationally acceptable basis for the licensing authorities in a country to honor licensing approvals granted by other participating countries. Eleven countries participated in the initial study, which consisted of examining criticality safety calculations for packages designed for spent light water reactor fuel transport. This paper presents a summary of this study, which has been completed and reported in OECD-NEA Report No. CSNI-71. The basic goal of this study was to outline a satisfactory validation procedure for this particular application. First, a set of actual critical experiments was chosen which contained the various material and geometric properties present in typical LWR transport containers. Secondly, calculations were made by each of the methods in order to determine how accurately each method reproduced the experimental values. This successful effort in developing a benchmark procedure for validating criticality calculations for spent LWR transport packages, along with the successful intercomparison of a number of methods, should provide increased confidence by licensing authorities in the use of these methods for this area of application. 4 references, 2 figures.

  19. Validity and reliability of The Johns Hopkins Adapted Cognitive Exam for critically ill patients.

    Science.gov (United States)

    Lewin, John J; LeDroux, Shannon N; Shermock, Kenneth M; Thompson, Carol B; Goodwin, Haley E; Mirski, Erin A; Gill, Randeep S; Mirski, Marek A

    2012-01-01

    To validate The Johns Hopkins Adapted Cognitive Exam, designed to assess and quantify cognition in critically ill patients. Prospective cohort study. Neurosciences, surgical, and medical intensive care units at The Johns Hopkins Hospital. One hundred six adult critically ill patients. One expert neurologic assessment and four measurements of the Adapted Cognitive Exam (all patients); four measurements of the Folstein Mini-Mental State Examination in nonintubated patients only. The Adapted Cognitive Exam and Mini-Mental State Examination were performed by 76 different raters. One hundred six patients were assessed, 46 intubated and 60 nonintubated, resulting in 424 Adapted Cognitive Exam and 240 Mini-Mental State Examination measurements. Criterion validity was assessed by comparing the Adapted Cognitive Exam with a neurointensivist's assessment of cognitive status (ρ = 0.83, p < .001). Concurrent validity was assessed by comparing the Adapted Cognitive Exam with the Mini-Mental State Examination in nonintubated patients (ρ = 0.81, p < .001). Face validity was assessed by surveying raters who used both the Adapted Cognitive Exam and Mini-Mental State Examination and who indicated that the Adapted Cognitive Exam was an accurate reflection of the patient's cognitive status, a more sensitive marker of cognition than the Mini-Mental State Examination, and easy to use. The Adapted Cognitive Exam demonstrated excellent inter-rater reliability (intraclass correlation coefficient = 0.997; 95% confidence interval 0.997-0.998) and inter-item reliability for each of the five subscales of the Adapted Cognitive Exam and Mini-Mental State Examination (Cronbach's α: range for Adapted Cognitive Exam = 0.83-0.88; range for Mini-Mental State Examination = 0.72-0.81). The Adapted Cognitive Exam is the first valid and reliable examination for the assessment and quantification of cognition in critically ill patients. It provides a useful, objective tool that can be used by any member of the interdisciplinary critical care team to support

  20. The development and validation of a test of science critical thinking for fifth graders.

    Science.gov (United States)

    Mapeala, Ruslan; Siew, Nyet Moi

    2015-01-01

    The paper described the development and validation of the Test of Science Critical Thinking (TSCT) to measure three critical thinking skill constructs: comparing and contrasting, sequencing, and identifying cause and effect. The initial TSCT consisted of 55 multiple-choice test items, each of which required participants to select a correct response and a correct choice of the critical thinking used for their response. Data were obtained from a purposive sample of 30 fifth graders in a pilot study carried out in a primary school in Sabah, Malaysia. Students underwent teaching and learning activities for 9 weeks using the Thinking Maps-aided Problem-Based Learning Module before they answered the TSCT. Analyses were conducted to check the difficulty index (p), the discrimination index (d), internal consistency reliability, content validity, and face validity. Analysis of the test-retest reliability data was conducted separately for a group of fifth graders with similar ability. Findings of the pilot study showed that, of the initial 55 administered items, only 30 items with a relatively good difficulty index (p), ranging from 0.40 to 0.60, and a good discrimination index (d), ranging from 0.20 to 1.00, were selected. The Kuder-Richardson reliability values were appropriate and relatively high: 0.70, 0.73 and 0.92 for identifying cause and effect, sequencing, and comparing and contrasting, respectively. The content validity index obtained from three expert judgments equalled or exceeded 0.95. In addition, test-retest reliability showed good, statistically significant correlations ([Formula: see text]). From the above results, the selected 30-item TSCT was found to have sufficient reliability and validity and would therefore represent a useful tool for measuring critical thinking ability among fifth graders in primary science.
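
    The classical item statistics quoted above are easy to reproduce. The sketch below computes the difficulty index p (proportion correct) and an upper-lower discrimination index d from a hypothetical 0/1 response matrix, using the conventional 27% extreme groups; all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 0/1 response matrix: 30 pupils x 5 items (illustrative only).
responses = rng.integers(0, 2, size=(30, 5))

total = responses.sum(axis=1)            # each pupil's total score
order = np.argsort(total)
k = int(0.27 * len(total))               # conventional 27% upper/lower groups
lower, upper = order[:k], order[-k:]

p = responses.mean(axis=0)               # difficulty: proportion answering correctly
d = responses[upper].mean(axis=0) - responses[lower].mean(axis=0)  # discrimination

for i, (pi, di) in enumerate(zip(p, d), start=1):
    print(f"Item {i}: p = {pi:.2f}, d = {di:.2f}")
```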

  1. Validity of contents of a paediatric critical comfort scale using mixed methodology.

    Science.gov (United States)

    Bosch-Alcaraz, A; Jordan-Garcia, I; Alcolea-Monge, S; Fernández-Lorenzo, R; Carrasquer-Feixa, E; Ferrer-Orona, M; Falcó-Pegueroles, A

    Critical illness in paediatric patients includes acute conditions in a previously healthy child as well as exacerbations of chronic disease, and these situations must be clinically managed in critical care units. The role of the paediatric nurse is to ensure the comfort of these critically ill patients; to that end, instruments are required that correctly assess critical comfort. The aim was to describe the process of validating the content of a paediatric critical comfort scale using mixed-method research. Initially, a cross-cultural adaptation of the Comfort Behavior Scale from English to Spanish was carried out using the translation and back-translation method. Its content was then evaluated using mixed-method research, divided into a quantitative stage, in which an ad hoc questionnaire was used to assess the relevance and wording of each scale item, and a qualitative stage, consisting of two meetings with health professionals, patients and a family member, following the Delphi method recommendations. All scale items obtained a content validity index >0.80, except the relevance of the physical movement item, which obtained 0.76. Global scale content validity was 0.87 (high). During the qualitative stage, items from each of the scale domains were reformulated or eliminated in order to make the scale more comprehensible and applicable. The use of a mixed-method research methodology during the scale content validity phase allows the design of a richer and more assessment-sensitive instrument.

  2. Sensitivity and uncertainty analyses applied to criticality safety validation, methods development. Volume 1

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Childs, R.L.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The S/U methods presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the D parameters, which represent the group-wise differences between sensitivity profiles, and the c_k parameters, which are the correlation coefficients for the calculational uncertainties between systems; each set of parameters gives information on the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application).
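
    The c_k parameter described above is commonly computed as a sandwich product of group-wise sensitivity profiles with a cross-section covariance matrix. The sketch below shows that formula with invented four-group data; it is a conceptual illustration, not the SCALE implementation.

```python
import numpy as np

# Illustrative group-wise k_eff sensitivity profiles for an application (a)
# and a benchmark experiment (b), plus a nuclear-data covariance matrix C.
# All numbers are invented for the sketch.
S_a = np.array([0.12, 0.08, 0.05, 0.02])
S_b = np.array([0.10, 0.09, 0.04, 0.03])
C = np.diag([4e-4, 2.5e-4, 1e-4, 9e-5])   # relative covariance, 4 groups

def sandwich(s1, s2, cov):
    """Data-induced covariance between two systems' k_eff values."""
    return s1 @ cov @ s2

# c_k: correlation of the two systems' data-induced k_eff uncertainties.
c_k = sandwich(S_a, S_b, C) / np.sqrt(sandwich(S_a, S_a, C) * sandwich(S_b, S_b, C))
print(f"c_k = {c_k:.3f}")
```

    A c_k near 1 indicates that the benchmark and the application respond to the same nuclear-data uncertainties, which is the sense in which the benchmark is "similar" to the application.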

  3. Analysis of Fresh Fuel Critical Experiments Appropriate for Burnup Credit Validation

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-01-01

    The ANSI/ANS-8.1 standard requires that calculational methods used in determining criticality safety limits for applications outside reactors be validated by comparison with appropriate critical experiments. This report provides a detailed description of 34 fresh fuel critical experiments and their analyses using the SCALE-4.2 code system and the 27-group ENDF/B-IV cross-section library. The 34 critical experiments were selected based on geometry, material, and neutron-interaction characteristics that are applicable to a transportation cask loaded with pressurized-water-reactor spent fuel. These 34 experiments are a representative subset of a much larger database of low-enriched uranium and mixed-oxide critical experiments. A statistical approach is described and used to obtain an estimate of the bias and uncertainty in the calculational methods and to predict a confidence limit for a calculated neutron multiplication factor. The SCALE-4.2 results for a superset of approximately 100 criticals are included in the uncertainty analyses, but descriptions of the individual criticals are not included.
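
    One simple way to estimate a bias and its uncertainty from such a benchmark set is a variance-weighted mean of the calculated-minus-experimental differences, sketched below with invented numbers; the report's actual statistical approach may weight and trend the data differently.

```python
import numpy as np

# Illustrative calculated-minus-benchmark k_eff differences and combined
# (experimental + Monte Carlo) one-sigma uncertainties; values invented.
dk    = np.array([-0.0042, -0.0018, -0.0035, 0.0006, -0.0027])
sigma = np.array([ 0.0030,  0.0025,  0.0040, 0.0020,  0.0035])

w = 1.0 / sigma**2
bias = np.sum(w * dk) / np.sum(w)        # variance-weighted mean bias
bias_unc = np.sqrt(1.0 / np.sum(w))      # uncertainty of the weighted mean

print(f"bias = {bias*1e5:.0f} pcm +/- {bias_unc*1e5:.0f} pcm")
```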

  4. Validation of the Essentials of Magnetism II in Chinese critical care settings.

    Science.gov (United States)

    Bai, Jinbing; Hsu, Lily; Zhang, Qing

    2015-05-01

    To translate and evaluate the psychometric properties of the Essentials of Magnetism II tool (EOM II) for Chinese nurses in critical care settings. The EOM II is a reliable and valid scale for measuring the healthy work environment (HWE) for nurses in Western countries; however, it had not been validated among Chinese nurses. The translation of the EOM II followed internationally recognized guidelines. The Chinese version of the Essentials of Magnetism II tool (C-EOM II) was reviewed by an expert panel for culturally semantic equivalence and content validity. Then, 706 nurses from 28 intensive care units (ICUs) affiliated with 14 tertiary hospitals participated in this study. The reliability of the C-EOM II was assessed using the Cronbach's alpha coefficient; the content validity of the scale was assessed using the content validity index (CVI); and the construct validity was assessed using confirmatory factor analysis (CFA). The C-EOM II showed excellent content validity, with a CVI of 0.92. All the subscales of the C-EOM II were significantly correlated with overall nurse job satisfaction and nurse-assessed quality of care. The CFA showed that the C-EOM II was composed of 45 items with nine factors, accounting for 46.51% of the total variance. Cronbach's alpha coefficients for these factors ranged from 0.56 to 0.89. The C-EOM II is a promising scale to assess the HWE for Chinese ICU nurses. Nursing administrators and health care policy-makers can use the C-EOM II to evaluate the clinical work environment so that a healthier work environment can be created and sustained for staff nurses.
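
    Cronbach's α, reported above for each factor, can be computed directly from an item response matrix. A minimal sketch with simulated Likert data follows; the responses are generated from a toy latent-trait model purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated Likert responses (1-5): 50 nurses x 6 items of one subscale,
# made mildly correlated via a shared latent trait per respondent.
base = rng.normal(3.0, 1.0, size=(50, 1))
noise = rng.normal(0.0, 0.7, size=(50, 6))
items = np.clip(np.rint(base + noise), 1, 5)

k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()      # sum of item variances
total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores

alpha = (k / (k - 1)) * (1 - item_var / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```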

  5. Results from Carbon Dioxide Washout Testing Using a Suited Manikin Test Apparatus with a Space Suit Ventilation Test Loop

    Science.gov (United States)

    Chullen, Cinda; Conger, Bruce; McMillin, Summer; Vonau, Walt; Kanne, Bryan; Korona, Adam; Swickrath, Mike

    2016-01-01

    NASA is developing an advanced portable life support system (PLSS) to meet the needs of a new NASA advanced space suit. The PLSS is one of the most critical aspects of the space suit, providing the necessary oxygen, ventilation, and thermal protection for an astronaut performing a spacewalk. The ventilation subsystem in the PLSS must provide sufficient carbon dioxide (CO2) removal and ensure that the CO2 is washed away from the oronasal region of the astronaut. CO2 washout is the term used to describe the mechanism by which CO2 levels are controlled within the helmet to limit the concentration of CO2 inhaled by the astronaut. Accumulation of CO2 in the helmet or throughout the ventilation loop could cause the suited astronaut to experience hypercapnia (excessive carbon dioxide in the blood). A suited manikin test apparatus (SMTA) integrated with a space suit ventilation test loop was designed, developed, and assembled at NASA in order to experimentally validate adequate CO2 removal throughout the PLSS ventilation subsystem and to quantify CO2 washout performance under various conditions. The test results from this integrated system will be used to validate analytical models and augment human testing. This paper presents the system integration of the PLSS ventilation test loop with the SMTA, including the newly developed regenerative Rapid Cycle Amine component used for CO2 removal and the tidal-breathing capability used to emulate the human subject. The testing and analytical results of the integrated system are presented, along with future work.
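
    As context for the washout mechanism described above, a toy single-volume mass balance is sketched below: a well-mixed helmet volume with a ventilation inflow and a CO2 source reaching the oronasal region. The parameter values are invented, and the real helmet/PLSS geometry and flow field are far more complex than one well-mixed node.

```python
import numpy as np

# Toy well-mixed model of helmet CO2 washout (illustrative parameters only).
V     = 6.0      # helmet free volume, liters
Q     = 100.0    # ventilation flow, liters/min (actual PLSS flow differs)
m_gen = 0.8      # CO2 generation reaching the oronasal region, liters/min
C_in  = 0.0005   # CO2 fraction of the inlet gas

dt, t_end = 0.01, 3.0   # minutes
C = 0.04                # start near an exhaled-gas concentration
for _ in np.arange(0.0, t_end, dt):
    # dC/dt = (Q*(C_in - C) + m_gen) / V  for a single well-mixed volume
    C += dt * (Q * (C_in - C) + m_gen) / V

print(f"Quasi-steady helmet CO2 fraction: {C:.4f}")
print(f"Analytic steady state: {C_in + m_gen / Q:.4f}")
```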

  6. Relative criterion for validity of a semiclassical approach to the dynamics near quantum critical points.

    Science.gov (United States)

    Wang, Qian; Qin, Pinquan; Wang, Wen-ge

    2015-10-01

    Based on an analysis of Feynman's path-integral formulation of the propagator, a relative criterion is proposed for the validity of a semiclassical approach to the dynamics near critical points in a class of systems undergoing quantum phase transitions. It is given by an effective Planck constant, in the relative sense that a smaller effective Planck constant implies better performance of the semiclassical approach. Numerical tests of this relative criterion are given for the XY model and the Dicke model.

  7. Validation of the criticality calculation for fuel elements using the Gamtec 2 - Keno 2 and 4

    International Nuclear Information System (INIS)

    Teixeira, M.C.C.; Andrade, M.C. de

    1990-01-01

    For criticality safety in the fabrication, storage and transportation of fuel assemblies, subcriticality analyses must be performed. The calculations are performed at CDTN with the GAMTEC computer code, which homogenizes the fuel assembly in order to create a 16-group cross-section library, and with the KENO code, which determines the multiplication factor. To validate the calculational method, suitable benchmark experiments were analyzed. The results show that the calculational model overestimates k_eff when k_eff + 2σ is considered. (author)

  8. Experimental methods to validate measures of emotional state and readiness for duty in critical operations

    International Nuclear Information System (INIS)

    Weston, Louise Marie

    2007-01-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences, and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is challenging because of the need to gather predictive data in a relatively short test period and the potential for learning effects due to the requirement for frequent testing. This report reviews the different types of reliability and validity methods, and the testing and statistical-analysis procedures, used to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  9. Content validity of critical success factors for e-Government implementation in Indonesia

    Science.gov (United States)

    Napitupulu, D.; Syafrullah, M.; Rahim, R.; Amar, A.; Sucahyo, YG

    2018-05-01

    The purpose of this research is to validate the critical success factors (CSFs) of e-Government implementation in Indonesia. Many e-Government initiatives are conducted only to comply with regulation, while ignoring quality. Defining CSFs will help government agencies avoid the failure of e-Government projects. A questionnaire survey was used to validate the CSF items, based on expert judgment, through two rounds of Delphi. The results showed that, of the 67 items in the tested instrument, 11 invalid items were deleted, leaving 56 items with good content validity and internal reliability. Therefore, all 56 CSFs should be adopted by government agencies in Indonesia to support e-Government implementation.

  10. Standard problem exercise to validate criticality codes for spent LWR fuel transport container calculations

    International Nuclear Information System (INIS)

    Whitesides, G.H.; Stephens, M.E.

    1984-01-01

    During the past two years, a Working Group established by the Organization for Economic Co-operation and Development's Nuclear Energy Agency (OECD-NEA) has been developing a set of criticality benchmark problems which can be used to help establish the validity of criticality safety computer programs and their associated nuclear data for the calculation of k_eff for spent light water reactor (LWR) fuel transport containers. The basic goal of this effort was to identify a set of actual critical experiments containing the various material and geometric properties present in spent LWR transport containers. These data, when used by the various computational methods, are intended to demonstrate the ability of each method to accurately reproduce the experimentally measured k_eff for the parameters under consideration.

  11. Validation of the Monte Carlo Criticality Program KENO V.a for highly enriched uranium systems

    Energy Technology Data Exchange (ETDEWEB)

    Knight, J.R.

    1984-11-01

    A series of calculations based on critical experiments have been performed using the KENO V.a Monte Carlo criticality program for the purpose of validating KENO V.a for use in evaluating Y-12 Plant criticality problems. The experiments were reflected and unreflected systems of single units and arrays containing highly enriched uranium metal or uranium compounds. Various geometrical shapes were used in the experiments. The SCALE control module CSAS25 with the 27-group ENDF/B-IV cross-section library was used to perform the calculations. Some of the experiments were also calculated using the 16-group Hansen-Roach library. Results are presented in a series of tables and discussed. The results show that the criteria established for the safe application of the KENO IV program may also be used for KENO V.a results.

  12. Proceedings of the workshop on integral experiment covariance data for criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik (ed.)

    2016-04-15

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, the quality and modeling issues of the freely available experimental data are a particular focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of their experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.

  14. Development process and initial validation of the Ethical Conflict in Nursing Questionnaire-Critical Care Version.

    Science.gov (United States)

    Falcó-Pegueroles, Anna; Lluch-Canut, Teresa; Guàrdia-Olmos, Joan

    2013-06-01

    Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables 'frequency' and 'degree of conflict'. In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable 'exposure to conflict', as well as considering six 'types of ethical conflict'. An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validation, construct validity was assessed by means of exploratory factor analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model
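
    The IEEC combines the intensity and frequency of each scenario into one exposure value per respondent. One plausible minimal reading of such an index is sketched below with simulated responses; the actual weighting and scaling are defined in the source paper, not here.

```python
import numpy as np

# Simulated responses to 19 conflict scenarios: how often each occurs (0-4)
# and how intense the conflict feels (0-4). Purely illustrative.
rng = np.random.default_rng(2)
frequency = rng.integers(0, 5, size=19)
intensity = rng.integers(0, 5, size=19)

ieec = np.sum(frequency * intensity)     # combined exposure score
ieec_norm = ieec / (19 * 4 * 4)          # rescaled to [0, 1] for comparability
print(f"IEEC = {ieec} (normalized {ieec_norm:.2f})")
```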

  15. Measuring functional, interactive and critical health literacy of Chinese secondary school students: reliable, valid and feasible?

    Science.gov (United States)

    Guo, Shuaijun; Davis, Elise; Yu, Xiaoming; Naccarella, Lucio; Armstrong, Rebecca; Abel, Thomas; Browne, Geoffrey; Shi, Yanqin

    2018-04-01

    Health literacy is an increasingly important topic in the global context. In mainland China, health literacy measures mainly focus on health knowledge and practices, or on the functional domain for adolescents; little is known about the interactive and critical domains. This study aimed to adopt a skills-based, three-domain (functional, interactive and critical) instrument to measure health literacy in Chinese adolescents and to examine the status and determinants of each domain. Using a systematic review, the eight-item Health Literacy Assessment Tool (HLAT-8) was selected and translated from English to Chinese (c-HLAT-8). Following the translation process, a cross-sectional study was conducted in four secondary schools in Beijing, China. A total of 650 students in Years 7-9 were recruited to complete a self-administered questionnaire that assessed socio-demographics, self-efficacy, social support, school environment, community environment and health literacy. Results showed that the c-HLAT-8 had satisfactory reliability (Cronbach's α = 0.79; intra-class correlation coefficient = 0.72) and strong validity (translation validity index (TVI) ≥ 0.95; χ²/df = 3.388, p < 0.001). Students had an average score of 26.37 (±5.89) on the c-HLAT-8. When the determinants of each domain of health literacy were examined, social support was the strongest predictor of interactive and critical health literacy. By contrast, self-efficacy and school environment played more dominant roles in predicting functional health literacy. The c-HLAT-8 was demonstrated to be a reliable, valid and feasible instrument for measuring functional, interactive and critical health literacy among Chinese students. The current findings indicate that increasing self-efficacy, social support and creating supportive environments are important for promoting health literacy in secondary school settings in China.

  16. Comparisons of the MCNP criticality benchmark suite with ENDF/B-VI.8, JENDL-3.3, and JEFF-3.0

    International Nuclear Information System (INIS)

    Kim, Do Heon; Gil, Choong-Sup; Kim, Jung-Do; Chang, Jonghwa

    2003-01-01

    A comparative study has been performed with the latest evaluated nuclear data libraries ENDF/B-VI.8, JENDL-3.3, and JEFF-3.0. The study has been conducted through benchmark calculations for 91 criticality problems with the libraries processed for MCNP4C. The calculation results have been compared with those of the ENDF60 library. The self-shielding effects of the unresolved-resonance (UR) probability tables have also been estimated for each library. The χ² differences between the MCNP results and the experimental data were calculated for each library. (author)
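
    A χ² figure of merit of the kind mentioned above can be formed per library as the mean squared, uncertainty-normalized deviation from the benchmark values. The sketch below shows that computation with invented k_eff results; the library names are taken from the record, but none of the numbers are.

```python
import numpy as np

# Illustrative benchmark k_eff values, combined one-sigma uncertainties,
# and calculated results per library; all numbers invented.
k_exp = np.array([1.0000, 1.0000, 0.9985, 1.0010])
sigma = np.array([0.0030, 0.0025, 0.0028, 0.0032])
k_calc = {
    "ENDF/B-VI.8": np.array([0.9992, 1.0018, 0.9978, 1.0021]),
    "JENDL-3.3":   np.array([1.0005, 1.0031, 0.9969, 1.0035]),
    "JEFF-3.0":    np.array([0.9988, 1.0010, 0.9981, 1.0018]),
}

for lib, k in k_calc.items():
    chi2 = np.sum(((k - k_exp) / sigma) ** 2) / len(k_exp)
    print(f"{lib}: chi2 per benchmark = {chi2:.2f}")
```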

  17. Experimental validation of TASS/SMR-S critical flow model for the integral reactor SMART

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Si Won; Ra, In Sik; Kim, Kun Yeup [ACT Co., Daejeon (Korea, Republic of)]; Chung, Young Jong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2011-05-15

    An advanced integral PWR, SMART (System-Integrated Modular Advanced ReacTor), is being developed at KAERI. It has a compact size and a relatively small power rating (330 MWt) compared to a conventional reactor. Because new concepts are applied to SMART, experimental and analytical validation is necessary for its safety evaluation. The analytical safety validation is being accomplished with TASS/SMR-S, a safety analysis code for integral reactors developed by KAERI. TASS/SMR-S uses lumped-parameter, one-dimensional node-and-path modeling for the thermal-hydraulic calculation and point kinetics for the reactor power calculation. It has general-purpose models, such as a core heat-transfer model, a wall heat-structure model, a critical flow model, and component models, and it also has many SMART-specific models, such as a once-through helically coiled steam generator model and a condensate heat-transfer model. To ensure that the TASS/SMR-S code has the calculational capability needed for the safety evaluation of SMART, its specific models should be validated against separate-effect test results. In this study, the TASS/SMR-S critical flow model is evaluated by comparison with the SMD (Super Moby Dick) experiment.

  18. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks, this is a tutorial guide that walks you through how to use the features of Spring Tool Suite, with well-defined sections for the different parts of Spring. Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  19. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments, and how to account for them properly in validation procedures, has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP). Most of the experiments were performed as series and share parts of their experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems inevitable if the experimental data in a validation procedure are limited, or if one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data, as well as their consideration in a validation procedure, is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to the interpretation of evaluated experimental data and its translation into calculation models. The work shows the effects of various modeling approaches and varying distribution functions of parameters, and compares and discusses results from the applied Monte Carlo sampling method against available data on correlations. Our findings indicate that, for the reliable determination of integral experimental covariance matrices or correlation coefficients, a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application.
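
    The Monte Carlo sampling approach to integral experiment correlations can be illustrated with a toy model: two benchmarks from one series share a common systematic perturbation (say, a shared pellet-density tolerance) and each has an independent random tolerance, and sampling recovers the resulting correlation of their k_eff values. The response weights below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Standardized perturbations: one shared systematic, two independent ones.
shared_density = rng.normal(0.0, 1.0, n)
pitch_1 = rng.normal(0.0, 1.0, n)
pitch_2 = rng.normal(0.0, 1.0, n)

# Toy linear response of each benchmark's k_eff to the perturbations
# (weights in dk/k per standard deviation, made up for the sketch).
k1 = 1.0 + 300e-5 * shared_density + 200e-5 * pitch_1
k2 = 1.0 + 300e-5 * shared_density + 200e-5 * pitch_2

corr = np.corrcoef(k1, k2)[0, 1]
print(f"Sampled benchmark correlation: {corr:.2f}")  # ~0.69 analytically here
```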

  1. Trustworthy Variant Derivation with Translation Validation for Safety Critical Product Lines

    DEFF Research Database (Denmark)

    Iosif-Lazăr, Alexandru Florin; Wasowski, Andrzej

    2016-01-01

    Software product line (SPL) engineering facilitates the development of entire families of software products with systematic reuse. Model-driven SPLs use models in the design and development process. In the safety-critical domain, validation of models and testing of code increase the quality of the products altogether. However, to maintain this trustworthiness it is necessary to know that the SPL tools, which manipulate models and code to derive concrete product variants, do not introduce errors in the process. We propose a general technique for checking the correctness of product derivation tools through translation validation. We demonstrate it using Featherweight VML, a core language for separate variability modeling relying on a single kind of variation point to define transformations of artifacts seen as object models. We use Featherweight VML with its semantics as a correctness specification
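
    As a minimal illustration of the translation-validation idea described above, the sketch below pairs a toy derivation "tool" with an independent checker that re-verifies each concrete derivation against the variability model. The feature model and all names are invented; Featherweight VML itself is not reproduced here.

```python
# Toy variability model: element name -> presence condition over features.
# In practice the checker should be simpler and independently trusted,
# so that it can vouch for the (more complex) derivation tool.
model = {
    "Airbag":        {"safety"},
    "CruiseControl": {"comfort"},
    "ABS":           {"safety", "brakes"},
}

def derive(model, features):
    """The derivation 'tool' under scrutiny: keep satisfied elements."""
    return {e for e, cond in model.items() if cond <= features}

def validate_derivation(model, features, variant):
    """Translation validation: check one concrete derivation after the fact."""
    expected = {e for e, cond in model.items() if cond <= features}
    return variant == expected

features = {"safety", "brakes"}
variant = derive(model, features)
assert validate_derivation(model, features, variant), "derivation tool erred"
print(sorted(variant))  # ['ABS', 'Airbag']
```

    The point of the pattern is that each derived variant is checked individually, so the tool itself never has to be verified once and for all.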

  2. Validation of new 240Pu cross section and covariance data via criticality calculation

    International Nuclear Information System (INIS)

    Kim, Do Heon; Gil, Choong-Sup; Kim, Hyeong Il; Lee, Young-Ouk; Leal, Luiz C.; Dunn, Michael E.

    2011-01-01

    Recent collaboration between KAERI and ORNL has completed an evaluation of the 240Pu neutron cross sections with covariance data. The new 240Pu cross-section data have been validated through 28 criticality safety benchmark problems, taken from the ICSBEP and/or CSEWG specifications, with MCNP calculations. The calculation results based on the new evaluation have been compared with those based on recent evaluations such as ENDF/B-VII.0, JEFF-3.1.1, and JENDL-4.0. In addition, the new 240Pu covariance data have been tested for some criticality benchmarks via a DANTSYS/SUSD3D-based nuclear data sensitivity and uncertainty analysis of k_eff. The k_eff uncertainty estimates obtained with the new covariance data have been compared with those based on JENDL-4.0, JENDL-3.3, and Low-Fidelity covariance data. (author)
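
    The sensitivity/uncertainty analysis of k_eff mentioned above rests, to first order, on the "sandwich" rule: the data-induced k_eff variance is the sensitivity vector folded twice with the covariance matrix. A sketch with an invented sensitivity vector and covariance matrix follows; it shows the formula, not the SUSD3D implementation.

```python
import numpy as np

# Invented relative k_eff sensitivities to four reaction/energy groups
# and a matching relative covariance matrix.
S = np.array([0.020, 0.015, 0.008, 0.004])
C = np.array([[2.5e-3, 1.0e-3, 0.0,    0.0   ],
              [1.0e-3, 1.8e-3, 4.0e-4, 0.0   ],
              [0.0,    4.0e-4, 1.2e-3, 2.0e-4],
              [0.0,    0.0,    2.0e-4, 9.0e-4]])

var_k = S @ C @ S   # sandwich rule: (sigma_k / k)^2 = S^T C S
print(f"Data-induced k_eff uncertainty: {np.sqrt(var_k)*100:.3f} %")
```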

  3. Validation of an e-Learning 3.0 Critical Success Factors Framework: A Qualitative Research

    OpenAIRE

    Paula Miranda; Pedro Isaias; Carlos J Costa; Sara Pifano

    2017-01-01

    Aim/Purpose: As e-Learning 3.0 evolves from a theoretical construct into an actual solution for online learning, it becomes crucial to accompany this progress by scrutinising the elements that are at the origin of its success. Background: This paper outlines a framework of e-Learning 3.0’s critical success factors and its empirical validation. Methodology: The framework is the result of an extensive literature review and its empirical substantiation derives from semi-structured inte...

  4. Validation of the EIR LWR calculation methods for criticality assessment of storage pools

    International Nuclear Information System (INIS)

    Grimm, P.; Paratte, J.M.

    1986-11-01

    The EIR code system for the calculation of light water reactors is presented and the methods used are briefly described. The application of the system to various types of critical experiments and benchmark problems demonstrates its good accuracy, even for heterogeneous configurations containing strong neutron absorbers such as Boral. Since the multiplication factor k_eff is normally somewhat overpredicted and the spread of the results is small, this code system is validated for the calculation of storage pools, taking into account a safety margin of 1.5% on k_eff. (author)

  5. Development and validation of a criticality calculation scheme based on French deterministic transport codes

    International Nuclear Information System (INIS)

    Santamarina, A.

    1991-01-01

    A criticality-safety calculational scheme using the automated deterministic code system APOLLO-BISTRO has been developed. The cell/assembly code APOLLO is used mainly in LWR and HCR design calculations, and its validation spans a wide range of moderation ratios, including voided configurations. Its recent 99-group library and self-shielded cross-sections have been extensively qualified through critical experiments and PWR spent fuel analysis. The PIC self-shielding formalism enables a rigorous treatment of the fuel double heterogeneity in dissolver-medium calculations. BISTRO is an optimized multidimensional SN code, part of the modular CCRR package used mainly in FBR calculations. The APOLLO-BISTRO scheme was applied to the 18 experimental benchmarks selected by the OECD/NEACRP Criticality Calculation Working Group. The calculation-experiment discrepancy was within ±1% in Δk/k and was always consistent with the experimental uncertainty margin. In the critical experiments corresponding to a dissolver-type benchmark, our tools computed a satisfactory k_eff. In the VALDUC fuel storage experiments with hafnium plates, the computed k_eff ranged between 0.994 and 1.003 for the various water gaps spacing the fuel clusters from the absorber plates. The APOLLO-KENOEUR statistical calculational scheme, based on the same self-shielded multigroup library, supplied consistent results within 0.3% in Δk/k. (Author)

  6. Projecting changes in the distribution and productivity of living marine resources: A critical review of the suite of modelling approaches used in the large European project VECTORS

    Science.gov (United States)

    Peck, Myron A.; Arvanitidis, Christos; Butenschön, Momme; Canu, Donata Melaku; Chatzinikolaou, Eva; Cucco, Andrea; Domenici, Paolo; Fernandes, Jose A.; Gasche, Loic; Huebert, Klaus B.; Hufnagl, Marc; Jones, Miranda C.; Kempf, Alexander; Keyl, Friedemann; Maar, Marie; Mahévas, Stéphanie; Marchal, Paul; Nicolas, Delphine; Pinnegar, John K.; Rivot, Etienne; Rochette, Sébastien; Sell, Anne F.; Sinerchia, Matteo; Solidoro, Cosimo; Somerfield, Paul J.; Teal, Lorna R.; Travers-Trolet, Morgan; van de Wolfshaar, Karen E.

    2018-02-01

    We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the

  7. MCNP5 CRITICALITY VALIDATION AND BIAS FOR INTERMEDIATE ENRICHED URANIUM SYSTEMS

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    2009-01-01

    The purpose of this analysis is to validate the Monte Carlo N-Particle 5 (MCNP5) code Version 1.40 (LA-UR-03-1987, 2005) and its cross-section database for KCODE calculations of intermediate enriched uranium systems on INTEL® processor based PCs running any version of the WINDOWS operating system. Configurations with intermediate enriched uranium were modeled over the moderation range 39 ≤ H/Fissile ≤ 1438. See Table 2-1 for brief descriptions of selected cases and Table 3-1 for the range of applicability of this validation. A total of 167 input cases were evaluated, including bare and reflected systems in a single body or in arrays. The 167 cases were taken directly from the previous (Version 4C [Lan 2005]) validation database. Section 2.0 lists the data used to calculate k-effective (keff) for the 167 experimental criticality benchmark cases using the MCNP5 code v1.40 and its cross-section database. Appendix B lists the MCNP cross-section database entries validated for use in evaluating intermediate enriched uranium systems for criticality safety. The dimensions and atom densities for the intermediate enriched uranium experiments were taken from NEA/NSC/DOC(95)03, September 2005, which is referred to as the benchmark handbook throughout the report. For these input values, the experimental benchmark keff is approximately 1.0. The MCNP validation computer runs achieved an accuracy of approximately ±0.001. For the cases where the reported benchmark keff was not equal to 1.0000, the MCNP calculational results were normalized. The difference between the MCNP validation computer runs and the experimentally measured keff is the MCNP5 v1.40 bias. The USLSTATS code (ORNL 1998) was utilized to perform the statistical analysis and generate an acceptable maximum keff limit for calculations of intermediate enriched uranium type systems.
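
    The normalization and bias arithmetic described above can be illustrated with a small sketch; the numbers below are invented, not taken from the 167-case database:

    ```python
    # Hypothetical single case: MCNP5 result, its Monte Carlo uncertainty,
    # and the handbook benchmark keff.
    calc_keff, calc_sigma = 0.9982, 0.001
    bench_keff = 0.9997

    # When the reported benchmark keff is not 1.0000, normalize the
    # calculated result to it before computing the bias.
    norm_keff = calc_keff / bench_keff

    # Bias for this case: normalized calculation minus the expected 1.0.
    bias = norm_keff - 1.0
    print(f"normalized keff = {norm_keff:.4f}, bias = {bias:+.4f}")
    ```

    In the report, such per-case differences feed the USLSTATS statistical analysis that sets the acceptable maximum keff limit.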

  8. Pharmacy settles suit.

    Science.gov (United States)

    1998-10-02

    A suit was filed by an HIV-positive man against a pharmacy that inadvertently disclosed his HIV status to his ex-wife and children. His ex-wife tried to use the information in a custody battle for their two children. The suit against the pharmacy was settled, but the terms of the settlement remain confidential.

  9. Towards a validation of a cellular biomarker suite in native and transplanted zebra mussels: A 2-year integrative field study of seasonal and pollution-induced variations

    Energy Technology Data Exchange (ETDEWEB)

    Guerlet, Edwige [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France); Ledy, Karine [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France); Meyer, Antoinette [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France); Giamberini, Laure [Laboratoire Ecotoxicite, Sante Environnementale, CNRS UMR 7146, Universite Paul Verlaine-Metz, Rue General Delestraint, F-57070 Metz (France)]. E-mail: giamb@univ-metz.fr

    2007-03-30

    Two of the questions raised in the validation process of biomarkers are their relevance in the identification and discrimination of environmental perturbations, and the influence of seasonal factors on these biological endpoints. Determining the advantages and restrictions associated with the use of native or transplanted animals and comparing their responses is also needed. To obtain this information, a 2-year integrative field study was conducted in the vicinity of a nuclear power plant in northeastern France. A station was located in the reservoir receiving the cooling waters of the plant, and two other sites were studied 2 km upstream and 5 km downstream from the reservoir's discharge into the Moselle river. Elevated temperatures, copper contamination and a 1.4-fold concentration factor of dissolved salts affected the water quality of the reservoir. Native and transplanted zebra mussels (Dreissena polymorpha) were collected monthly and their digestive glands were processed for histochemical determinations of the lysosomal and peroxisomal systems and of the lipofuscin and neutral lipid contents. The responses were quantified using automated image analysis and stereology. Apart from neutral lipid contents, there were no systematic seasonal patterns in mussel populations or from one year to the next. Principal Component Analyses showed a generally higher discrimination potential of biological responses in transplanted organisms compared to native ones. They also pointed out the relationships between the cellular and physiological markers and abiotic factors. The present multiple biomarker integrative approach in transplanted D. polymorpha brings promising elements to their validation process as relevant biomonitoring tools.

  10. Towards a validation of a cellular biomarker suite in native and transplanted zebra mussels: A 2-year integrative field study of seasonal and pollution-induced variations

    International Nuclear Information System (INIS)

    Guerlet, Edwige; Ledy, Karine; Meyer, Antoinette; Giamberini, Laure

    2007-01-01

    Two of the questions raised in the validation process of biomarkers are their relevance in the identification and discrimination of environmental perturbations, and the influence of seasonal factors on these biological endpoints. Determining the advantages and restrictions associated with the use of native or transplanted animals and comparing their responses is also needed. To obtain this information, a 2-year integrative field study was conducted in the vicinity of a nuclear power plant in northeastern France. A station was located in the reservoir receiving the cooling waters of the plant, and two other sites were studied 2 km upstream and 5 km downstream from the reservoir's discharge into the Moselle river. Elevated temperatures, copper contamination and a 1.4-fold concentration factor of dissolved salts affected the water quality of the reservoir. Native and transplanted zebra mussels (Dreissena polymorpha) were collected monthly and their digestive glands were processed for histochemical determinations of the lysosomal and peroxisomal systems and of the lipofuscin and neutral lipid contents. The responses were quantified using automated image analysis and stereology. Apart from neutral lipid contents, there were no systematic seasonal patterns in mussel populations or from one year to the next. Principal Component Analyses showed a generally higher discrimination potential of biological responses in transplanted organisms compared to native ones. They also pointed out the relationships between the cellular and physiological markers and abiotic factors. The present multiple biomarker integrative approach in transplanted D. polymorpha brings promising elements to their validation process as relevant biomonitoring tools.

  11. A critical evaluation of the validity of episodic future thinking: A clinical neuropsychology perspective.

    Science.gov (United States)

    Ward, Amanda M

    2016-11-01

    Episodic future thinking is defined as the ability to mentally simulate a future event. Although episodic future thinking has been studied extensively in neuroscience, this construct has not been explored in depth from the perspective of clinical neuropsychology. The aim of this critical narrative review is to assess the validity and clinical implications of episodic future thinking. A systematic review of episodic future thinking literature was conducted. PubMed and PsycInfo were searched through July 2015 for review and empirical articles with the following search terms: "episodic future thinking," "future mental simulation," "imagining the future," "imagining new experiences," "future mental time travel," "future autobiographical experience," and "prospection." The review discusses evidence that episodic future thinking is important for adaptive functioning, which has implications for neurological populations. To determine the validity of episodic future thinking, the construct is evaluated with respect to related constructs, such as imagination, episodic memory, autobiographical memory, prospective memory, narrative construction, and working memory. Although it has been minimally investigated, there is evidence of convergent and discriminant validity for episodic future thinking. Research has not addressed the incremental validity of episodic future thinking. Practical considerations of episodic future thinking tasks and related constructs in a clinical neuropsychological setting are considered. The utility of episodic future thinking is currently unknown due to the lack of research investigating the validity of episodic future thinking. Future work is discussed, which could determine whether episodic future thinking is an important missing piece in standard clinical neuropsychological assessment. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Suited Contingency Ops Food - 2

    Science.gov (United States)

    Glass, J. W.; Leong, M. L.; Douglas, G. L.

    2014-01-01

    The contingency scenario for an emergency cabin depressurization event may require crewmembers to subsist in a pressurized suit for up to 144 hours. This scenario requires the capability for safe nutrition delivery through a helmet feed port against a 4 psi pressure differential to enable crewmembers to maintain strength and cognition to perform critical tasks. Two nutritional delivery prototypes were developed and analyzed for compatibility with the helmet feed port interface and for operational effectiveness against the pressure differential. The bag-in-bag (BiB) prototype, designed to equalize the suit pressure with the beverage pouch and enable a crewmember to drink normally, delivered water successfully to three different subjects in suits pressurized to 4 psi. The Boa restrainer pouch, designed to provide mechanical leverage to overcome the pressure differential, did not operate sufficiently well. Guidelines were developed and compiled for contingency beverages that provide macro-nutritional requirements, a minimum one-year shelf life, and compatibility with the delivery hardware. Evaluation results and food product parameters have the potential to be used to improve future prototype designs and develop complete nutritional beverages for contingency events. These feeding capabilities would have additional use on extended surface mission EVAs, where the current in-suit drinking device may be insufficient.

  13. Design and validation of a questionnaire to evaluate the usability of computerized critical care information systems.

    Science.gov (United States)

    von Dincklage, Falk; Lichtner, Gregor; Suchodolski, Klaudiusz; Ragaller, Maximilian; Friesdorf, Wolfgang; Podtschaske, Beatrice

    2017-08-01

    The implementation of computerized critical care information systems (CCIS) can improve the quality of clinical care and staff satisfaction, but also holds risks of disrupting the workflow, with consequent negative impacts. The usability of CCIS is one of the key factors determining their benefits and weaknesses. However, no tailored instrument exists to measure the usability of such systems. Therefore, the aim of this study was to design and validate a questionnaire that measures the usability of CCIS. Following a mixed-method design approach, we developed a questionnaire comprising two evaluation models to assess the usability of CCIS: (1) the task-specific model rates the usability individually for several tasks which CCIS could support and which we derived by analyzing work processes in the ICU; (2) the characteristic-specific model rates the different aspects of usability, as defined by the international standard "ergonomics of human-system interaction". We tested the validity and reliability of the digital version of the questionnaire in a sample population. In the sample population of 535 participants, both usability evaluation models showed a strong correlation with the overall rating of the system (multiple correlation coefficients ≥0.80) as well as a very high internal consistency (Cronbach's alpha ≥0.93). The novel questionnaire is a valid and reliable instrument to measure the usability of CCIS and can be used to study the influence of usability on their implementation benefits and weaknesses.

  14. Validity Evidence for a Serious Game to Assess Performance on Critical Pediatric Emergency Medicine Scenarios.

    Science.gov (United States)

    Gerard, James M; Scalzo, Anthony J; Borgman, Matthew A; Watson, Christopher M; Byrnes, Chelsie E; Chang, Todd P; Auerbach, Marc; Kessler, David O; Feldman, Brian L; Payne, Brian S; Nibras, Sohail; Chokshi, Riti K; Lopreiato, Joseph O

    2018-01-26

    We developed a first-person serious game, PediatricSim, to teach and assess performances on seven critical pediatric scenarios (anaphylaxis, bronchiolitis, diabetic ketoacidosis, respiratory failure, seizure, septic shock, and supraventricular tachycardia). In the game, players are placed in the role of a code leader and direct patient management by selecting from various assessment and treatment options. The objective of this study was to obtain supportive validity evidence for the PediatricSim game scores. Game content was developed by 11 subject matter experts and followed the American Heart Association's 2011 Pediatric Advanced Life Support Provider Manual and other authoritative references. Sixty subjects with three different levels of experience were enrolled to play the game. Before game play, subjects completed a 40-item written pretest of knowledge. Game scores were compared between subject groups using scoring rubrics developed for the scenarios. Validity evidence was established and interpreted according to Messick's framework. Content validity was supported by a game development process that involved expert experience, focused literature review, and pilot testing. Subjects rated the game favorably for engagement, realism, and educational value. Interrater agreement on game scoring was excellent (intraclass correlation coefficient = 0.91, 95% confidence interval = 0.89-0.9). Game scores were higher for attendings, followed by residents, then medical students, and correlated with written test scores (r = 0.84), supporting the use of game scores to assess knowledge of pediatric emergency medicine resuscitation.

  15. Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1

    Directory of Open Access Journals (Sweden)

    Thomas Zahel

    2017-10-01

    Full Text Available Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters which show a large uncertainty, and might therefore push product quality beyond a limit critical to the product, may be missed. This might occur during the evaluation of experiments, since residual/un-modelled variance in the experiments is larger than expected a priori. Estimation of such a risk is the task of the presented novel retrospective power analysis permutation test. This is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance and increasing patient safety.
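
    As a rough illustration of the permutation-test idea invoked above (a minimal sketch, not the paper's retrospective power analysis), group labels can be shuffled to test a single parameter effect; the data are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical DoE responses: product quality at low/high settings of
    # one process parameter (invented data, not from the study).
    low = np.array([98.1, 97.6, 98.4, 97.9])
    high = np.array([98.9, 99.2, 98.6, 99.0])
    observed = high.mean() - low.mean()

    # Permutation test: shuffle group labels to build the null
    # distribution of the effect, then compare the observed effect to it.
    pooled = np.concatenate([low, high])
    n_perm, hits = 10000, 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        effect = perm[len(low):].mean() - perm[:len(low)].mean()
        if abs(effect) >= abs(observed):
            hits += 1
    print(f"effect = {observed:.3f}, p = {hits / n_perm:.4f}")
    ```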

  16. Validation of multigroup neutron cross sections for the Advanced Neutron Source against the FOEHN critical experimental measurements

    International Nuclear Information System (INIS)

    Smith, L.A.; Gehin, J.C.; Worley, B.A.; Renier, J.P.

    1994-01-01

    The FOEHN critical experiments were analyzed to validate the use of multigroup cross sections in the design of the Advanced Neutron Source. Eleven critical configurations were evaluated using the KENO, DORT, and VENTURE neutronics codes. Eigenvalue and power density profiles were computed and show very good agreement with measured values.

  17. [Critical reading of articles about diagnostic tests (part I): Are the results of the study valid?].

    Science.gov (United States)

    Arana, E

    2015-01-01

    In the era of evidence-based medicine, one of the most important skills a radiologist should have is the ability to analyze the diagnostic literature critically. This tutorial aims to present guidelines for determining whether primary diagnostic articles are valid for clinical practice. The following elements should be evaluated: whether the study can be applied to clinical practice, whether the technique was compared to the reference test, whether an appropriate spectrum of patients was included, whether expectation bias and verification bias were limited, the reproducibility of the study, the practical consequences of the study, the confidence intervals for the parameters analyzed, the normal range for continuous variables, and the placement of the test in the context of other diagnostic tests. We use elementary practical examples to illustrate how to select and interpret the literature on diagnostic imaging and specific references to provide more details. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  18. A Critical Analysis and Validation of the Accuracy of Wave Overtopping Prediction Formulae for OWECs

    Directory of Open Access Journals (Sweden)

    David Gallach-Sánchez

    2018-01-01

    Full Text Available The development of wave energy devices has grown in recent years. One type of device is the overtopping wave energy converter (OWEC), for which knowledge of the wave overtopping rates is a basic and crucial aspect of the design. In particular, the most interesting range to study is OWECs with steep slopes up to vertical walls, and with very small or zero freeboards where the overtopping rate is maximized; these can be generalized as steep low-crested structures. Recently, wave overtopping prediction formulae have been published for this type of structure, although their accuracy has not been fully assessed, as the overtopping data available in this range are scarce. We performed a critical analysis of the overtopping prediction formulae for steep low-crested structures and a validation of the accuracy of these formulae, based on new overtopping data for steep low-crested structures obtained at Ghent University. This paper summarizes the existing knowledge about average wave overtopping, describes the physical model tests performed, analyses the results and compares them to existing prediction formulae. The new dataset extends the wave overtopping data towards vertical walls and zero freeboard structures. In general, the new dataset validated the more recent overtopping formulae focused on steep slopes with small freeboards, although the formulae underpredict the average overtopping rates for very small and zero relative crest freeboards.

  19. Validation of nuclear criticality safety software and 27 energy group ENDF/B-IV cross sections. Revision 1

    International Nuclear Information System (INIS)

    Lee, B.L. Jr.; D'Aquila, D.M.

    1996-01-01

    The original validation report, POEF-T-3636, was issued in August 1994. That document was based on calculations executed from June through August 1992; the statistical analyses in Appendix C and Appendix D were completed in October 1993. This revision is written to clarify the margin of safety being used at Portsmouth for nuclear criticality safety calculations. This validation gives Portsmouth NCS personnel a basis for performing computerized KENO V.a calculations using the Lockheed Martin Nuclear Criticality Safety Software. The first portion of the document outlines basic information regarding the validation of NCSS using ENDF/B-IV 27-group cross sections on the IBM 3090 at ORNL. A basic discussion of the NCSS system is provided, along with some discussion of the validation database and of validation in general. Then follows a detailed description of the statistical analysis which was applied. The results of this validation indicate that the NCSS software may be used with confidence for criticality calculations at the Portsmouth Gaseous Diffusion Plant. For calculations of Portsmouth systems using the specified codes and systems covered by this validation, a maximum keff, including 2σ, of 0.9605 or lower shall be considered subcritical to ensure a calculational margin of safety of 0.02. The validation of NCSS on the IBM 3090 at ORNL was extended to include NCSS on the IBM 3090 at K-25.
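
    The acceptance criterion quoted above reduces to a one-line check; the keff and σ values below are hypothetical KENO V.a outputs, not cases from the report:

    ```python
    USL = 0.9605  # maximum acceptable keff + 2*sigma per the report

    def is_acceptably_subcritical(keff: float, sigma: float) -> bool:
        """Apply the report's criterion: keff + 2*sigma must not exceed the USL."""
        return keff + 2.0 * sigma <= USL

    print(is_acceptably_subcritical(0.9520, 0.0012))  # True
    print(is_acceptably_subcritical(0.9600, 0.0020))  # False
    ```

    Per the report, keeping keff + 2σ at or below 0.9605 already folds in the validation bias and preserves the 0.02 calculational margin of safety.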

  20. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  1. Standard problem exercise to validate criticality codes for large arrays of packages of fissile materials

    International Nuclear Information System (INIS)

    Whitesides, G.E.; Stephens, M.E.

    1986-01-01

    A study has been conducted by an Organisation for Economic Co-operation and Development - Committee on the Safety of Nuclear Installations (OECD-CSNI) Working Group that examined computational methods used to compute keff for large (≥5³) arrays of fissile material (in which each unit is a substantial fraction of a critical mass). Five fissile materials that might typically be transported were used in the study. The ''packages'' used for this exercise were simplified to allow studies unperturbed by the variety of structural materials which would exist in an actual package. The only material present other than the fissile material was the moderator (water) surrounding the fissile material, which was varied. Consistent results were obtained from calculations using several computational methods. That is, when the bias demonstrated by each method for actual critical experiments was used to ''correct'' the results obtained for systems for which there were no experimental data, there was good agreement between the methods. Two major areas of concern were raised by this exercise. First, the lack of experimental data for arrays larger than 5³ limits validation for large systems. Second, there is a distinct possibility that the commingling of two shipments of unlike units could result in a reduction of the safety margins. Additional experiments and calculations will be required to satisfactorily resolve the remaining questions regarding the safe transport of large arrays of fissile materials.
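
    The bias-correction step described in this record can be sketched as follows; all keff values are invented for illustration:

    ```python
    # A method's bias on measured critical experiments is used to adjust
    # its prediction for a configuration with no experimental data.
    k_experiments = [1.0000, 0.9995, 1.0008]   # benchmark keff values
    k_calculated  = [0.9968, 0.9972, 0.9981]   # same cases, computed

    bias = sum(c - e for c, e in zip(k_calculated, k_experiments)) / len(k_experiments)

    k_array_calc = 0.9312                      # large array, no experiment
    k_array_corrected = k_array_calc - bias    # remove the method's bias
    print(f"bias = {bias:+.4f}, corrected keff = {k_array_corrected:.4f}")
    ```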

  2. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  3. Validating MCNP5 libraries and tracking the reason for differences between libraries in criticality calculations

    International Nuclear Information System (INIS)

    Hossny, K.

    2015-01-01

    The purpose of this work is to validate MCNP5 libraries by simulating 4 detailed benchmark experiments, comparing the MCNP5 results for each library with the experimental results and with codes previously validated against the same experiments: MORET 4.A coupled with APOLLO2 (France) and MONK8 (UK). The reasons for differences between libraries are also investigated in this work. Investigating the reason for the differences between libraries is done by specifying a different library for a specific part (clad, fuel, light water) and checking the deviation of the result from the previously calculated result (with all parts using the same library). The investigated benchmark experiments are of single fuel rod arrays that are water-moderated and water-reflected. Rods containing low-enriched (4.738 wt.% 235U) uranium dioxide (UO2) fuel were clad with aluminum alloy AGS. These experiments were subcritical approaches extrapolated to critical, with the multiplication factor reached being very close to 1.000 (within 0.1%); the subcritical approach parameter was the water level. The four cases studied differ from each other in pitch, number of fuel rods and, of course, critical height of water. The results show that although the ENDF/B-IV library lacks a light-water treatment card, its results can be reliable, as the light-water treatment does not differ significantly from one library to another, so it is not necessary to specify a light-water treatment card. The main reason for differences between ENDF/B-V and ENDF/B-VI is the light-water material, especially the hydrogen element. Specifying the uranium library is necessary when using ENDF/B-IV; on the other hand, it is not necessary to specify the library of the cladding material, whichever library is used. The validated libraries are ENDF/B-IV, ENDF/B-V and ENDF/B-VI, with MCNP cross-section suffixes 42C, 50C and 60C, respectively. The presentation slides have been added to the article.

  4. Design and validation of a critical pathway for hospital management of patients with severe traumatic brain injury.

    Science.gov (United States)

    Espinosa-Aguilar, Amilcar; Reyes-Morales, Hortensia; Huerta-Posada, Carlos E; de León, Itzcoatl Limón-Pérez; López-López, Fernando; Mejía-Hernández, Margarita; Mondragón-Martínez, María A; Calderón-Téllez, Ligia M; Amezcua-Cuevas, Rosa L; Rebollar-González, Jorge A

    2008-05-01

    Critical pathways for the management of patients with severe traumatic brain injury (STBI) may contribute to reducing the incidence of hospital complications, length of hospitalization stay, and cost of care. Such pathways have previously been developed for departments with significant resource availability. In Mexico, STBI is the most important cause of complications and length of stay in neurotrauma services at public hospitals. Although current treatment is designed basically in accordance with the Brain Trauma Foundation guidelines, shortfalls in the availability of local resources make it difficult to comply with these standards, and no critical pathway is available that accords with the resources of public hospitals. The purpose of the present study was to design and validate a critical pathway for managing STBI patients that would be suitable for implementation in neurotrauma departments of middle-income countries. The study comprised two phases: design (through literature review and a design plan) and validation (content, construct, and appearance) of the critical pathway. The validated critical pathway for managing STBI patients entails four sequential subprocesses summarizing the hospital's care procedures, and includes three components: (1) nodes and criteria (in some cases, indicators are also included); (2) the health team members in charge of the patient; (3) the maximum estimated time for compliance with recommendations. This validated critical pathway is based on current scientific evidence and accords with the availability of resources of middle-income countries.

  5. RAJA Performance Suite

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    The RAJA Performance Suite is designed to evaluate performance of the RAJA performance portability library on a wide variety of important high performance computing (HPC) algorithmic kernels. These kernels assess compiler optimizations and various parallel programming model backends accessible through RAJA, such as OpenMP, CUDA, etc. The initial version of the suite contains 25 computational kernels, each of which appears in 6 variants: Baseline Sequential, RAJA Sequential, Baseline OpenMP, RAJA OpenMP, Baseline CUDA, RAJA CUDA. All variants of each kernel perform essentially the same mathematical operations and the loop body code for each kernel is identical across all variants. There are a few kernels, such as those that contain reduction operations, that require CUDA-specific coding for their CUDA variants. Actual computer instructions executed, and how they run in parallel, differ depending on the parallel programming model backend used and which optimizations are performed by the compiler used to build the Performance Suite executable. The Suite will be used primarily by RAJA developers to perform regular assessments of RAJA performance across a range of hardware platforms and compilers as RAJA features are being developed. It will also be used by LLNL hardware and software vendor partners for defining requirements for future computing platform procurements and acceptance testing. In particular, the RAJA Performance Suite will be used for compiler acceptance testing of the upcoming CORAL/Sierra machine (initial LLNL delivery expected in late 2017/early 2018) and the CORAL-2 procurement. The Suite will also be used to generate concise source code reproducers of compiler and runtime issues we uncover so that we may provide them to relevant vendors to be fixed.

  6. Validation of KENO V.a for criticality safety calculations involving WR-1 fast-neutron fuel arrangements

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, I. C.

    1991-07-15

    The KENO V.a criticality safety code, used with the SCALE 27-energy-group ENDF/B-IV-based cross-section library, has been validated for low-enriched uranium carbide (UC) WR-1 fast-neutron (FN) fuel arrangements. Because of a lack of relevant experimental data for UC fuel in the published literature, the validation is based primarily on calculational comparisons with critical experiments for fuel types with a range of enrichments and densities that cover those of the FN UC fuel. The ability of KENO V.a to handle the unique annular pin arrangement of the WR-1 FN fuel bundle was established using a comparison with the MCNP3B code used with a continuous-energy ENDF/B-V-based cross-section library. This report is part of the AECL-10146 report series documenting the validation of the KENO V.a criticality safety code.

  7. Same admissions tools, different outcomes: a critical perspective on predictive validity in three undergraduate medical schools.

    Science.gov (United States)

    Edwards, Daniel; Friedman, Tim; Pearce, Jacob

    2013-12-27

    Admission to medical school is one of the most highly competitive entry points in higher education. Considerable investment is made by universities to develop selection processes that aim to identify the most appropriate candidates for their medical programs. This paper explores data from three undergraduate medical schools to offer a critical perspective on predictive validity in medical admissions. This study examined 650 undergraduate medical students from three Australian universities as they progressed through the initial years of medical school (accounting for approximately 25 per cent of all commencing undergraduate medical students in Australia in 2006 and 2007). Admissions criteria (aptitude test score based on UMAT, school result and interview score) were correlated with GPA over four years of study. Standard regression of each of the three admissions variables on GPA, for each institution at each year level, was also conducted. Overall, the data showed positive correlations between performance in medical school, school achievement and UMAT, but not interview. However, there were substantial differences between schools, across year levels, and within sections of the UMAT. Despite this, each admission variable was shown to add towards explaining course performance, net of the other variables. The findings suggest the strength of multiple admissions tools in predicting outcomes of medical students. However, they also highlight the large differences in outcomes achieved by different schools, thus emphasising the pitfalls of generalising results from predictive validity studies without recognising the diverse ways in which they are designed and the variation in the institutional contexts in which they are administered. The assumption that high positive correlations are desirable (or even expected) in these studies is also problematised.

  8. Phenomenological modeling of critical heat flux: The GRAMP code and its validation

    International Nuclear Information System (INIS)

    Ahmad, M.; Chandraker, D.K.; Hewitt, G.F.; Vijayan, P.K.; Walker, S.P.

    2013-01-01

    Highlights: ► Assessment of CHF limits is vital for LWR optimization and safety analysis. ► Phenomenological modeling is a valuable adjunct to pure empiricism. ► It is based on empirical representations of the (several, competing) phenomena. ► Phenomenological modeling codes making ‘aggregate’ predictions need careful assessment against experiments. ► The physical and mathematical basis of a phenomenological modeling code GRAMP is presented. ► The GRAMP code is assessed against measurements from BARC (India) and Harwell (UK), and the Look Up Tables. - Abstract: Reliable knowledge of the critical heat flux is vital for the design of light water reactors, for both safety and optimization. The use of wholly empirical correlations, or equivalently “Look Up Tables”, can be very effective, but is generally less so in more complex cases, in particular cases where the heat flux is axially non-uniform. Phenomenological models are in principle more able to take into account a wider range of conditions, with a less comprehensive coverage of experimental measurements. These models are themselves in part based upon empirical correlations, albeit of the more fundamental individual phenomena occurring rather than the aggregate behaviour, and as such they too require experimental validation. In this paper we present the basis of a general-purpose phenomenological code, GRAMP, and then use two independent ‘direct’ sets of measurements, from BARC in India and from Harwell in the United Kingdom, together with the large dataset embodied in the Look Up Tables, to perform a validation exercise on it. Very good agreement between predictions and experimental measurements is observed, adding to the confidence with which the phenomenological model can be used. Remaining important uncertainties in the phenomenological modeling of CHF, namely the importance of the initial entrained fraction on entry to annular flow and the influence of the heat flux on entrainment rate, are also discussed.

  9. Validation of an e-Learning 3.0 Critical Success Factors Framework: A Qualitative Research

    Directory of Open Access Journals (Sweden)

    Paula Miranda

    2017-09-01

    Full Text Available Aim/Purpose: As e-Learning 3.0 evolves from a theoretical construct into an actual solution for online learning, it becomes crucial to accompany this progress by scrutinising the elements that are at the origin of its success. Background: This paper outlines a framework of e-Learning 3.0’s critical success factors and its empirical validation. Methodology: The framework is the result of an extensive literature review and its empirical substantiation derives from semi-structured interviews with e-Learning experts. Contribution: The viewpoints of the experts enable the confirmation and the refinement of the original framework and serve as a foundation for the prospective implementation of e-Learning 3.0. Findings: The analysis of the interviews demonstrates that e-Learning 3.0 remains in its early stages with a reticent dissemination. Nonetheless, the interviewees invoked factors related to technology, content and stakeholders as being critical for the success of this new phase of e-Learning. Recommendations for Practitioners: Practitioners can use the framework as a guide for promoting and implementing effective e-Learning 3.0 initiatives. Recommendation for Researchers: As a new phenomenon with uncharted potential, e-Learning 3.0 should be placed at the centre of educational research. Impact on Society: The understanding of what drives the success of e-Learning 3.0 is fundamental for its implementation and for the progress of online education in this new stage of its evolution. Future Research: Future research ventures can include the design of quantitative and self-administered data collection instruments that can provide further insight into the elements of the framework.

  10. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    Full Text Available The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest versions of nuclear data libraries based on the ENDF format. The benchmark has been executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which allows the accuracy of neutron transport calculations to be improved and may help in designing high performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which in turn depend on the accuracy of nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTRC geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The keff discrepancies have been observed and show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we proposed appropriate corrections to experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new nuclear data libraries.

  11. Measuring Critical Thinking in Physics: Development and Validation of a Critical Thinking Test in Electricity and Magnetism

    Science.gov (United States)

    Tiruneh, Dawit Tibebu; De Cock, Mieke; Weldeslassie, Ataklti G.; Elen, Jan; Janssen, Rianne

    2017-01-01

    Although the development of critical thinking (CT) is a major goal of science education, adequate emphasis has not been given to the measurement of CT skills in specific science domains such as physics. Recognizing that adequately assessing CT implies the assessment of both domain-specific and domain-general CT skills, this study reports on the…

  12. Comparison of mortality prediction models and validation of SAPS II in critically ill burns patients.

    Science.gov (United States)

    Pantet, O; Faouzi, M; Brusselaers, N; Vernay, A; Berger, M M

    2016-06-30

    Specific burn outcome prediction scores such as the Abbreviated Burn Severity Index (ABSI), Ryan, Belgian Outcome of Burn Injury (BOBI) and revised Baux scores have been extensively studied. Validation studies of the critical care score SAPS II (Simplified Acute Physiology Score) have included burns patients but have not addressed them as a cohort. The study aimed to compare their performance in a Swiss burns intensive care unit (ICU) and to observe whether they were affected by a standardized definition of inhalation injury. We conducted a retrospective cohort study, including all consecutive ICU burn admissions (n=492) between 1996 and 2013; 5 epochs were defined by protocol changes. Data required for SAPS II calculation were collected, along with total body surface area burned (TBSA) and inhalation injury (systematic standardized diagnosis since 2006). Study epochs were compared (χ2 test, ANOVA). Score performance was assessed by receiver operating characteristic curve analysis. SAPS II performed well (AUC 0.89), particularly in burns <40% TBSA. The Ryan and BOBI scores were least accurate, as they heavily weight inhalation injury.
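
    The ROC-based performance assessment mentioned above can be sketched with synthetic data; this assumes scikit-learn is available and uses no numbers from the study:

    ```python
    # Discrimination of a severity score against mortality via ROC AUC.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    died = rng.integers(0, 2, size=200)                   # outcome: 0/1
    score = 30 + 25 * died + rng.normal(0, 10, size=200)  # SAPS II-like score

    auc = roc_auc_score(died, score)
    print(f"AUC = {auc:.2f}")  # 0.5 = no discrimination, 1.0 = perfect
    ```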

  13. Validation of the ABBN/CONSYST constants system. Part 2: Validation through the critical experiments on cores with uranium solutions

    International Nuclear Information System (INIS)

    Ivanova, T.T.; Manturov, G.N.; Nikolaev, M.N.; Rozhikhin, E.V.; Semenov, M.Yu.; Tsiboulia, A.M.

    1999-01-01

    Results of calculations of critical assemblies with the cores of uranium solutions for the considered series of the experiments are presented in this paper. The conclusions about acceptability of the ABBN-93.1 cross sections for the calculations of such systems are made. (author)

  14. Validation of the Continuous-Energy Monte Carlo Criticality-Safety Analysis System MVP and JENDL-3.2 Using the Internationally Evaluated Criticality Benchmarks

    International Nuclear Information System (INIS)

    Mitake, Susumu

    2003-01-01

    Validation of the continuous-energy Monte Carlo criticality-safety analysis system, comprising the MVP code and neutron cross sections based on JENDL-3.2, was examined using benchmarks evaluated in the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Eight experiments (116 configurations) for the plutonium solution and plutonium-uranium mixture systems performed at Valduc, Battelle Pacific Northwest Laboratories, and other facilities were selected and used in the studies. The averaged multiplication factors calculated with MVP and MCNP-4B using the same neutron cross-section libraries based on JENDL-3.2 were in good agreement. Based on methods provided in the Japanese nuclear criticality-safety handbook, the estimated criticality lower-limit multiplication factors to be used as a subcriticality criterion for the criticality-safety evaluation of nuclear facilities were obtained. The analysis proved the applicability of the MVP code to the criticality-safety analysis of nuclear fuel facilities, particularly to the analysis of systems fueled with plutonium and in homogeneous and thermal-energy conditions

  15. A critical evaluation of the validity and the reliability of global competency constructs for supervisor assessment of junior medical trainees

    NARCIS (Netherlands)

    McGill, D.A.; Vleuten, C.P.M. van der; Clarke, M.J.

    2013-01-01

    Supervisor assessments are critical for both formative and summative assessment in the workplace. Supervisor ratings remain an important source of such assessment in many educational jurisdictions even though there is ambiguity about their validity and reliability. The aim of this evaluation is to…

  16. Validation of science virtual test to assess 8th grade students' critical thinking on living things and environmental sustainability theme

    Science.gov (United States)

    Rusyati, Lilit; Firman, Harry

    2017-05-01

    This research was motivated by the importance of multiple-choice questions that reflect the elements and sub-elements of critical thinking, and by the implementation of computer-based testing. The method used was descriptive research, profiling the validation of a science virtual test to measure students' critical thinking in junior high school. The participants were 8th grade junior high school students (14 years old), with science teachers and experts as validators. The instruments used to capture the necessary data were an expert judgment sheet, a legibility test sheet, and the science virtual test package in multiple-choice form with four possible answers. There were four steps in validating the science virtual test to measure students' critical thinking on the theme of "Living Things and Environmental Sustainability" in 7th grade junior high school: analysis of core competence and basic competence based on the 2013 curriculum, expert judgment, a legibility test, and trial tests (limited and large-scale). Based on the trial tests, items were classified as accepted, accepted but needing revision, or rejected. The reliability of the test was α = 0.747, categorized as 'high', meaning the test instrument is reliable and highly consistent. The validity coefficient Rxy = 0.63 is likewise categorized as 'high' according to the interpretation of Rxy (correlation).

  17. Investigation of reliability, validity and normality Persian version of the California Critical Thinking Skills Test; Form B (CCTST

    Directory of Open Access Journals (Sweden)

    Khallli H

    2003-04-01

    Full Text Available Background: To evaluate the effectiveness of present educational programs in terms of students' achievement of problem solving, decision making and critical thinking skills, reliable, valid and standardized instruments are needed. Purposes: To investigate the reliability, validity and norms of the CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with one correct answer, in the five critical thinking (CT) cognitive skills domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran, Iran (Tehran and Shahid Beheshti Universities), selected through random sampling. In order to determine the face and content validity, the test was translated and edited by Persian and English language professors and researchers. It was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined by internal consistency using KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group differences. Results: The reliability coefficient of the test was 0.62. Factor analysis indicated that the CCTST is formed of 5 factors (elements), namely: analysis, evaluation, inference, inductive reasoning and deductive reasoning. The internal consistency method showed that all subscales have high, positive correlations with the total test score. The group difference method between nursing and philosophy students (n=50) indicated a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). Percentile norms show that the 50th percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively. Conclusions: The results revealed that the test is sufficiently reliable as a research tool, and all subscales measure a single construct (critical thinking) and are able to distinguish the…
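
    The KR-20 coefficient named above has a simple closed form, KR-20 = (k/(k-1))·(1 − Σ p_i·q_i / σ²_X); a minimal sketch with a made-up dichotomous score matrix (rows = examinees, columns = items):

    ```python
    import numpy as np

    # Hypothetical 0/1 item scores, not data from the study.
    scores = np.array([
        [1, 0, 1, 1, 0],
        [1, 1, 1, 0, 1],
        [0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1],
        [0, 1, 0, 1, 0],
    ])

    k = scores.shape[1]                          # number of items
    p = scores.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    var_total = scores.sum(axis=1).var(ddof=1)   # variance of total scores

    kr20 = (k / (k - 1)) * (1.0 - (p * q).sum() / var_total)
    print(f"KR-20 = {kr20:.3f}")
    ```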

  18. Assessing Critical Thinking in Higher Education: The HEIghten™ Approach and Preliminary Validity Evidence

    Science.gov (United States)

    Liu, Ou Lydia; Mao, Liyang; Frankel, Lois; Xu, Jun

    2016-01-01

    Critical thinking is a learning outcome highly valued by higher education institutions and the workforce. The Educational Testing Service (ETS) has designed a next generation assessment, the HEIghten™ critical thinking assessment, to measure students' critical thinking skills in analytical and synthetic dimensions. This paper introduces the…

  19. Learning DHTMLX suite UI

    CERN Document Server

    Geske, Eli

    2013-01-01

    A fast-paced, example-based guide to learning DHTMLX. "Learning DHTMLX Suite UI" is for web designers who have a basic knowledge of JavaScript and who are looking for powerful tools that will give them an extra edge in their own application development. This book is also useful for experienced developers who wish to get started with DHTMLX without going through the trouble of learning its quirks through trial and error. Readers are expected to have some knowledge of JavaScript, HTML, Document Object Model, and the ability to install a local web server.

  20. Critical validity assessment of theoretical models: charge-exchange at intermediate and high energies

    Science.gov (United States)

    Belkić, Dževad

    1999-06-01

    Exact comprehensive computations are carried out by means of four leading second-order approximations yielding differential cross sections dQ/dΩ for the basic charge exchange process H+ + H(1s) → H(1s) + H+ at intermediate and high energies. The obtained extensive set of results is thoroughly tested against all the existing experimental data with the purpose of critically assessing the validity of the boundary corrected second-Born (CB2), continuum-distorted wave (CDW), impulse approximation (IA) and the reformulated impulse approximation (RIA). The conclusion which emerges from this comparative study clearly indicates that the RIA agrees most favorably with the measurements available over a large energy range 25 keV-5 MeV. Such a finding reaffirms the few-particle quantum scattering theory which imposes several strict conditions on adequate second-order methods. These requirements satisfied by the RIA are: (i) normalisations of all the scattering wave functions, (ii) correct boundary conditions in both entrance and exit channels, (iii) introduction of a mathematically justified two-center continuum state for the sum of an attractive and a repulsive Coulomb potential with the same interaction strength, (iv) inclusion of the multiple scattering effects neglected in the IA, (v) a proper description of the Thomas double scattering in good agreement with the experiments and without any unobserved peak splittings. Nevertheless, the performed comparative analysis of the above four approximations indicates that none of the methods is free from some basic shortcomings. Despite its success, the RIA remains essentially a high-energy model like the other three methods under study. More importantly, their perturbative character leaves virtually no room for further systematic improvements, since the neglected higher-order terms are prohibitively tedious for practical purposes and have never been computed exactly. To bridge this gap, we presently introduce the variational Padé…

  1. Critical validity assessment of theoretical models: charge-exchange at intermediate and high energies

    International Nuclear Information System (INIS)

    Belkic, Dzevad

    1999-01-01

    Exact comprehensive computations are carried out by means of four leading second-order approximations yielding differential cross sections dQ/dΩ for the basic charge exchange process H+ + H(1s) → H(1s) + H+ at intermediate and high energies. The obtained extensive set of results is thoroughly tested against all the existing experimental data with the purpose of critically assessing the validity of the boundary corrected second-Born (CB2), continuum-distorted wave (CDW), impulse approximation (IA) and the reformulated impulse approximation (RIA). The conclusion which emerges from this comparative study clearly indicates that the RIA agrees most favorably with the measurements available over a large energy range 25 keV-5 MeV. Such a finding reaffirms the few-particle quantum scattering theory which imposes several strict conditions on adequate second-order methods. These requirements satisfied by the RIA are: (i) normalisations of all the scattering wave functions, (ii) correct boundary conditions in both entrance and exit channels, (iii) introduction of a mathematically justified two-center continuum state for the sum of an attractive and a repulsive Coulomb potential with the same interaction strength, (iv) inclusion of the multiple scattering effects neglected in the IA, (v) a proper description of the Thomas double scattering in good agreement with the experiments and without any unobserved peak splittings. Nevertheless, the performed comparative analysis of the above four approximations indicates that none of the methods is free from some basic shortcomings. Despite its success, the RIA remains essentially a high-energy model like the other three methods under study. More importantly, their perturbative character leaves virtually no room for further systematic improvements, since the neglected higher-order terms are prohibitively tedious for practical purposes and have never been computed exactly. To bridge this gap, we presently introduce the variational Padé…

  2. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  3. Clementine sensor suite

    Energy Technology Data Exchange (ETDEWEB)

    Ledebuhr, A.G. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    LLNL designed and built the suite of six miniaturized light-weight space-qualified sensors utilized in the Clementine mission. A major goal of the Clementine program was to demonstrate technologies originally developed for Ballistic Missile Defense Organization Programs. These sensors were modified to gather data from the moon. This overview presents each of these sensors and some preliminary on-orbit performance estimates. The basic subsystems of these sensors include optical baffles to reject off-axis stray light, light-weight ruggedized optical systems, filter wheel assemblies, radiation tolerant focal plane arrays, radiation hardened control and readout electronics and low mass and power mechanical cryogenic coolers for the infrared sensors. Descriptions of each sensor type are given along with design specifications, photographs and on-orbit data collected.

  4. Validation of the Monte Carlo Criticality Program KENO V.a for highly-enriched uranium systems

    International Nuclear Information System (INIS)

    Knight, J.R.

    1984-11-01

    A series of calculations based on critical experiments have been performed using the KENO V.a Monte Carlo Criticality Program for the purpose of validating KENO V.a for use in evaluating Y-12 Plant criticality problems. The experiments were reflected and unreflected systems of single units and arrays containing highly enriched uranium metal or uranium compounds. Various geometrical shapes were used in the experiments. The SCALE control module CSAS25 with the 27-group ENDF/B-IV cross-section library was used to perform the calculations. Some of the experiments were also calculated using the 16-group Hansen-Roach library. Results are presented in a series of tables and discussed. Results show that the criteria established for the safe application of the KENO IV program may also be used for KENO V.a results.

  5. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a methodology for developing verification and validation (V and V) guidelines for safety-critical software in NPP safety systems. It provides a V and V guideline for the planning phase of the NPP safety system, together with critical safety items, for example, the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities, with task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; production of the software verification and validation plan; and the safety-critical software V and V methodology. (author). 11 refs.

  6. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a methodology for developing verification and validation (V and V) guidelines for safety-critical software in NPP safety systems. It provides a V and V guideline for the planning phase of the NPP safety system, together with critical safety items, for example, the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities, with task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; production of the software verification and validation plan; and the safety-critical software V and V methodology. (author). 11 refs.

  7. Numerical simulation and experimental validation of internal heat exchanger influence on CO{sub 2} trans-critical cycle performance

    Energy Technology Data Exchange (ETDEWEB)

    Rigola, Joaquim; Ablanque, Nicolas; Perez-Segarra, Carlos D.; Oliva, Assensi [Centre Tecnologic de Transferencia de Calor (CTTC), Universitat Politecnica de Catalunya (UPC), ETSEIAT, C. Colom 11, 08222 Terrassa (Barcelona) (Spain)]

    2010-06-15

    The present paper is a numerical and experimental comparative study of the whole vapour compression refrigerating cycle in general, and reciprocating compressors in particular, with the aim of showing the possibilities that CO{sub 2} offers for commercial refrigeration, considering a single-stage trans-critical cycle using semi-hermetic reciprocating compressors in small cooling capacity systems. The work is focussed on the influence of using an internal heat exchanger (IHX) to improve the cycle performance under real working conditions. In order to validate the numerical results, an experimental unit specially designed and built to analyze trans-critical refrigerating equipment with an IHX was used. The numerical results and experimental data show reasonably good agreement, and the global comparison confirms the improvement in cooling capacity and COP when an IHX is included in the CO{sub 2} trans-critical cycle. (author)

  8. Literature research concerning alternative methods for validation of criticality calculation systems; Literaturrecherche zu alternativen Daten und Methoden zur Validierung von Kritikalitaetsrechensystemen

    Energy Technology Data Exchange (ETDEWEB)

    Behler, Matthias

    2016-05-15

    Besides radiochemical analysis of irradiated fuel and critical experiments, which have become well-established bases for the validation of depletion codes and criticality codes respectively, the results of oscillation experiments and the operating conditions of power and research reactors can also provide useful information for the validation of these codes. Based on a literature review, the potential of oscillation experiment measurements for the validation of criticality codes is estimated. It is found that the reactivity measurements for actinides and fission products within the CERES program on the reactors DIMPLE (Winfrith, UK) and MINERVE (Cadarache, France) can be a valuable addition to the commonly used critical experiments for criticality code validation. However, while approaches exist, there is as yet no generally satisfactory solution for integrating the reactivity measurements into a quantitative bias determination for the neutron multiplication factor of typical application cases, including irradiated spent fuel outside reactor cores, calculated using common criticality codes.

  9. Validity of accessible critical nitrogen dilution curves in perennial ryegrass for seed production

    DEFF Research Database (Denmark)

    Gislum, René; Boelt, Birte

    2009-01-01

    The objectives were to test whether accessible critical nitrogen dilution curves (NDCs) for rapeseed, pea, alfalfa, tall fescue, wheat, annual ryegrass and linseed could be used in grass species for seed production, and to develop a critical NDC specifically for grass species grown for seed production. ... the NDCs for tall fescue, alfalfa, pea and rapeseed. These findings should be used to continue the interesting and necessary work on developing an NDC for grass species grown for seed production.

  10. Validation of MCNP6.1 for Criticality Safety of Pu-Metal, -Solution, and -Oxide Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kahler, III, Albert C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kersting, Alyssa R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, Donald K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walker, Jessie L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-13

    Guidance is offered to the Los Alamos National Laboratory Nuclear Criticality Safety division towards developing an Upper Subcritical Limit (USL) for MCNP6.1 calculations with ENDF/B-VII.1 nuclear data for three classes of problems: Pu-metal, -solution, and -oxide systems. A benchmark suite containing 1,086 benchmarks is prepared, and a sensitivity/uncertainty (S/U) method with a generalized linear least squares (GLLS) data adjustment is used to reject outliers, bringing the total to 959 usable benchmarks. For each class of problem, S/U methods are used to select relevant experimental benchmarks, and the calculational margin is computed using extreme value theory. A portion of the margin of subcriticality is defined considering both a detection limit for errors in codes and data and uncertainty/variability in the nuclear data library. The latter employs S/U methods with a GLLS data adjustment to find representative nuclear data covariances constrained by integral experiments, which are then used to compute uncertainties in keff from nuclear data. The USLs for the classes of problems are as follows: Pu metal, 0.980; Pu solutions, 0.973; dry Pu oxides, 0.978; dilute Pu oxide-water mixes, 0.970; and intermediate-spectrum Pu oxide-water mixes, 0.953.
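
    The USL construction above rests on S/U benchmark selection, extreme value theory, and GLLS-constrained covariances, none of which is reproduced here. As a much simpler, hedged illustration of the basic ingredients only, the sketch below derives a naive USL from hypothetical benchmark keff results; the coverage factor and administrative margin are invented placeholders, not the report's values.

    # Illustrative sketch only (not the LANL S/U method): a naive upper
    # subcritical limit from benchmark keff results. All numbers are invented.
    import statistics

    # Calculated keff for benchmarks whose experimental keff is 1.0.
    calculated_keff = [0.9981, 1.0012, 0.9995, 0.9968, 1.0021, 0.9990]

    bias = statistics.mean(calculated_keff) - 1.0   # mean deviation from critical
    sigma = statistics.stdev(calculated_keff)       # spread over the suite

    k_factor = 2.0       # coverage factor; a tolerance-interval factor in practice
    admin_margin = 0.02  # margin of subcriticality chosen by the safety analyst

    # Only negative bias is credited, a common conservatism.
    usl = 1.0 + min(bias, 0.0) - k_factor * sigma - admin_margin
    print(f"bias = {bias:+.5f}, sigma = {sigma:.5f}, USL = {usl:.4f}")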

  11. Enhancing students' learning in problem based learning: validation of a self-assessment scale for active learning and critical thinking.

    Science.gov (United States)

    Khoiriyah, Umatul; Roberts, Chris; Jorm, Christine; Van der Vleuten, C P M

    2015-08-26

    Problem based learning (PBL) is a powerful learning activity, but fidelity to intended models may slip and student engagement wane, negatively impacting learning processes and outcomes. One potential solution to this degradation is to encourage self-assessment in the PBL tutorial. Self-assessment is a central component of the self-regulation of student learning behaviours. There are few measures to investigate self-assessment relevant to PBL processes. We developed a Self-assessment Scale on Active Learning and Critical Thinking (SSACT) to address this gap, and sought to demonstrate evidence of its validity in the context of PBL by exploring its internal structure. We used a mixed methods approach to scale development. We developed scale items from a qualitative investigation, a literature review, and consideration of existing tools used to study the PBL process. Expert review panels evaluated its content; a process of validation subsequently reduced the pool of items. We used structural equation modelling to undertake a confirmatory factor analysis (CFA) of the SSACT and computed coefficient alpha. The 14-item SSACT consisted of two domains, "active learning" and "critical thinking." The factorial validity of the SSACT was evidenced by all items loading significantly on their expected factors, a good model fit for the data, and good stability across two independent samples. Each subscale had good internal reliability (>0.8), and the subscales correlated strongly with each other. The SSACT has sufficient evidence of its validity to support its use in the PBL process to encourage students to self-assess. The implementation of the SSACT may assist students to improve the quality of their learning in achieving PBL goals such as critical thinking and self-directed learning.

  12. Pain Assessment in Critically Ill Adult Patients: Validity and Reliability Research of the Turkish Version of the Critical-Care Pain Observation Tool

    Directory of Open Access Journals (Sweden)

    Onur Gündoğan

    2016-12-01

    Objective: The Critical-Care Pain Observation Tool (CPOT) and the Behavioral Pain Scale (BPS) are behavioral pain assessment scales for unconscious intensive care unit (ICU) patients. The aim was to determine the validity and reliability of the Turkish version of the CPOT in mechanically ventilated adult ICU patients. Material and Method: This prospective observational cohort study included 50 mechanically ventilated mixed ICU patients who were unable to report pain. The CPOT and BPS were translated into Turkish, and language validity was assessed by ten intensive care specialists. Pain was assessed during painless and painful routine care procedures using the CPOT and the BPS by a resident and an intensivist concomitantly. Test reliability, interrater reliability, and validity of the CPOT and the BPS were evaluated. Results: The mean age was 57.4 years and the mean APACHE II score was 18.7. A total of 100 assessments were recorded from 50 patients using the CPOT and BPS. Scores on both the CPOT and the BPS were significantly higher during painful procedures than during painless procedures. The agreement between the CPOT and BPS during painful and painless stimuli ranged as follows: sensitivity 66.7%-90.3%; specificity 89.7%-97.9%; kappa value 0.712-0.892. The agreement between resident and intensivist during painful and painless stimuli ranged from 97% to 100%, with kappa values between 0.904 and 1.0. Conclusion: The Turkish version of the CPOT showed good correlation with the BPS. Interrater reliability between resident and intensivist was good. The study showed that the Turkish versions of the BPS and CPOT are reliable and valid tools for assessing pain in daily clinical practice in intubated, unconscious, mechanically ventilated ICU patients.
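
    For readers unfamiliar with the agreement statistics quoted above, the following minimal sketch computes sensitivity, specificity and Cohen's kappa from paired binary "pain present" ratings; the data are invented for illustration.

    # Toy agreement statistics for two binary raters/scales (invented data).
    def confusion(truth, pred):
        tp = sum(t and p for t, p in zip(truth, pred))
        tn = sum((not t) and (not p) for t, p in zip(truth, pred))
        fp = sum((not t) and p for t, p in zip(truth, pred))
        fn = sum(t and (not p) for t, p in zip(truth, pred))
        return tp, tn, fp, fn

    def kappa(truth, pred):
        tp, tn, fp, fn = confusion(truth, pred)
        n = tp + tn + fp + fn
        po = (tp + tn) / n                                             # observed agreement
        pe = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n ** 2  # chance agreement
        return (po - pe) / (1 - pe)

    truth = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # e.g. BPS "pain present"
    pred  = [1, 0, 0, 0, 1, 0, 1, 0, 1, 1]   # e.g. CPOT "pain present"
    tp, tn, fp, fn = confusion(truth, pred)
    print("sensitivity:", tp / (tp + fn))
    print("specificity:", tn / (tn + fp))
    print("kappa      :", kappa(truth, pred))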

  13. An analysis of critical flow for steam and water extending to supercritical conditions with experimental validation

    International Nuclear Information System (INIS)

    Porter, W.H.L.

    1985-01-01

    The basic method used in this paper for establishing the critical flow of a water-steam mixture, covering subcooled water conditions, the quality range and superheated steam conditions, has already been reported, and the methods are once more summarised in the next section. These methods can be extended to any fluid, and results have been reported for Freon and dissociating NO{sub 2}. If an extended or complex length of pipe is involved before the position where critical flow is established, a more elaborate method is required which involves establishing the losses down the pipe. A code RAPVOID is available for analysing such cases.

  14. Validation of lactate clearance at 6 h for mortality prediction in critically ill children

    OpenAIRE

    Rajeev Kumar; Nirmal Kumar

    2016-01-01

    Background and Aims: To validate lactate clearance (LC) at 6 h for mortality prediction in Pediatric Intensive Care Unit (PICU)-admitted patients and to compare it with the pediatric index of mortality 2 (PIM 2) score. Design: A prospective, observational study in a tertiary care center. Materials and Methods: Children
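
    The abstract is truncated, but lactate clearance at a given time is conventionally defined as the percentage fall from the initial value; a minimal sketch assuming that standard definition:

    # Lactate clearance, LC(t) = (initial - value_at_t) / initial * 100 (%).
    def lactate_clearance(initial_mmol_l: float, later_mmol_l: float) -> float:
        """Percent clearance; negative values mean lactate rose instead."""
        return (initial_mmol_l - later_mmol_l) / initial_mmol_l * 100.0

    print(lactate_clearance(4.0, 2.4))  # 40.0, i.e. 40% cleared at 6 h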

  15. The concept of borderline conditions: a critical comment on validity issues

    DEFF Research Database (Denmark)

    Parnas, Josef

    1994-01-01

    ... is dominated by an exaggerated quest for reliability at the expense of concern with validity issues. Concepts of operational criteria, polythetic-prototypic systems, and epistemic peculiarities of psychiatric, clinical typification are briefly exposed. It is suggested that future scientific progress...

  16. The concept of borderline conditions: a critical comment on validity issues

    DEFF Research Database (Denmark)

    Parnas, Josef

    1994-01-01

    ... is dominated by an exaggerated quest for reliability at the expense of concern with validity issues. Concepts of operational criteria, polythetic-prototypic systems, and epistemic peculiarities of psychiatric, clinical typification are briefly exposed. It is suggested that future scientific progress ... research in normal psychology. Some of the claims of Gunderson and Torgersen (this volume) are discussed.

  17. A Critical Assessment of the Quality and Validity of Composite Indicators of Innovation

    Energy Technology Data Exchange (ETDEWEB)

    Vértesy, D.

    2016-07-01

    While it is generally accepted that monitoring innovation system performance requires a set of indicators, there is a constant debate on whether and how composite indices can be used to summarize them. This paper enters this discussion by assessing the validity and quality of the most commonly used composite indicators of innovation. In our framework, the validity of an index relates to the link between component indicators or aggregates and the aspect(s) of national systems of innovation they seek to measure, while the quality of an indicator relates to its statistical properties. To better understand validity, we discuss how the evolution of the national system of innovation concept and its use in policy has shifted demand from advocacy functions to more analytical functions of composite indicators of innovation. We next examine selected composite indicators of innovation (WIPO-INSEAD's Global Innovation Indicator, the Summary Innovation Index and Innovation Output Indicator of the European Commission, and the Fraunhofer Innovation Index) in different contexts of external and internal validity and conduct global sensitivity analyses on them. Our policy-relevant findings highlight the need for analytically stronger composites built from a more limited set of indicators. We also found significant quality differences across the indices, as some included components which explain little or none of the variance in composite scores and were more sensitive to modeling choices. The indices studied differed in how validly they represented various innovation system functions and types of innovation, and showed information relevant for a broader or a more limited set of stakeholders. We argue that further development of innovation indicators should put more emphasis on identifying tradeoffs within innovation policy, and unintended consequences of innovative activities. (Author)
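
    As a generic, hedged illustration of the composite-indicator mechanics being assessed (normalisation, weighted aggregation, and the sensitivity of rankings to weighting choices), consider the sketch below; all indicator values and weight ranges are invented and do not reproduce any of the indices studied.

    # Toy composite indicator: min-max normalise components, aggregate with
    # weights, and probe ranking sensitivity to the weighting choice.
    import random

    countries = {"A": [55, 0.8, 120], "B": [70, 0.5, 200], "C": [40, 0.9, 90]}

    def minmax(values):
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

    names = list(countries)
    norm_cols = [minmax(col) for col in zip(*countries.values())]
    norm = {n: [col[i] for col in norm_cols] for i, n in enumerate(names)}

    def composite(scores, weights):
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    print({n: round(composite(s, [1, 1, 1]), 3) for n, s in norm.items()})

    random.seed(0)
    for _ in range(3):  # crude sensitivity check: rankings under random weights
        w = [random.uniform(0.5, 1.5) for _ in range(3)]
        print(sorted(names, key=lambda n: -composite(norm[n], w)))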

  18. Apparatus for storing protective suits

    International Nuclear Information System (INIS)

    Englemann, H.J.; Koller, J.; Schrader, H.R.; Schade, G.; Pedrerol, J.

    1975-01-01

    Arrangements are described for storing one or more protective suits when contaminated on the outside. In order to permit a person wearing a contaminated suit to leave a contaminated area safely, and without contaminating the environment, it has hitherto been the practice for the suit to be passed through a 'lock' and cleansed under decontaminating showers whilst still being worn. This procedure is time wasting and not always completely effective, and it may be necessary to provide a second suit for use whilst the first suit is being decontaminated. Repeated decontamination may also result in undue wear and tear. The arrangements described provide a 'lock' chamber in which a contaminated suit may be stowed away without its interior becoming contaminated, thus allowing repeated use by persons donning and shedding it. (U.K.)

  19. The German Generations and Gender Survey: Some Critical Reflections on the Validity of Fertility Histories

    Directory of Open Access Journals (Sweden)

    Michaela Kreyenfeld

    2013-01-01

    This paper validates the fertility histories of the German Generations and Gender Survey (GGS). Focusing on the 1930-69 cohorts of West German women, the total number of children, the parity distribution and the parity progression ratios are compared to external sources. One major result of this validation is that the German GGS understates fertility for the older cohorts and overstates it for the younger ones. We presume that two mechanisms are responsible for this pattern: on the one hand, children who have left the parental home are underreported in the retrospective fertility histories; on the other hand, women with small children are easier for the interviewer to reach. Taken together, these two mechanisms produce child counts that are too low for the older cohorts and too high for the younger ones. Extending the validation to marital histories revealed a similar bias. Our general conclusion from this investigation is that the German GGS should not be used for statistical analyses of cohort fertility and marriage trends. For subsequent surveys, we suggest integrating simple control questions into questionnaires with complex retrospective fertility and union histories.

  20. External validation of the Intensive Care National Audit & Research Centre (ICNARC) risk prediction model in critical care units in Scotland.

    Science.gov (United States)

    Harrison, David A; Lone, Nazir I; Haddow, Catriona; MacGillivray, Moranne; Khan, Angela; Cook, Brian; Rowan, Kathryn M

    2014-01-01

    Risk prediction models are used in critical care for risk stratification, summarising and communicating risk, supporting clinical decision-making and benchmarking performance. However, they require validation before they can be used with confidence, ideally using independently collected data from a different source to that used to develop the model. The aim of this study was to validate the Intensive Care National Audit & Research Centre (ICNARC) model using independently collected data from critical care units in Scotland. Data were extracted from the Scottish Intensive Care Society Audit Group (SICSAG) database for the years 2007 to 2009. Recoding and mapping of variables was performed, as required, to apply the ICNARC model (2009 recalibration) to the SICSAG data using standard computer algorithms. The performance of the ICNARC model was assessed for discrimination, calibration and overall fit and compared with that of the Acute Physiology And Chronic Health Evaluation (APACHE) II model. There were 29,626 admissions to 24 adult, general critical care units in Scotland between 1 January 2007 and 31 December 2009. After exclusions, 23,269 admissions were included in the analysis. The ICNARC model outperformed APACHE II on measures of discrimination (c index 0.848 versus 0.806), calibration (Hosmer-Lemeshow chi-squared statistic 18.8 versus 214) and overall fit (Brier's score 0.140 versus 0.157; Shapiro's R 0.652 versus 0.621). Model performance was consistent across the three years studied. The ICNARC model performed well when validated in an external population to that in which it was developed, using independently collected data.
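
    The discrimination and accuracy statistics quoted above (c index, Brier score) can be computed for any risk model from predicted probabilities and observed outcomes; a minimal sketch with synthetic data, using scikit-learn:

    # c index (equivalent to the area under the ROC curve for binary outcomes)
    # and Brier score for a set of model predictions. Data are synthetic.
    from sklearn.metrics import roc_auc_score, brier_score_loss

    y_true = [0, 0, 1, 0, 1, 1, 0, 1]                   # observed mortality
    y_prob = [0.1, 0.3, 0.7, 0.2, 0.9, 0.6, 0.4, 0.8]   # predicted risk

    print("c index    :", roc_auc_score(y_true, y_prob))
    print("Brier score:", brier_score_loss(y_true, y_prob))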

  1. Validation of the ASSERT subchannel code: Prediction of critical heat flux in standard and nonstandard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1995-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of Canada deuterium uranium (CANDU) pressurized heavy water reactor fuel channels and to provide a detailed prediction of the critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF in situations outside the existing experimental database. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variation, such as pressure tube creep and fuel element strain. The numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology are discussed. The evolutionary validation plan is also discussed and early validation exercises are summarized. More recent validation exercises in standard and nonstandard geometries are emphasized.

  2. Criticality safety validation: Simple geometry, single unit {sup 233}U systems

    Energy Technology Data Exchange (ETDEWEB)

    Putman, V.L.

    1997-06-01

    Typically used LMITCO criticality safety computational methods are evaluated for suitability when applied to INEEL {sup 233}U systems which can reasonably be modeled as simple-geometry, single-unit systems. Sixty-seven critical experiments of uranium highly enriched in {sup 233}U, including 57 aqueous solution, thermal-energy systems and 10 metal, fast-energy systems, were modeled. These experiments include 41 cylindrical and 26 spherical cores, and 41 reflected and 26 unreflected systems. No experiments were found for intermediate-neutron-energy ranges, or with interstitial non-hydrogenous materials typical of waste systems, mixed {sup 233}U and plutonium, or reflectors such as steel, lead, or concrete. No simple-geometry experiments were found with cubic or annular cores, or approximating infinite-sea systems. Calculations were performed with various tools and methodologies. Nine cross-section libraries, based on ENDF/B-IV, -V, or -VI.2, or on Hansen-Roach source data, were used with the cross-section processing methods of MCNP or SCALE. The k{sub eff} calculations were performed with the neutral-particle transport and Monte Carlo methods of the criticality codes DANT, MCNP 4A, and KENO Va.

  3. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    International Nuclear Information System (INIS)

    Hussein, M.S; Lewis, B.J.; Bonin, H.W.

    2013-01-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k{sub eff} calculated numerically and independently from simulations of the DCA by the MCNP5 and Serpent codes are compared with the multiplication factors k{sub eff} calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k{sub eff} calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for multipoint coupled reactor calculations. (author)

  4. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S, E-mail: mohamed.hussein@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada); Lewis, B.J., E-mail: Brent.Lewis@uoit.ca [Univ. of Ontario Inst. of Technology, Faculty of Energy Systems and Nuclear Science, Oshawa, Ontario (Canada); Bonin, H.W., E-mail: bonin-h@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k{sub eff} calculated numerically and independently from simulations of the DCA by MCNP5 and Serpent codes are compared with the multiplication factors k{sub eff} calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k{sub eff} calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for the multipoint coupled reactor calculations. (author)
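
    As a hedged illustration of the multipoint formalism (not necessarily the exact formulation used in the paper), a common two-point coupled-core model takes the system multiplication factor to be the dominant eigenvalue of the 2x2 matrix built from each region's self-multiplication and the inter-region coupling coefficients; all numbers below are invented.

    # Two-point coupled-core sketch: keff as the dominant eigenvalue of the
    # coupling matrix. Values are illustrative only.
    import numpy as np

    k11, k22 = 0.95, 0.93   # self-multiplication of regions 1 and 2
    k12, k21 = 0.06, 0.05   # coupling coefficients (region <- other region)

    K = np.array([[k11, k12],
                  [k21, k22]])
    keff = max(np.linalg.eigvals(K).real)
    print(f"coupled-system keff = {keff:.4f}")   # ~0.9957 for these numbers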

  5. Advanced Gas Sensing Technology for Space Suits, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced space suits require lightweight, low-power, durable sensors for monitoring critical life support materials. No current compact sensors have the tolerance...

  6. Multi-Trait Multi-Method Matrices for the Validation of Creativity and Critical Thinking Assessments for Secondary School Students in England and Greece

    Directory of Open Access Journals (Sweden)

    Ourania Maria Ventista

    2017-08-01

    The aim of this paper is the validation of measurement tools which assess critical thinking and creativity as general constructs instead of subject-specific skills. Specifically, this research examined whether there is convergent and discriminant (or divergent) validity between measurement tools of creativity and critical thinking. For this purpose, the multi-trait multi-method matrix suggested by Campbell and Fiske (1959) was used. This matrix presents the correlations of the scores that students obtain in different assessments in order to reveal whether the assessments measure the same or different constructs. Specifically, the two methods used were written and oral exams, and the two traits measured were critical thinking and creativity. For the validation of the assessments, 30 secondary-school students in Greece and 21 in England completed the assessments. The samples in both countries provided similar results. The critical thinking tools demonstrated convergent validity when compared with each other and discriminant validity with the creativity assessments. Furthermore, creativity assessments which measure the same aspect of creativity demonstrated convergent validity. To conclude, this research provides indications that critical thinking and creativity as general constructs can be measured in a valid way. However, since the sample was small, further investigation of the validity of the assessment tools with a bigger sample is recommended.
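
    The Campbell and Fiske (1959) matrix itself is just the full correlation matrix over every (trait, method) score column; in the sketch below (with invented scores) convergent validity appears as high same-trait/different-method correlations and discriminant validity as lower cross-trait correlations.

    # Toy multi-trait multi-method (MTMM) matrix for 2 traits x 2 methods.
    import numpy as np

    # Columns: critical thinking (written), creativity (written),
    #          critical thinking (oral),    creativity (oral)
    scores = np.array([
        [14,  9, 13, 10],
        [10, 12, 11, 13],
        [ 8,  7,  9,  6],
        [16, 11, 15, 12],
        [12, 14, 12, 15],
    ], dtype=float)

    labels = ["CT-written", "CR-written", "CT-oral", "CR-oral"]
    mtmm = np.corrcoef(scores, rowvar=False)   # 4x4 correlation matrix
    for label, row in zip(labels, mtmm):
        print(label, [f"{r:+.2f}" for r in row])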

  7. Validation of Cross Sections with Criticality Experiment and Reaction Rates: the Neptunium Case

    CERN Document Server

    Leong, L S; Audouin, L; Berthier, B; Le Naour, C; Stéphan, C; Paradela, C; Tarrío, D; Duran, I

    2014-01-01

    The Np-237 neutron-induced fission cross section has recently been measured over a large energy range (from eV to GeV) at the n_TOF facility at CERN. When compared to previous measurements, the n_TOF fission cross section appears to be higher by 5-7% beyond the fission threshold. To check the relevance of the n_TOF data, we considered a criticality experiment performed at Los Alamos with a 6 kg sphere of Np-237, surrounded by uranium highly enriched in U-235 so as to approach criticality with fast neutrons. The multiplication factor k(eff) of the calculation is in better agreement with the experiment when we replace the ENDF/B-VII.0 evaluation of the Np-237 fission cross section by the n_TOF data. We also explored the hypothesis of deficiencies in the inelastic cross section of U-235, which has been invoked by some authors to explain the deviation of 750 pcm. The large modification needed to reduce the deviation seems to be incompatible with existing inelastic cross section measurements. Also we show that t...

  8. Analysis of the impact of correlated benchmark experiments on the validation of codes for criticality safety analysis

    International Nuclear Information System (INIS)

    Bock, M.; Stuke, M.; Behler, M.

    2013-01-01

    The validation of a code for criticality safety analysis requires the recalculation of benchmark experiments. The selected benchmark experiments are chosen such that they have properties similar to the application case to be assessed. A common source of benchmark experiments is the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' (ICSBEP Handbook) compiled by the 'International Criticality Safety Benchmark Evaluation Project' (ICSBEP). In order to take full advantage of the information provided by the individual benchmark descriptions for the application case, the recommended procedure is to perform an uncertainty analysis based on the experimental uncertainties included in most of the benchmark descriptions. Such analyses can be performed by means of the Monte Carlo sampling technique. The consideration of uncertainties is also being introduced in the supplementary sheet of DIN 25478 'Application of computer codes in the assessment of criticality safety'. However, for a correct treatment of uncertainties, taking into account the individual uncertainties of the benchmark experiments is insufficient; correlations between benchmark experiments must also be handled correctly. For example, such correlations can arise when different cases of a benchmark experiment share the same components, such as fuel pins or fissile solutions. Thus, manufacturing tolerances of these components (e.g. the diameter of the fuel pellets) have to be considered in a consistent manner in all cases of the benchmark experiment. At the 2012 meeting of the Expert Group on 'Uncertainty Analysis for Criticality Safety Assessment' (UACSA) of the OECD/NEA, a benchmark proposal was outlined that aimed at determining the impact of benchmark correlations on the estimation of the computational bias of the neutron multiplication factor (k eff). The analysis presented here is based on this proposal. (orig.)
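
    How a shared component induces correlations between benchmark cases can be shown with a toy Monte Carlo sketch: a single sampled perturbation (say, a pellet-diameter deviation common to both cases) enters two keff values through assumed linear sensitivities. A real analysis would use transport calculations; every number here is invented.

    # Toy model of benchmark correlation from a shared manufacturing tolerance.
    import random
    import statistics

    random.seed(1)
    k1, k2 = [], []
    for _ in range(10000):
        shared = random.gauss(0.0, 1.0)   # same fuel pellet lot in both cases
        k1.append(1.0 + 3.0e-4 * shared + random.gauss(0.0, 2.0e-4))
        k2.append(1.0 + 2.5e-4 * shared + random.gauss(0.0, 2.0e-4))

    r = statistics.correlation(k1, k2)    # Python >= 3.10
    print(f"induced benchmark correlation = {r:.2f}")   # roughly 0.65 here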

  9. The measurement, control, and validation of critical parameters in an electron beam sterilization facility

    International Nuclear Information System (INIS)

    Burns, P.; Drewell, N.H.; McKeown, J.

    1996-01-01

    The delivery and validation of a specified dose to a medical device are key concerns of operators of electron beam irradiation facilities. In an IMPELA-based irradiator, four of the parameters that directly influence the absorbed dose distribution in the product are controllable in real time - the electron energy, average beam current, scanned area, and the product exposure time. The 10 MeV accelerator operates at 50 kW with a stream of 200 μs wide, 100 mA pulses at a repetition rate of 250 Hz. The combination of short-term intra-pulse regulation with long-term pulse-to-pulse stability makes the IMPELA output attractive for the sterilization of medical products. The measurement and closed-loop control techniques used in the IMPELA design will be described with reference to facilitating compliance with medical sterilization standards. (orig.)
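
    The quoted operating point is internally consistent, as a quick arithmetic check of the average beam power shows:

    # Average power = energy (V) x pulse current (A) x duty factor.
    energy_V = 10.0e6    # 10 MeV electrons
    pulse_A  = 0.100     # 100 mA pulse current
    pulse_s  = 200e-6    # 200 microsecond pulse width
    rep_Hz   = 250.0     # repetition rate

    duty = pulse_s * rep_Hz                       # 0.05
    print(energy_V * pulse_A * duty / 1e3, "kW")  # 50.0 kW, as stated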

  10. Computer Games as Virtual Environments for Safety-Critical Software Validation

    Directory of Open Access Journals (Sweden)

    Štefan Korečko

    2017-01-01

    Computer games have become an inseparable part of everyday life in modern society, and the time people spend playing them every day is increasing. This trend has prompted noticeable research activity focused on utilizing the time spent playing in a meaningful way, for example to help solve scientific problems or tasks related to computer systems development. In this paper we present one contribution to this activity: a software system consisting of a modified version of the Open Rails train simulator and an application called TS2JavaConn, which allows separately developed software controllers to be used with the simulator. The system is intended for validation of controllers developed by formal methods. The paper describes the overall architecture of the system and the operation of its components. It also compares the system with other approaches to purposeful utilization of computer games, specifies suitable formal methods and illustrates the system's intended use on an example.

  11. Clinical validity of a normal pulmonary angiogram in patients with suspected pulmonary embolism - A critical review

    International Nuclear Information System (INIS)

    Beek, Edwin J.R. van; Brouwers, Elise M.J.; Song Bin; Stein, Paul D.; Oudkerk, Matthijs

    2001-01-01

    AIM: To determine the validity of a normal pulmonary angiogram in the exclusion of pulmonary embolism (PE), based on the safety of withholding anticoagulant therapy in patients with a normal pulmonary angiogram. MATERIALS AND METHODS: A review of English-language reports published between 1965 and April 1999 was carried out. Eligible articles described prospective studies in patients with suspected PE and a normal pulmonary angiogram, who remained untreated and were followed up for a minimum of 3 months. Articles were evaluated by two authors, using pre-defined criteria for strength of design. End points consisted of fatal and non-fatal recurrent thromboembolic events. A sensitivity analysis was performed by removing one study at a time from the overall results and by comparing pre- and post-1990 publications. RESULTS: Among 1050 patients in the eight articles included in the analysis, recurrent thromboembolic events were described in 18 patients (1.7%; 95% CI: 1.0-2.7%). These were fatal in three patients (0.3%; 95% CI: 0.02-0.7%). The recurrence rate of PE decreased from 2.9% (95% CI: 1.4-6.8%) before 1990 to 1.1% (95% CI: 0.5-2.2%) after 1990. CONCLUSION: It would appear that the ability to exclude PE by angiography has improved over the years, as indicated by the recurrence rate of PE. The low recurrence rate supports the validity of a normal pulmonary angiogram for the exclusion of PE.

  12. Validating criticality calculations for spent fuel with 252Cf-source-driven noise measurements

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Krass, A.W.; Valentine, T.E.

    1992-01-01

    The 252Cf-source-driven noise analysis method can be used for measuring the subcritical neutron multiplication factor k of arrays of spent light water reactor (LWR) fuel. This type of measurement provides a parameter that is directly related to the criticality state of arrays of LWR fuel. Measurements of this parameter can verify the criticality safety margins of spent LWR fuel configurations and thus could be a means of obtaining the information to justify burnup credit for spent LWR transportation/storage casks. The practicality of a measurement depends on the ability to install the hardware required to perform it. Source chambers containing 252Cf at the intensity required for this application have been constructed, have operated successfully for ~10 years, and can be fabricated to fit into the control rod guide tubes of PWR fuel elements. Fission counters developed especially for spent-fuel measurements are available that would allow measurements of a special 3 x 3 spent fuel array and of a typical burnup credit rail cask with spent fuel in unborated water. Adding a moderator around these fission counters would allow measurements with the typical burnup credit rail cask and with the special 3 x 3 array in borated water. The recent work of Ficaro, who modified the KENO Va code to calculate by the Monte Carlo method the time sequences of pulses at two detectors near a fissile assembly arising from the fission chain multiplication process initiated by a 252Cf source in the assembly, allows a direct computer calculation of the noise analysis data from this measurement method.
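
    The full noise-analysis theory that relates measured spectra to k is not reproduced here. The heavily simplified sketch below shows only one signal-processing ingredient, segment-averaged auto- and cross-power spectral densities and the resulting coherence for two detector signals driven by a common source, on purely synthetic data.

    # Segment-averaged spectral densities for two synthetic detector signals.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2 ** 14
    source = rng.normal(size=n)                  # common driving process
    det1 = source + 0.7 * rng.normal(size=n)     # detector 1: signal + noise
    det2 = source + 0.7 * rng.normal(size=n)     # detector 2: signal + noise

    def welch_cpsd(x, y, nseg=32):
        """Cross-power spectral density averaged over nseg segments."""
        seg = len(x) // nseg
        acc = np.zeros(seg // 2 + 1, dtype=complex)
        for i in range(nseg):
            X = np.fft.rfft(x[i * seg:(i + 1) * seg])
            Y = np.fft.rfft(y[i * seg:(i + 1) * seg])
            acc += np.conj(X) * Y
        return acc / nseg

    G11 = welch_cpsd(det1, det1).real
    G22 = welch_cpsd(det2, det2).real
    G12 = welch_cpsd(det1, det2)
    coherence = np.abs(G12) ** 2 / (G11 * G22)
    print("mean coherence:", coherence.mean())   # below 1: uncorrelated noise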

  13. Methodologies for verification and validation of expert systems as a function of component, criticality and life-cycle phase

    International Nuclear Information System (INIS)

    Miller, L.

    1992-01-01

    The review of verification and validation (V and V) methods presented here is based on results of the initial two tasks of a contract with the US Nuclear Regulatory Commission and the Electric Power Research Institute to develop and document guidelines for verifying and validating expert systems. The first task was to review the applicability of conventional software techniques to expert systems; the second was to directly survey V and V practices associated with the development of expert systems. Subsequent tasks will focus on selecting, synthesizing or developing V and V methods appropriate for the overall system, for specific expert system components, and for different phases of the life-cycle. In addition, final guidelines will most likely be developed for each of three levels of expert systems: safety-related (systems whose functions directly relate to system safety, so-called safety-critical systems), important-to-safety (systems which support the critical safety functions), and non-safety (systems which are unrelated to safety functions). For the present purposes of categorizing and discussing the various types of V and V methods, the authors simplify the life-cycle and consider only two aspects: a combined systems development phase and a systems validation phase. The authors identified a number of techniques for the first, combined, phase and two general classes of V and V techniques for the latter phase: static testing techniques, which do not involve execution of the system code, and dynamic testing techniques, which do. In the next two sections the authors review, first, the applicability to expert systems of conventional V and V techniques and, second, the techniques expert system developers actually use. In the last section the authors make some general observations.

  14. Translation, adaptation, and validation of the behavioral pain scale and the critical-care pain observational tools in Taiwan

    Directory of Open Access Journals (Sweden)

    Hsiung NH

    2016-09-01

    This study describes the cultural adaptation and testing of the behavioral pain scale (BPS) and the critical-care pain observation tool (CPOT) for pain assessment in Taiwan. The cross-cultural adaptation followed the steps of translation, including forward translation, back-translation, evaluation of the translations by a committee of experts, adjustments, and then piloting of the prefinal versions of the BPS and the CPOT. A content validity index was used to assess the content validities of the BPS and the CPOT, with 0.80 preset as the acceptable level. The principal investigator made adjustments when the content validity index was <0.80. The pilot test was performed by 2 medical staff with a sample of ten purposively selected patients from a medical care center in Taiwan. The BPS and the CPOT are adequate instruments for the assessment of pain levels in patients who cannot communicate due to sedation and ventilation treatments. Keywords: pain, scales, BPS, CPOT, Taiwan

  15. The validity and reliability of the portuguese versions of three tools used to diagnose delirium in critically ill patients

    Directory of Open Access Journals (Sweden)

    Dimitri Gusmao-Flores

    2011-01-01

    OBJECTIVES: The objectives of this study are to compare the sensitivity and specificity of three diagnostic tools for delirium (the Intensive Care Delirium Screening Checklist, the Confusion Assessment Method for Intensive Care Units and the Confusion Assessment Method for Intensive Care Units Flowsheet) in a mixed population of critically ill patients, and to validate the Brazilian Portuguese Confusion Assessment Method for Intensive Care Units. METHODS: The study was conducted in four intensive care units in Brazil. Patients were screened for delirium by a psychiatrist or neurologist using the Diagnostic and Statistical Manual of Mental Disorders. Patients were subsequently screened by an intensivist using Portuguese translations of the three tools. RESULTS: One hundred and nineteen patients were evaluated and 38.6% were diagnosed with delirium by the reference rater. The Confusion Assessment Method for Intensive Care Units had a sensitivity of 72.5% and a specificity of 96.2%; the Confusion Assessment Method for Intensive Care Units Flowsheet had a sensitivity of 72.5% and a specificity of 96.2%; the Intensive Care Delirium Screening Checklist had a sensitivity of 96.0% and a specificity of 72.4%. There was strong agreement between the Confusion Assessment Method for Intensive Care Units and the Confusion Assessment Method for Intensive Care Units Flowsheet (kappa coefficient = 0.96). CONCLUSION: All three instruments are effective diagnostic tools in critically ill intensive care unit patients. In addition, the Brazilian Portuguese version of the Confusion Assessment Method for Intensive Care Units is a valid and reliable instrument for the assessment of delirium among critically ill patients.

  16. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation.

    Science.gov (United States)

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT) have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT became the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicated that no single test can be considered as the gold standard for the diagnosis of H. pylori infection and that one should consider the method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2-90.8% and 83.3-86.9% and a specificity of 97.7-98.8% and 95.1-97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important although some commercial kits propose universal cut-off values.

  17. Development and validation of advanced oxidation protective coatings for super critical steam power plant

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, M.B.; Scheefer, M. [Alstom Power Ltd., Rugby (United Kingdom); Agueero, A. [Instituto Nacional de Tecnica Aerospacial (INTA) (Spain); Allcock, B. [Monitor Coatings Ltd. (United Kingdom); Norton, B. [Indestructible Paints Ltd. (United Kingdom); Tsipas, D.N. [Aristotle Univ. of Thessaloniki (Greece); Durham, R. [FZ Juelich (Germany); Xiang, Z. [Northumbria Univ. (United Kingdom)

    2006-07-01

    Increasing the efficiency of coal-fired power plant by increasing steam temperatures and pressures brings benefits in terms of cheaper electricity and reduced emissions, particularly CO{sub 2}. In recent years the development of advanced 9%Cr ferritic steels with improved creep strength has enabled power plant operation at temperatures in excess of 600 C, such that these materials are being exploited to construct a new generation of advanced coal-fired plant. However, the move to higher temperatures and pressures creates an extremely hostile oxidising environment. To enable the full potential of the new steels to be achieved, it is vital that protective coatings are developed, validated under high temperature steam and applied to candidate components from the steam path. This paper reviews recent work conducted within the Framework V project 'Coatings for Supercritical Steam Cycles' (SUPERCOAT) to develop and demonstrate advanced slurry and thermal spray coatings capable of providing steam oxidation protection at temperatures in excess of 620 C and up to 300 bar. The programme of work has demonstrated the feasibility of applying a number of candidate coatings to steam turbine power plant components and has generated long-term steam oxidation rate and failure data that underpin the design and application work packages needed to develop and establish this technology for new and retrofit plant. (orig.)

  18. Adobe Creative Suite 4 Bible

    CERN Document Server

    Padova, Ted

    2009-01-01

    As one of the few books to cover integration and workflow issues between Photoshop, Illustrator, InDesign, GoLive, Acrobat, and Version Cue, this comprehensive reference is the one book that Creative Suite users need; Two well-known and respected authors cover topics such as developing consistent color-managed workflows, moving files among the Creative Suite applications, preparing files for print or the Web, repurposing documents, and using the Creative Suite with Microsoft Office documents; More than 1,200 pages are packed with valuable advice and techniques for tackling common everyday issu

  19. Validation of a label dosimeter with regard to dose assurance in critical applications as quarantine control

    International Nuclear Information System (INIS)

    Ehlermann, D.A.E.

    1999-01-01

    A 'label dosimeter' (dose-threshold indicator) for dose ranges of insect disinfestation became commercially available only recently. It was studied for dosimetric (metrological) properties elsewhere. The fundamental problem of its application in practice is the relation between the dose observed at a reference position and the critical minimum dose achieved in a consignment. For this reason several irradiation geometries (relations between the arrangement of the goods during irradiation and the type of the radiation source, gamma, electrons, X-rays) were studied. The observed dose distributions revealed the difficulty that for any such geometry a 'label dosimeter' with a specific but differing threshold dose-value must be utilized in order to guarantee the adherence to the required minimum dose value. The 'label dosimeter' must be placed at a position where the minimum dose is likely to occur. In situations where the position of the minimum dose is not accessible extrapolation from the dose observed at a reference position is less reliable. (author)

  20. Validation of a label dosimeter with regard to dose assurance in critical applications as quarantine control

    Energy Technology Data Exchange (ETDEWEB)

    Ehlermann, D.A.E. [Institute of Process Engineering, Federal Research Centre for Nutrition, Karlsruhe (Germany)]

    1999-03-01

    A 'label dosimeter' (dose-threshold indicator) for dose ranges of insect disinfestation became commercially available only recently. It was studied for dosimetric (metrological) properties elsewhere. The fundamental problem of its application in practice is the relation between the dose observed at a reference position and the critical minimum dose achieved in a consignment. For this reason several irradiation geometries (relations between the arrangement of the goods during irradiation and the type of the radiation source, gamma, electrons, X-rays) were studied. The observed dose distributions revealed the difficulty that for any such geometry a 'label dosimeter' with a specific but differing threshold dose-value must be utilized in order to guarantee the adherence to the required minimum dose value. The 'label dosimeter' must be placed at a position where the minimum dose is likely to occur. In situations where the position of the minimum dose is not accessible, extrapolation from the dose observed at a reference position is less reliable. (author) 6 refs, 5 figs, 1 tab

  1. Validation of the Society of Critical Care Medicine and American Society for Parenteral and Enteral Nutrition Recommendations for Caloric Provision to Critically Ill Obese Patients: A Pilot Study.

    Science.gov (United States)

    Mogensen, Kris M; Andrew, Benjamin Y; Corona, Jasmine C; Robinson, Malcolm K

    2016-07-01

    The Society of Critical Care Medicine (SCCM) and American Society for Parenteral and Enteral Nutrition (ASPEN) recommend that obese, critically ill patients receive 11-14 kcal/kg/d using actual body weight (ABW) or 22-25 kcal/kg/d using ideal body weight (IBW), because feeding these patients 50%-70% of maintenance needs while administering high protein may improve outcomes. It is unknown whether these equations achieve this target when validated against indirect calorimetry, perform equally across all degrees of obesity, or compare well with other equations. Measured resting energy expenditure (MREE) was determined in obese (body mass index [BMI] ≥30 kg/m(2)), critically ill patients. Resting energy expenditure was predicted (PREE) using several equations: 12.5 kcal/kg ABW (ASPEN-Actual BW), 23.5 kcal/kg IBW (ASPEN-Ideal BW), Harris-Benedict (adjusted weight and 1.5 stress factor), and Ireton-Jones for obesity. Correlation of PREE to 65% MREE, predictive accuracy, precision, bias, and large error incidence were calculated. All equations were significantly correlated with 65% MREE but had poor predictive accuracy, had excessive large error incidence, were imprecise, and were biased in the entire cohort (N = 31). In the obesity cohort (n = 20, BMI 30-50 kg/m(2)), ASPEN-Actual BW had acceptable predictive accuracy and large error incidence, was unbiased, and was nearly precise. In super obesity (n = 11, BMI >50 kg/m(2)), ASPEN-Ideal BW had acceptable predictive accuracy and large error incidence and was precise and unbiased. SCCM/ASPEN-recommended body weight equations are reasonable predictors of 65% MREE depending on the equation and degree of obesity. Assuming that feeding 65% MREE is appropriate, this study suggests that patients with a BMI 30-50 kg/m(2) should receive 11-14 kcal/kg/d using ABW and those with a BMI >50 kg/m(2) should receive 22-25 kcal/kg/d using IBW. © 2015 American Society for Parenteral and Enteral Nutrition.
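
    The dosing arithmetic implied by the study's conclusion fits in a few lines. In this sketch the kcal/kg bands are taken from the abstract, while the ideal-body-weight formula (Devine) and all function names are our own assumptions.

    # Caloric target bands per the conclusion above (sketch, not medical advice).
    def ideal_body_weight_kg(height_cm: float, male: bool) -> float:
        inches_over_5ft = max(height_cm / 2.54 - 60.0, 0.0)
        return (50.0 if male else 45.5) + 2.3 * inches_over_5ft  # Devine formula

    def caloric_target_kcal(weight_kg: float, height_cm: float, male: bool):
        bmi = weight_kg / (height_cm / 100.0) ** 2
        if 30.0 <= bmi <= 50.0:
            return 11 * weight_kg, 14 * weight_kg      # actual body weight band
        if bmi > 50.0:
            ibw = ideal_body_weight_kg(height_cm, male)
            return 22 * ibw, 25 * ibw                  # ideal body weight band
        raise ValueError("recommendation addresses BMI >= 30 only")

    print(caloric_target_kcal(120.0, 175.0, male=True))  # BMI ~39 -> ABW band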

  2. Specification for the VERA Depletion Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The CASL neutronics simulator MPACT is under development for coupled neutronics and thermal-hydraulics (T-H) simulation of pressurized water reactors. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. Validating the depletion capability is a challenge because measured data are insufficient. One indirect way to validate it is to perform a code-to-code comparison for benchmark problems. In this study a depletion benchmark suite has been developed, and a detailed guideline is provided for obtaining meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.

  3. EDL Sensor Suite, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical Air Data Systems (OADS) L.L.C. proposes a LIDAR based remote measurement sensor suite capable of satisfying a significant number of the desired sensing...

  4. Satellite Ocean Heat Content Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This collection contains an operational Satellite Ocean Heat Content Suite (SOHCS) product generated by NOAA National Environmental Satellite, Data, and Information...

  5. EVA Suit Microbial Leakage Investigation

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to collect microbial samples from various EVA suits to determine how much microbial contamination is typically released during...

  6. A Critical Review of Validation, Blind Testing, and Real-World Use of Alchemical Protein-Ligand Binding Free Energy Calculations.

    Science.gov (United States)

    Abel, Robert; Wang, Lingle; Mobley, David L; Friesner, Richard A

    2017-01-01

    Protein-ligand binding is among the most fundamental phenomena underlying all molecular biology, and a greater ability to more accurately and robustly predict the binding free energy of a small molecule ligand for its cognate protein is expected to have vast consequences for improving the efficiency of pharmaceutical drug discovery. We briefly reviewed a number of scientific and technical advances that have enabled alchemical free energy calculations to recently emerge as a preferred approach, and critically considered proper validation and effective use of these techniques. In particular, we characterized a selection bias effect which may be important in prospective free energy calculations, and introduced a strategy to improve the accuracy of the free energy predictions. Copyright © Bentham Science Publishers.
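
    The core alchemical estimator can be illustrated independently of any production workflow. Zwanzig's free energy perturbation identity gives dF = -kT ln <exp(-dU/kT)>, averaged over samples from the reference state; the sketch below applies it to synthetic Gaussian samples and is not the authors' method.

    # Zwanzig free energy perturbation on toy data.
    import numpy as np

    kT = 0.593  # kcal/mol near 298 K
    rng = np.random.default_rng(1)
    dU = rng.normal(loc=1.0, scale=0.5, size=100_000)  # toy U_B - U_A samples

    dF = -kT * np.log(np.mean(np.exp(-dU / kT)))
    print(f"estimated dF = {dF:.3f} kcal/mol")
    # For Gaussian dU the exact value is mean - var/(2*kT) = 1.0 - 0.25/1.186.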

  7. Development of a test rig and its application for validation and reliability testing of safety-critical software

    Energy Technology Data Exchange (ETDEWEB)

    Thai, N D; McDonald, A M [Atomic Energy of Canada Ltd., Mississauga, ON (Canada)

    1996-12-31

    This paper describes a versatile test rig developed by AECL for functional testing of safety-critical software used in the process trip computers of the Wolsong CANDU stations. The description covers the hardware and software aspects of the test rig, the test language and its interpreter, and other major testing software utilities such as the test oracle, sampler and profiler. The paper also discusses the application of the rig in the final stages of testing of the process trip computer software, namely validation and reliability tests. It shows how random test cases are generated, test scripts prepared and automatically run on the test rig. The versatility of the rig is further demonstrated in other types of testing such as sub-system tests, verification of the test oracle, testing of newly-developed test script, self-test and calibration. (author). 5 tabs., 10 figs.

  8. Development of a test rig and its application for validation and reliability testing of safety-critical software

    International Nuclear Information System (INIS)

    Thai, N.D.; McDonald, A.M.

    1995-01-01

    This paper describes a versatile test rig developed by AECL for functional testing of safety-critical software used in the process trip computers of the Wolsong CANDU stations. The description covers the hardware and software aspects of the test rig, the test language and its interpreter, and other major testing software utilities such as the test oracle, sampler and profiler. The paper also discusses the application of the rig in the final stages of testing of the process trip computer software, namely validation and reliability tests. It shows how random test cases are generated, test scripts prepared and automatically run on the test rig. The versatility of the rig is further demonstrated in other types of testing such as sub-system tests, verification of the test oracle, testing of newly-developed test script, self-test and calibration. (author). 5 tabs., 10 figs

  9. [Validation of EMINA and EVARUCI scales for assessing the risk of developing pressure ulcers in critical patients].

    Science.gov (United States)

    Roca-Biosca, A; Garcia-Fernandez, F P; Chacon-Garcés, S; Rubio-Rico, L; Olona-Cabases, M; Anguera-Saperas, L; Garcia-Grau, N; Tuset-Garijo, G; de Molina-Fernández, I; Velasco-Guillen, M C

    2015-01-01

    To contribute to the validation of the EMINA and EVARUCI scales for assessing the risk of pressure ulcers in the critical patient and to compare their predictive capacity in this same context. Prospective study from December 2012 until June 2013 in a 14-bed polyvalent intensive care unit of a reference hospital serving two health areas. Patients 18 years of age or older and without pressure ulcers were included. They were followed until development of a pressure ulcer of grade I or greater, medical discharge, death or 30 days. Measurements: presence of ulcers and daily scoring of pressure ulcer risk with the EMINA and EVARUCI scales. The validity of both scales was calculated using sensitivity, specificity, and positive and negative predictive values. The level of significance was P≤0.05. A total of 189 patients were evaluated; 67.2% were male, with a mean age of 59.4 (SD 16.8) years. Fifty-three patients (28%) developed pressure ulcers, an incidence rate of 41 ulcers per 1000 admission days. The mean day of diagnosis was day 7.7 (SD 4.4) and the most frequent site was the sacrum. For the mean of observations, sensitivity and specificity were 94.34 (95% CI 87.17-100) and 33.33 (95% CI 25.01-41.66) for the EMINA scale at a risk score >10, and 92.45 (95% CI 84.40-100) and 42.96 (95% CI 34.24-51.68) for the EVARUCI scale at a risk score >11. No differences were found in the predictive capacity of the two scales. At sensitivities >90%, the scales proved insufficiently specific for detecting pressure ulcer risk in critical patients. Copyright © 2014 Elsevier España, S.L.U. y SEEIUC. All rights reserved.

  10. Detecting acute distress and risk of future psychological morbidity in critically ill patients: validation of the intensive care psychological assessment tool.

    Science.gov (United States)

    Wade, Dorothy M; Hankins, Matthew; Smyth, Deborah A; Rhone, Elijah E; Mythen, Michael G; Howell, David C J; Weinman, John A

    2014-09-24

    The psychological impact of critical illness on a patient can be severe, and frequently results in acute distress as well as psychological morbidity after leaving hospital. A UK guideline states that patients should be assessed in critical care units, both for acute distress and risk of future psychological morbidity; but no suitable method for carrying out this assessment exists. The Intensive care psychological assessment tool (IPAT) was developed as a simple, quick screening tool to be used routinely to detect acute distress, and the risk of future psychological morbidity, in critical care units. A validation study of IPAT was conducted in the critical care unit of a London hospital. Once un-sedated, orientated and alert, critical care patients were assessed with the IPAT and validated tools for distress, to determine the IPAT's concurrent validity. Fifty-six patients took the IPAT again to establish test-retest reliability. Finally, patients completed posttraumatic stress disorder (PTSD), depression and anxiety questionnaires at three months, to determine the predictive validity of the IPAT. One hundred and sixty-six patients completed the IPAT, and 106 completed follow-up questionnaires at 3 months. Scale analysis showed the IPAT to be a reliable 10-item measure of critical care-related psychological distress. Test-retest reliability was good (r = 0.8). There was good concurrent validity with measures of anxiety and depression (r = 0.7), correlation with later psychological morbidity was good (r = 0.4), and the IPAT discriminated risk of future psychological morbidity (AUC = 0.7). The IPAT was found to have good reliability and validity. Sensitivity and specificity analysis suggest the IPAT could provide a way of allowing staff to assess psychological distress among critical care patients after further replication and validation. Further work is also needed to determine its utility in predicting future psychological morbidity.

  11. Development and validation of a prediction model for insulin-associated hypoglycemia in non-critically ill hospitalized adults.

    Science.gov (United States)

    Mathioudakis, Nestoras Nicolas; Everett, Estelle; Routh, Shuvodra; Pronovost, Peter J; Yeh, Hsin-Chieh; Golden, Sherita Hill; Saria, Suchi

    2018-01-01

    To develop and validate a multivariable prediction model for insulin-associated hypoglycemia in non-critically ill hospitalized adults. We collected pharmacologic, demographic, laboratory, and diagnostic data from 128 657 inpatient days in which at least 1 unit of subcutaneous insulin was administered in the absence of intravenous insulin, total parenteral nutrition, or insulin pump use (index days). These data were used to develop multivariable prediction models for biochemical and clinically significant hypoglycemia (blood glucose (BG) of ≤70 mg/dL and <54 mg/dL, respectively), with the index days split into subsets for model development and validation. Using predictors of age, weight, admitting service, insulin doses, mean BG, nadir BG, BG coefficient of variation (CVBG), diet status, type 1 diabetes, type 2 diabetes, acute kidney injury, chronic kidney disease (CKD), liver disease, and digestive disease, our model achieved a c-statistic of 0.77 (95% CI 0.75 to 0.78), positive likelihood ratio (+LR) of 3.5 (95% CI 3.4 to 3.6) and negative likelihood ratio (-LR) of 0.32 (95% CI 0.30 to 0.35) for prediction of biochemical hypoglycemia. Using predictors of sex, weight, insulin doses, mean BG, nadir BG, CVBG, diet status, type 1 diabetes, type 2 diabetes, CKD stage, and steroid use, our model achieved a c-statistic of 0.80 (95% CI 0.78 to 0.82), +LR of 3.8 (95% CI 3.7 to 4.0) and -LR of 0.2 (95% CI 0.2 to 0.3) for prediction of clinically significant hypoglycemia. Hospitalized patients at risk of insulin-associated hypoglycemia can be identified using validated prediction models, which may support the development of real-time preventive interventions.
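
    The likelihood ratios quoted above follow directly from sensitivity and specificity at the chosen probability cutoff. A minimal sketch (the sensitivity and specificity values below are illustrative, not taken from the study):

      def likelihood_ratios(sensitivity, specificity):
          pos_lr = sensitivity / (1.0 - specificity)  # +LR: odds multiplier for a positive prediction
          neg_lr = (1.0 - sensitivity) / specificity  # -LR: odds multiplier for a negative prediction
          return pos_lr, neg_lr

      # e.g. sensitivity 0.80 and specificity 0.77 give +LR ~ 3.5 and -LR ~ 0.26,
      # in the neighborhood of the values reported for the biochemical model
      print(likelihood_ratios(0.80, 0.77))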

  12. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into the complexities of software systems' Fault Management (FM), and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality, has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  13. Implementing Sentinels in the TARGIT BI Suite

    DEFF Research Database (Denmark)

    Middelfart, Morten; Pedersen, Torben Bach

    2011-01-01

    This paper describes the implementation of so-called sentinels in the TARGIT BI Suite. Sentinels are a novel type of rules that can warn a user if one or more measure changes in a multi-dimensional data cube are expected to cause a change to another measure critical to the user. Sentinels notify users … pattern mining or correlation techniques. We demonstrate, through extensive experiments, that mining and usage of sentinels is feasible with good performance for the typical users on a real, operational data warehouse.
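
    A toy illustration of the sentinel idea (this is not TARGIT's implementation; the rule, names and threshold are invented): a sentinel fires when the latest change in a source measure matches a rule that historically preceded a change in a target measure.

      def fires(series, min_change):
          # True if the latest period-over-period change reaches the rule's threshold
          return len(series) >= 2 and abs(series[-1] - series[-2]) >= min_change

      # rule mined offline: "a drop of >= 10 in ad spend precedes a revenue drop"
      ad_spend = [100, 102, 99, 85]
      if fires(ad_spend, 10):
          print("Sentinel warning: expect a change in revenue next period")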

  14. Development of Power Assisting Suit

    Science.gov (United States)

    Yamamoto, Keijiro; Ishii, Mineo; Hyodo, Kazuhito; Yoshimitsu, Toshihiro; Matsuo, Takashi

    In order to realize a wearable power assisting suit for helping a nurse carry a patient in her arms, the power supply and control systems of the suit have to be miniaturized, and it has to be wireless and pipeline-less. The new wearable suit consists of shoulder, arm, back, waist and leg units to be fitted on the nurse's body. The arms, waist and legs have new pneumatic rotary actuators driven directly by micro air pumps supplied by portable Ni-Cd batteries. The muscle forces are sensed by a new muscle hardness sensor utilizing a sensing tip mounted on a force-sensing film device. An embedded microcomputer is used for the calculation of control signals. The new wearable suit was applied practically to a human body, and a series of movement experiments was performed in which weights held in the arms were raised and lowered. Each unit of the suit could transmit assisting torque directly to its joint, verifying the suit's practicability.

  15. The development and validation of the Blended Socratic Method of Teaching (BSMT: An instructional model to enhance critical thinking skills of undergraduate business students

    Directory of Open Access Journals (Sweden)

    Eugenia Arazo Boa

    2018-01-01

    Enhancing critical thinking skills is one of the paramount goals of many educational institutions. This study presents the development and validation of the Blended Socratic Method of Teaching (BSMT), a teaching model intended to foster critical thinking skills of business students at the undergraduate level. The main objectives of the study were (1) to survey the critical thinking skills of undergraduate business students, and (2) to develop and validate the BSMT model designed to enhance critical thinking skills. The research procedure comprised two phases related to the two research objectives: (1) surveying the critical thinking skills of 371 undergraduate business students at Naresuan University International College, focusing on the three critical thinking competencies of the RED model (recognize assumptions, evaluate arguments, and draw conclusions) and determining the level of their critical thinking; and (2) developing the instructional model, followed by validation of the model by five experts. The results of the study were: (1) the undergraduate business students have deficient critical thinking based on the RED model competencies, as they scored "below average" on the critical thinking appraisal; and (2) the developed model comprises six elements: focus, syntax, principles of reaction, the social system, the support system, and application. The experts were in complete agreement that the model is "highly appropriate" for improving the critical thinking skills of the business students. The main essence of the model is the syntax, comprising five steps: group assignment, analysis and writing of case studies; group presentation of the business case analysis in class; Socratic discussion/questioning in class; posting of the case study on the class Facebook account; and online Socratic discussion/questioning. The BSMT model is an authentic and comprehensive model combining the Socratic method of teaching, information and

  16. Validation of multigroup neutron cross sections and calculational methods for the advanced neutron source against the FOEHN critical experiments measurements

    International Nuclear Information System (INIS)

    Smith, L.A.; Gallmeier, F.X.; Gehin, J.C.

    1995-05-01

    The FOEHN critical experiment was analyzed to validate the use of multigroup cross sections and Oak Ridge National Laboratory neutronics computer codes in the design of the Advanced Neutron Source. The ANSL-V 99-group master cross section library was used for all the calculations. Three different critical configurations were evaluated using the multigroup KENO Monte Carlo transport code, the multigroup DORT discrete ordinates transport code, and the multigroup diffusion theory code VENTURE. The simple configuration consists of only the fuel and control elements with the heavy water reflector. The intermediate configuration includes boron endplates at the upper and lower edges of the fuel element. The complex configuration includes both the boron endplates and components in the reflector. Cross sections were processed using modules from the AMPX system. Both 99-group and 20-group cross sections were created and used in two-dimensional models of the FOEHN experiment. KENO calculations were performed using both 99-group and 20-group cross sections. The DORT and VENTURE calculations were performed using 20-group cross sections. Because the simple and intermediate configurations are azimuthally symmetric, these configurations can be explicitly modeled in R-Z geometry. Since the reflector components cannot be modeled explicitly using the current versions of these codes, three reflector component homogenization schemes were developed and evaluated for the complex configuration. Power density distributions were calculated with KENO using 99-group cross sections and with DORT and VENTURE using 20-group cross sections. The average differences between the measured values and the values calculated with the different computer codes range from 2.45 to 5.74%. The maximum differences between the measured and calculated thermal flux values for the simple and intermediate configurations are ∼ 13%, while the average differences are < 8%

  17. Delirium assessment in postoperative patients: Validation of the Portuguese version of the Nursing Delirium Screening Scale in critical care.

    Science.gov (United States)

    Abelha, Fernando; Veiga, Dalila; Norton, Maria; Santos, Cristina; Gaudreau, Jean-David

    2013-01-01

    The aim of this study was to validate the Portuguese version of the Nursing Delirium Screening Scale (Nu-DESC) for use in critical care settings. We simultaneously and independently evaluated all postoperative patients admitted to a surgical Intensive Care Unit (SICU) over a 1-month period for delirium, using the Portuguese versions of both the Nu-DESC and the Intensive Care Delirium Screening Checklist (ICDSC) within 24 hours of admission, with assessments by both the research staff physician and one bedside nurse. We determined the diagnostic accuracy of the Nu-DESC using sensitivity, specificity and ROC curve analyses. We assessed reliability between nurses and the research staff physician for the Nu-DESC by intraclass correlation coefficient (ICC). We assessed agreement and reliability between the Nu-DESC and ICDSC by overall and specific proportions of agreement and by kappa statistics. Based on the ICDSC, we diagnosed delirium in 12 of the 78 patients. Reliability between nurses and the staff physician for total Nu-DESC score was high. Agreement between nurses and staff physician in the delirium diagnosis was perfect. The proportion of overall agreement between the Nu-DESC and ICDSC in the delirium diagnosis was 0.88, and the kappa ranged from 0.79 to 0.93. Nu-DESC sensitivity was 100% and specificity was 86%. The Portuguese version of the Nu-DESC appears to be an accurate and reliable assessment and monitoring instrument for delirium in critical care settings. Copyright © 2013 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.
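
    The agreement figures above reduce to simple 2x2-table arithmetic. A sketch with counts chosen to reproduce the reported sensitivity and specificity (12 delirium cases among 78 patients; the false-positive count is inferred for illustration only):

      def two_by_two(tp, fp, fn, tn):
          n = tp + fp + fn + tn
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          p_obs = (tp + tn) / n  # overall proportion of agreement
          # chance-expected agreement, then Cohen's kappa
          p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
          kappa = (p_obs - p_exp) / (1 - p_exp)
          return sens, spec, p_obs, kappa

      print(two_by_two(tp=12, fp=9, fn=0, tn=57))  # sens 1.00, spec ~0.86, agreement ~0.88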

  18. The ZPIC educational code suite

    Science.gov (United States)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, including fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter-space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.
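
    For readers new to the method, the core PIC loop (deposit charge, solve for the field, gather it back to the particles, push) fits in a few lines of NumPy. This is a bare-bones 1D electrostatic sketch in normalized units, not ZPIC's actual API:

      import numpy as np

      ng, n_part, L, dt, steps = 64, 10000, 2 * np.pi, 0.1, 100
      dx = L / ng
      rng = np.random.default_rng(0)
      x = rng.uniform(0, L, n_part)            # particle positions
      v = 0.1 * np.sin(2 * np.pi * x / L)      # seed a plasma oscillation
      q = -L / n_part                          # electron macro-charge; uniform ion background

      k = 2 * np.pi * np.fft.rfftfreq(ng, dx)  # angular wavenumbers
      for _ in range(steps):
          idx = (x / dx).astype(int) % ng
          rho = np.bincount(idx, minlength=ng) * q / dx + 1.0  # nearest-grid-point deposit
          rho_k = np.fft.rfft(rho)
          E_k = np.zeros_like(rho_k)
          E_k[1:] = rho_k[1:] / (1j * k[1:])   # Gauss's law in Fourier space: ik E_k = rho_k
          E = np.fft.irfft(E_k, ng)
          v += -E[idx] * dt                    # gather field at particles and push (q/m = -1)
          x = (x + v * dt) % L                 # advance with periodic boundaries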

  19. Validation and Verification (V&V) of Safety-Critical Systems Operating Under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2012-01-01

    Loss of control (LOC) remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft LOC accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. Research is underway at the National Aeronautics and Space Administration (NASA) in the development of advanced onboard system technologies for preventing or recovering from loss of vehicle control and for assuring safe operation under off-nominal conditions associated with aircraft LOC accidents. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses highly significant technical challenges and is the subject of a parallel research effort at NASA. This chapter summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft LOC accidents. A summary of recent research accomplishments in this effort is referenced.

  20. Fatigue in chronically critically ill patients following intensive care - reliability and validity of the multidimensional fatigue inventory (MFI-20

    Directory of Open Access Journals (Sweden)

    Gloria-Beatrice Wintermann

    2018-02-01

    Background: Fatigue often occurs as a long-term complication in chronically critically ill (CCI) patients after prolonged intensive care treatment. The Multidimensional Fatigue Inventory (MFI-20) has been established as a valid instrument to measure fatigue in a wide range of medical illnesses. Regarding the measurement of fatigue in CCI patients, the psychometric properties of the MFI-20 have not been investigated so far. Thus, the present study examines the reliability and validity of the MFI-20 in CCI patients. Methods: A convenience sample of n = 195 patients with Critical Illness Polyneuropathy (CIP) or Myopathy (CIM) was recruited via personal contact within four weeks (t1) following the transfer from acute-care ICU to post-acute ICU at a large rehabilitation hospital. N = 113 patients (median age 61.1 yrs., 72.6% men) were again contacted via telephone three (t2) and six (t3) months following the transfer to the post-acute ICU. The MFI-20, the Euro-Quality of Life (EQ-5D-3L) and the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders DSM-IV (SCID-I) were applied within this prospective cohort study. Results: The internal consistency Cronbach's α was adequate for the MFI-total and all subscales (range: .50–.91) except Reduced Motivation (RM). Item-to-total correlations (range: .22–.80) indicated item redundancy for the subscale RM. Confirmatory factor analyses (CFAs) revealed poor model fit for the original 5-factor model of the MFI-20 (t2/t3: Comparative Fit Index, CFI = .783/.834; Tucker-Lewis Index, TLI = .751/.809; Root Mean Square Error of Approximation, RMSEA = .112/.103). Among the alternative models (1-, 2-, 3-factor models), the data best fit a 3-factor solution summarizing the highly correlated factors General/Physical Fatigue/Reduced Activity (GF/PF/RA) (t2/t3: CFI = .878/.896, TLI = .846/.869, RMSEA = .089/.085, 90% confidence interval .073–.104

  1. Virtual reality simulation training in a high-fidelity procedure suite

    DEFF Research Database (Denmark)

    Lönn, Lars; Edmond, John J; Marco, Jean

    2012-01-01

    To assess the face and content validity of a novel, full physics, full procedural, virtual reality simulation housed in a hybrid procedure suite.

  2. Validation of KENO V.a. and two cross-section libraries for criticality calculations of low-enriched uranium systems

    International Nuclear Information System (INIS)

    Easter, M.E.

    1985-07-01

    The SCALE code system, utilizing the Monte Carlo computer code KENO V.a, was employed to calculate 37 critical experiments. The critical assemblies had 235U enrichments of 5% or less and covered a variety of geometries and materials. Values of k-eff were calculated using two different cross-section libraries. The 16-energy-group Hansen-Roach and the 27-energy-group ENDF/B-IV cross-section libraries, available in SCALE, were used in this validation study, and both give good results for the experiments considered. It is concluded that the code and cross sections are adequate for low-enriched uranium systems and that reliable criticality safety calculations can be made for such systems provided the limits of validated applicability are not exceeded.
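
    Validation studies of this kind typically summarize the calculated-minus-benchmark differences as a bias and an uncertainty. A hedged sketch of that bookkeeping (the k-eff values below are fabricated for illustration, not the report's 37 results):

      import numpy as np

      k_calc = np.array([0.9952, 1.0021, 0.9987, 1.0008])  # code results
      bias = k_calc - 1.0     # benchmark k-eff is 1.0 for critical assemblies
      print("mean bias:", bias.mean())
      print("std dev  :", bias.std(ddof=1))
      # an upper subcritical limit would subtract the bias and its uncertainty,
      # with appropriate statistical margins, from 1.0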

  3. Validation of the LOD score compared with APACHE II score in prediction of the hospital outcome in critically ill patients.

    Science.gov (United States)

    Khwannimit, Bodin

    2008-01-01

    The Logistic Organ Dysfunction score (LOD) is an organ dysfunction score that can predict hospital mortality. The aim of this study was to validate the performance of the LOD score compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score in a mixed intensive care unit (ICU) at a tertiary referral university hospital in Thailand. The data were collected prospectively on consecutive ICU admissions over a 24-month period from July 1, 2004 until June 30, 2006. Discrimination was evaluated by the area under the receiver operating characteristic curve (AUROC). Calibration was assessed by the Hosmer-Lemeshow goodness-of-fit H statistic. The overall fit of each model was evaluated by the Brier score. Overall, 1,429 patients were enrolled during the study period. Mortality in the ICU was 20.9% and in the hospital was 27.9%. The median ICU and hospital lengths of stay were 3 and 18 days, respectively, for all patients. Both models showed excellent discrimination. The AUROC for the LOD and APACHE II were 0.860 [95% confidence interval (CI) = 0.838-0.882] and 0.898 (95% CI = 0.879-0.917), respectively. The LOD score had perfect calibration, with a Hosmer-Lemeshow goodness-of-fit H chi-square of 10 (p = 0.44). However, the APACHE II had poor calibration, with a Hosmer-Lemeshow goodness-of-fit H chi-square of 75.69 (p < 0.001). Brier scores showed that the overall fit was 0.123 (95% CI = 0.107-0.141) for the LOD and 0.114 (95% CI = 0.098-0.132) for the APACHE II. Thus, the LOD score was found to be accurate for predicting hospital mortality in general critically ill patients in Thailand.
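
    The three performance measures reported above can be reproduced on toy data as follows (the outcomes and predicted risks are fabricated; the Hosmer-Lemeshow statistic is computed over deciles of predicted risk):

      import numpy as np
      from scipy.stats import chi2
      from sklearn.metrics import brier_score_loss, roc_auc_score

      rng = np.random.default_rng(1)
      y = np.tile([0, 0, 1, 0, 1, 1, 0, 1, 0, 0], 20)                  # observed deaths
      p = np.clip(0.6 * y + rng.normal(0.3, 0.2, y.size), 0.01, 0.99)  # predicted risks

      print("AUROC:", roc_auc_score(y, p))     # discrimination
      print("Brier:", brier_score_loss(y, p))  # overall fit

      g = np.digitize(p, np.quantile(p, np.linspace(0.1, 0.9, 9)))  # risk deciles
      obs = np.array([y[g == i].sum() for i in range(10)])
      exp = np.array([p[g == i].sum() for i in range(10)])
      n_g = np.array([(g == i).sum() for i in range(10)])
      hl = (((obs - exp) ** 2) / (exp * (1 - exp / n_g))).sum()     # calibration
      print("H-L chi-square:", hl, "p:", 1 - chi2.cdf(hl, df=8))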

  4. The validation of science virtual test to assess 7th grade students’ critical thinking on matter and heat topic (SVT-MH)

    Science.gov (United States)

    Sya’bandari, Y.; Firman, H.; Rusyati, L.

    2018-05-01

    This research used a descriptive method to profile the validation of the SVT-MH for measuring students' critical thinking on the matter and heat topic in junior high school. The subjects were 7th-grade junior high school students (13 years old), with science teachers and experts serving as validators. The instruments used to obtain the data were expert-judgment rubrics (content, media, education) and a readability-test rubric. There were four steps to validate the SVT-MH in 7th grade: analysis of core competences and basic competences based on Curriculum 2013, expert judgment (content, media, education), a readability test, and trial tests (limited and larger trial tests). The instrument validation resulted in 30 items representing 8 elements and 21 sub-elements for measuring students' critical thinking, based on Inch's framework, on the matter and heat topic. Cronbach's alpha (α) is 0.642, which means the instrument is sufficient to measure students' critical thinking on the matter and heat topic.
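
    Cronbach's alpha, as reported above, can be computed from the item-score matrix in a few lines (the responses below are random placeholders, not the study's data):

      import numpy as np

      def cronbach_alpha(items):
          # items: respondents x items matrix of scores
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      # 60 respondents x 30 dichotomous items; random data, so alpha will be
      # near zero here, whereas coherent item sets push it towards 1
      scores = np.random.default_rng(7).integers(0, 2, size=(60, 30))
      print(cronbach_alpha(scores))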

  5. Validation of the Monte Carlo criticality program KENO IV and the Hansen-Roach sixteen-energy-group-cross sections for high-assay uranium systems

    International Nuclear Information System (INIS)

    Handley, G.R.; Masters, L.C.; Stachowiak, R.V.

    1981-01-01

    Validation of the Monte Carlo criticality code KENO IV and the Hansen-Roach sixteen-energy-group cross sections was accomplished by calculating the effective neutron multiplication constant, k-eff, of 29 experimentally critical assemblies which had uranium enrichments of 92.6% or higher in the uranium-235 isotope. The experiments were chosen so that a large variety of geometries and of neutron energy spectra were covered. Problems in calculating the k-eff of minimally reflected or unreflected systems containing high-uranium-concentration uranyl nitrate solution resulted in the separate examination of five cases.

  6. Talking Back to the Media Ideal: The Development and Validation of the Critical Processing of Beauty Images Scale

    Science.gov (United States)

    Engeln-Maddox, Renee; Miller, Steven A.

    2008-01-01

    This article details the development of the Critical Processing of Beauty Images Scale (CPBI) and studies demonstrating the psychometric soundness of this measure. The CPBI measures women's tendency to engage in critical processing of media images featuring idealized female beauty. Three subscales were identified using exploratory factor analysis…

  7. Safety in the use of pressurized suits

    International Nuclear Information System (INIS)

    1984-01-01

    This Code of Practice describes the procedures relating to the safe operation of Pressurized Suit Areas and their supporting services. It is directed at personnel responsible for the design and/or operation of Pressurized Suit Areas. (author)

  8. The process of processing: exploring the validity of Neisser's perceptual cycle model with accounts from critical decision-making in the cockpit.

    Science.gov (United States)

    Plant, Katherine L; Stanton, Neville A

    2015-01-01

    The perceptual cycle model (PCM) has been widely applied in ergonomics research in domains including road, rail and aviation. The PCM assumes that information processing occurs in a cyclical manner drawing on top-down and bottom-up influences to produce perceptual exploration and actions. However, the validity of the model has not been addressed. This paper explores the construct validity of the PCM in the context of aeronautical decision-making. The critical decision method was used to interview 20 helicopter pilots about critical decision-making. The data were qualitatively analysed using an established coding scheme, and composite PCMs for incident phases were constructed. It was found that the PCM provided a mutually exclusive and exhaustive classification of the information-processing cycles for dealing with critical incidents. However, a counter-cycle was also discovered which has been attributed to skill-based behaviour, characteristic of experts. The practical applications and future research questions are discussed. Practitioner Summary: This paper explores whether information processing, when dealing with critical incidents, occurs in the manner anticipated by the perceptual cycle model. In addition to the traditional processing cycle, a reciprocal counter-cycle was found. This research can be utilised by those who use the model as an accident analysis framework.

  9. Complementary role of critical integral experiment and power reactor start-up experiments for LMFBR neutronics data and method validation

    International Nuclear Information System (INIS)

    Salvatores, M.

    1986-09-01

    Both critical experiments and power reactor results at present play complementary roles in reducing the uncertainties in key design parameters for LMFBRs, which can be relevant to the economic performance of this type of reactor.

  10. The Air Force Critical Care Air Transport Team (CCATT): Using the Estimating Supplies Program (ESP) to Validate Clinical Requirements

    Science.gov (United States)

    2005-04-05

    Disease, Severe. 0249 Peptic Ulcer, Gastric or Duodenal, Penetrating and/or Perforating. 0250 Peptic Ulcer, Gastric or Duodenal, Uncomplicated. 0251...in US Air Force (USAF) Allowance Standard (AS) development and management. The Critical Care Air Transport Team (CCATT) Unit Type Code (UTC) AS was...tasks enabling the management of the critically ill or injured en route to the appropriate level of care (LOC) or medical treatment facility (MTF

  11. Z-2 Suit Support Stand and MKIII Suit Center of Gravity Test

    Science.gov (United States)

    Nguyen, Tuan Q.

    2014-01-01

    NASA's next generation spacesuits are the Z-Series suits, made for a range of possible exploration missions in the near future. The prototype Z-1 suit has been developed and assembled to incorporate new technologies that have never been utilized in the Apollo suits or the Extravehicular Mobility Unit (EMU). NASA engineers tested the Z-1 suit extensively in order to develop design requirements for the new Z-2 suit. At the end of 2014, NASA will receive the new Z-2 suit to perform more testing and to further develop the suit's new technologies. To do so, a suit support stand will be designed and fabricated to support the Z-2 suit during maintenance, sizing, and structural leakage testing. The Z-2 Suit Support Stand (Z2SSS) will be utilized for these purposes in the early testing stages of the Z-2 suit.

  12. Are chiropractic tests for the lumbo-pelvic spine reliable and valid? A systematic critical literature review

    DEFF Research Database (Denmark)

    Hestbaek, L; Leboeuf-Yde, C

    2000-01-01

    OBJECTIVE: To systematically review the peer-reviewed literature about the reliability and validity of chiropractic tests used to determine the need for spinal manipulative therapy of the lumbo-pelvic spine, taking into account the quality of the studies. DATA SOURCES: The CHIROLARS database......-pelvic spine were included. DATA EXTRACTION: Data quality was assessed independently by the two reviewers, with a quality score based on predefined methodologic criteria. Results of the studies were then evaluated in relation to quality. DATA SYNTHESIS: None of the tests studied had been sufficiently...... evaluated in relation to reliability and validity. Only tests for palpation for pain had consistently acceptable results. Motion palpation of the lumbar spine might be valid but showed poor reliability, whereas motion palpation of the sacroiliac joints seemed to be slightly reliable but was not shown

  13. The Internal, External, and Diagnostic Validity of Sluggish Cognitive Tempo: A Meta-Analysis and Critical Review

    Science.gov (United States)

    Becker, Stephen P.; Leopold, Daniel R.; Burns, G. Leonard; Jarrett, Matthew A.; Langberg, Joshua M.; Marshall, Stephen A.; McBurnett, Keith; Waschbusch, Daniel A.; Willcutt, Erik G.

    2015-01-01

    Objective: To conduct the first meta-analysis evaluating the internal and external validity of the sluggish cognitive tempo (SCT) construct as related to or distinct from attention-deficit/hyperactivity disorder (ADHD) and as associated with functional impairment and neuropsychological functioning. Method: Electronic databases were searched through September 2015 for studies examining the factor structure and/or correlates of SCT in children or adults. The search procedures identified 73 papers. The core SCT behaviors included across studies, as well as factor loadings and reliability estimates, were reviewed to evaluate internal validity. Pooled correlation effect sizes using random effects models were used to evaluate SCT in relation to external validity domains (i.e., demographics, other psychopathologies, functional impairment, and neuropsychological functioning). Results: Strong support was found for the internal validity of the SCT construct. Specifically, across factor analytic studies including over 19,000 individuals, 13 SCT items loaded consistently on an SCT factor as opposed to an ADHD factor. Findings also support the reliability (i.e., internal consistency, test-retest reliability, inter-rater reliability) of SCT. In terms of external validity, there is some indication that SCT may increase with age (r = 0.11) and be associated with lower socioeconomic status (r = 0.10). Modest (potentially negligible) support was found for SCT symptoms being higher in males than females in children (r = 0.05) but not adults. SCT is more strongly associated with ADHD inattention (r = 0.63 in children, r = 0.72 in adults) than with ADHD hyperactivity-impulsivity (r = 0.32 in children, r = 0.46 in adults), and it likewise appears that SCT is more strongly associated with internalizing symptoms than with externalizing symptoms. SCT is associated with significant global, social, and academic impairment (rs = 0.38–0.44). Effects for neuropsychological functioning are mixed
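
    Pooling correlation effect sizes, as the meta-analysis above does, is commonly carried out on Fisher z-transformed values. A fixed-effect sketch for illustration (a random-effects model, as used in the study, would add a between-study variance tau-squared to the weights; the r and n values below are invented):

      import numpy as np

      r = np.array([0.58, 0.66, 0.70, 0.61])  # per-study correlations
      n = np.array([250, 410, 180, 520])      # per-study sample sizes

      z = np.arctanh(r)                       # Fisher z-transform
      w = n - 3                               # inverse-variance weights, Var(z) = 1/(n - 3)
      z_pool = (w * z).sum() / w.sum()
      print("pooled r:", np.tanh(z_pool))     # back-transform to the r scale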

  14. External Validation and Recalibration of Risk Prediction Models for Acute Traumatic Brain Injury among Critically Ill Adult Patients in the United Kingdom.

    Science.gov (United States)

    Harrison, David A; Griggs, Kathryn A; Prabhu, Gita; Gomes, Manuel; Lecky, Fiona E; Hutchinson, Peter J A; Menon, David K; Rowan, Kathryn M

    2015-10-01

    This study validates risk prediction models for acute traumatic brain injury (TBI) in critical care units in the United Kingdom and recalibrates the models to this population. The Risk Adjustment In Neurocritical care (RAIN) Study was a prospective, observational cohort study in 67 adult critical care units. Adult patients admitted to critical care following acute TBI with a last pre-sedation Glasgow Coma Scale score of less than 15 were recruited. The primary outcomes were mortality and unfavorable outcome (death or severe disability, assessed using the Extended Glasgow Outcome Scale) at six months following TBI. Of 3626 critical care unit admissions, 2975 were analyzed. Following imputation of missing outcomes, mortality at six months was 25.7% and unfavorable outcome 57.4%. Ten risk prediction models were validated from Hukkelhoven and colleagues, the Medical Research Council (MRC) Corticosteroid Randomisation After Significant Head Injury (CRASH) Trial Collaborators, and the International Mission for Prognosis and Analysis of Clinical Trials in TBI (IMPACT) group. The model with the best discrimination was the IMPACT "Lab" model (C index, 0.779 for mortality and 0.713 for unfavorable outcome). This model was well calibrated for mortality at six months but substantially under-predicted the risk of unfavorable outcome. Recalibration of the models resulted in small improvements in discrimination and excellent calibration for all models. The risk prediction models demonstrated sufficient statistical performance to support their use in research and audit but fell below the level required to guide individual patient decision-making. The published models for unfavorable outcome at six months had poor calibration in the UK critical care setting and the models recalibrated to this setting should be used in future research.

  15. Vehicle-network defensive aids suite

    Science.gov (United States)

    Rapanotti, John

    2005-05-01

    Defensive Aids Suites (DAS) developed for vehicles can be extended to the vehicle network level. The vehicle network, typically comprising four platoon vehicles, will benefit from improved communications and automation based on low latency response to threats from a flexible, dynamic, self-healing network environment. Improved DAS performance and reliability rely on four complementary sensor technologies including: acoustics, visible and infrared optics, laser detection and radar. Long-range passive threat detection and avoidance is based on dual-purpose optics, primarily designed for manoeuvring, targeting and surveillance, combined with dazzling, obscuration and countermanoeuvres. Short-range active armour is based on search and track radar and intercepting grenades to defeat the threat. Acoustic threat detection increases the overall robustness of the DAS and extends the detection range to include small calibers. Finally, detection of active targeting systems is carried out with laser and radar warning receivers. Synthetic scene generation will provide the integrated environment needed to investigate, develop and validate these new capabilities. Computer generated imagery, based on validated models and an acceptable set of benchmark vignettes, can be used to investigate and develop fieldable sensors driven by real-time algorithms and countermeasure strategies. The synthetic scene environment will be suitable for sensor and countermeasure development in hardware-in-the-loop simulation. The research effort focuses on two key technical areas: a) computing aspects of the synthetic scene generation and b) development of adapted models and databases. OneSAF is being developed for research and development, in addition to the original requirement of Simulation and Modelling for Acquisition, Rehearsal, Requirements and Training (SMARRT), and is becoming useful as a means for transferring technology to other users, researchers and contractors. This procedure

  16. Acute Kidney Injury in Trauma Patients Admitted to Critical Care: Development and Validation of a Diagnostic Prediction Model.

    Science.gov (United States)

    Haines, Ryan W; Lin, Shih-Pin; Hewson, Russell; Kirwan, Christopher J; Torrance, Hew D; O'Dwyer, Michael J; West, Anita; Brohi, Karim; Pearse, Rupert M; Zolfaghari, Parjam; Prowle, John R

    2018-02-26

    Acute Kidney Injury (AKI) complicating major trauma is associated with increased mortality and morbidity. Traumatic AKI has specific risk factors and a predictable time-course, facilitating diagnostic modelling. In a single-centre, retrospective observational study we developed risk prediction models for AKI after trauma based on data available around intensive care admission. Models predicting AKI were developed using data from 830 patients, using data reduction followed by logistic regression, and were independently validated in a further 564 patients. AKI occurred in 163/830 (19.6%), with 42 (5.1%) receiving renal replacement therapy (RRT). First serum creatinine and phosphate, units of blood transfused in the first 24 h, age and Charlson score discriminated need for RRT and AKI early after trauma. For RRT, c-statistics were good to excellent: development 0.92 (0.88-0.96), validation 0.91 (0.86-0.97). Modelling AKI stage 2-3, c-statistics were also good: development 0.81 (0.75-0.88) and validation 0.83 (0.74-0.92). The model predicting AKI stage 1-3 performed moderately: development c-statistic 0.77 (0.72-0.81), validation 0.70 (0.64-0.77). Despite good discrimination of the need for RRT, the positive predictive value (PPV) at the optimal cut-off was only 23.0% (13.7-42.7) in development. However, the PPV for the alternative endpoint of RRT and/or death improved to 41.2% (34.8-48.1), highlighting death as a clinically relevant endpoint to RRT.

  17. The BRITNeY Suite Animation Tool

    DEFF Research Database (Denmark)

    Westergaard, Michael; Lassen, Kristian Bisgaard

    2006-01-01

    This paper describes the BRITNeY suite, a tool which enables users to create visualizations of formal models. The BRITNeY suite is integrated with CPN Tools, and we give an example of how to extend a simple stop-and-wait protocol with a visualization in the form of message sequence charts. We also show examples of animations created during industrial projects to give an impression of what is possible with the BRITNeY suite.

  18. Factorial Validity of the Toronto Alexithymia Scale (TAS-20) in Clinical Samples: A Critical Examination of the Literature and a Psychometric Study in Anorexia Nervosa.

    Science.gov (United States)

    Torres, Sandra; Guerra, Marina P; Miller, Kylee; Costa, Patrício; Cruz, Inês; Vieira, Filipa M; Brandão, Isabel; Roma-Torres, António; Rocha, Magda

    2018-03-30

    There is extensive use of the 20-item Toronto Alexithymia Scale (TAS-20) in research and clinical practice in anorexia nervosa (AN), though it is not empirically established in this population. This study aims to examine the factorial validity of the TAS-20 in a Portuguese AN sample (N = 125), testing four different models (ranging from 1 to 4 factors) that were identified in critical examination of existing factor analytic studies. Results of confirmatory factor analysis (CFA) suggested that the three-factor solution, measuring difficulty identifying (DIF) and describing feelings (DDF), and externally oriented thinking (EOT), was the best fitting model. The quality of measurement improves if two EOT items (16 and 18) are eliminated. Internal consistency of EOT was low and decreased with age. The results provide support for the factorial validity of the TAS-20 in AN. Nevertheless, the measurement of EOT requires some caution and may be problematic in AN adolescents.

  19. HPC Benchmark Suite NMx, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  20. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  1. Validation study of the reactor physics lattice transport code WIMSD-5B by TRX and BAPL critical experiments of light water reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Alam, A.B.M.K.; Ahsan, M.H.; Mamun, K.A.A.; Islam, S.M.A.

    2015-01-01

    Highlights: • To validate the reactor physics lattice code WIMSD-5B by this analysis. • To model TRX and BAPL critical experiments using WIMSD-5B. • To compare the calculated results with experiment and MCNP results. • To rely on WIMSD-5B code for TRIGA calculations. - Abstract: The aim of this analysis is to validate the reactor physics lattice transport code WIMSD-5B by TRX (thermal reactor-one region lattice) and BAPL (Bettis Atomic Power Laboratory-one region lattice) critical experiments of light water reactors for neutronics analysis of the 3 MW TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh. This analysis is achieved through the analysis of integral parameters of five light water reactor critical experiments TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 based on evaluated nuclear data libraries JEFF-3.1 and ENDF/B-VII.1. In integral measurements, these experiments are considered as standard benchmark lattices for validating the reactor physics lattice transport code WIMSD-5B as well as evaluated nuclear data libraries. The integral parameters of the said critical experiments are calculated using the reactor physics lattice transport code WIMSD-5B. The calculated integral parameters are compared to the measured values as well as the earlier published MCNP results based on the Chinese evaluated nuclear data library CENDL-3.0 for assessment of deterministic calculation. It was found that the calculated integral parameters give mostly reasonable and globally consistent results with the experiment and the MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 between the two data files have been compared using the WIMS library utility code WILLIE, and it was found that the group constants are well consistent with each other. Therefore, this analysis reveals the validation study of the reactor physics lattice transport code WIMSD-5B based on JEFF-3.1 and ENDF/B-VII.1 libraries and can also be essential to

  2. Development and validation of a critical gradient energetic particle driven Alfven eigenmode transport model for DIII-D tilted neutral beam experiments

    Science.gov (United States)

    Waltz, R. E.; Bass, E. M.; Heidbrink, W. W.; VanZeeland, M. A.

    2015-11-01

    Recent experiments with the DIII-D tilted neutral beam injection (NBI) varying the beam energetic particle (EP) source profiles have provided strong evidence that unstable Alfven eigenmodes (AE) drive stiff EP transport at a critical EP density gradient [Heidbrink et al 2013 Nucl. Fusion 53 093006]. Here the critical gradient is identified by the local AE growth rate being equal to the local ITG/TEM growth rate at the same low toroidal mode number. The growth rates are taken from the gyrokinetic code GYRO. Simulations show that the slowing-down beam-like EP distribution has a slightly lower critical gradient than the Maxwellian. The ALPHA EP density transport code [Waltz and Bass 2014 Nucl. Fusion 54 104006], used to validate the model, combines the low-n stiff EP critical density gradient AE mid-core transport with the Angioni et al (2009 Nucl. Fusion 49 055013) energy-independent high-n ITG/TEM density transport model controlling the central core EP density profile. For the on-axis NBI heated DIII-D shot 146102, while the net loss to the edge is small, about half the birth fast ions are transported from the central core r/a < 0.5 and the central density is about half the slowing-down density. These results are in good agreement with experimental fast ion pressure profiles inferred from MSE-constrained EFIT equilibria.
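
    The criterion above (local AE growth rate equal to the local ITG/TEM growth rate) frames the critical gradient as a root-finding problem. A toy sketch with made-up growth-rate models, not GYRO output:

      from scipy.optimize import brentq

      def gamma_ae(grad_ep):
          # AE drive rising with the EP density gradient (illustrative form)
          return 0.05 * (grad_ep - 0.8)

      gamma_itg = 0.02  # background ITG/TEM growth rate at the same mode number

      # critical gradient: gamma_ae(grad) = gamma_itg
      grad_crit = brentq(lambda g: gamma_ae(g) - gamma_itg, 0.0, 10.0)
      print("critical EP density gradient (arbitrary units):", grad_crit)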

  3. Higher Order Thinking in the Australian Army Suite of Logistic Officer Courses

    National Research Council Canada - National Science Library

    Bradford, Scott R

    2006-01-01

    .... The current Suite of Logistic Officer Courses (SOLOC) has been recently criticized for failing to meet this requirement, with the general perception that there is a distinct lack of higher-order thinking competencies within this continuum...

  4. Aircraft Loss-of-Control: Analysis and Requirements for Future Safety-Critical Systems and Their Validation

    Science.gov (United States)

    Belcastro, Christine M.

    2011-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex, resulting from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. This paper summarizes recent analysis results in identifying worst-case combinations of loss-of-control accident precursors and their time sequences, a holistic approach to preventing loss-of-control accidents in the future, and key requirements for validating the associated technologies.

  5. Structural biomechanics of the craniomaxillofacial skeleton under maximal masticatory loading: Inferences and critical analysis based on a validated computational model.

    Science.gov (United States)

    Pakdel, Amir R; Whyne, Cari M; Fialkov, Jeffrey A

    2017-06-01

    The trend towards optimizing stabilization of the craniomaxillofacial skeleton (CMFS) with the minimum amount of fixation required to achieve union, and away from maximizing rigidity, requires a quantitative understanding of craniomaxillofacial biomechanics. This study uses computational modeling to quantify the structural biomechanics of the CMFS under maximal physiologic masticatory loading. Using an experimentally validated subject-specific finite element (FE) model of the CMFS, the patterns of stress and strain distribution as a result of physiological masticatory loading were calculated. The trajectories of the stresses were plotted to delineate compressive and tensile regimes over the entire CMFS volume. The lateral maxilla was found to be the primary vertical buttress under maximal bite force loading, with much smaller involvement of the naso-maxillary buttress. There was no evidence that the pterygo-maxillary region is a buttressing structure, counter to classical buttress theory. The stresses at the zygomatic sutures suggest that two-point fixation of zygomatic complex fractures may be sufficient for fixation under bite force loading. The current experimentally validated biomechanical FE model of the CMFS is a practical tool for in silico optimization of current practice techniques and may be used as a foundation for the development of design criteria for future technologies for the treatment of CMFS injury and disease. Copyright © 2017 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. Sibelius. Karelia Suite, Op. 11 / Robert Layton

    Index Scriptorium Estoniae

    Layton, Robert

    1996-01-01

    On the new recording "Sibelius. Karelia Suite, Op. 11. Luonnotar, Op. 70a. Andante festivo. The Oceanides, Op. 73. King Christian II, Op. 27-Suite. Finlandia, Op. 26a. Gothenburg Symphony Orchestra, Neeme Järvi" DG 447 760-2GH (72 minutes: DDD)

  7. Exploring the concept of "caring cultures": A critical examination of the conceptual, methodological and validity issues with the "caring cultures" construct.

    Science.gov (United States)

    Gillin, Nicola; Taylor, Ruth; Walker, Susan

    2017-12-01

    To critically examine the conceptual, methodological and validity issues with the "caring cultures" construct. Post the Francis Report, "caring cultures" and alternative terminology such as "culture/s of care/caring/compassionate care" have gained prominence in the literature, especially within a UK policy context. However, in order to understand the value these "caring cultures" hold in terms of clinical practice, the concept itself first needs to be understood. A discussion and critical examination of the concept of "caring cultures" and associated terminology. Grey literature, database, library and reference list searches were conducted. It is implied that "caring cultures" influence patient care. However, evidence which verifies this assertion is limited. In this article, the concept of "caring cultures" is deconstructed and its validity explored. An alternative to "caring cultures" is proposed in terms of research, whereby the concept of culture is instead explored in detail, on a microsystem level, using appropriate methodology. The concept of "caring cultures", although attractive in terms of its apparent simplicity, is not considered the most useful nor appropriate phrases in terms of advancing research. Instead, research which examines the established concept of "culture" in relation to outcomes such as patient care, doing so with an appropriate methodology, is viewed as a more suitable alternative. Clarifying concepts and terminology relating to "caring cultures" is essential for research to progress and the impact of culture on clinical practice to be better understood. © 2017 John Wiley & Sons Ltd.

  8. When Eating Right, Is Measured Wrong! A Validation and Critical Examination of the ORTO-15 Questionnaire in German.

    Directory of Open Access Journals (Sweden)

    Benjamin Missbach

    The characteristic trait of individuals developing a pathological obsession and preoccupation with healthy foods and a restrictive and avoidant eating behavior is described as orthorexia nervosa (ON). For ON, neither universal diagnostic criteria nor valid tools for large-scale epidemiologic assessment are available in the literature. The aim of the current study is to analyze the psychometric properties of a translated German version of the ORTO-15 questionnaire. The German version of the ORTO-15, an eating behavior and dieting habits questionnaire, was completed by 1029 German-speaking participants (74.6% female) aged between 19 and 70 years (M = 31.21 ± 10.43 years). Our results showed that after confirmatory factor analysis, the best fitting model of the original version is a single-factor structure (9-item shortened version: ORTO-9-GE). The final model showed only moderate internal consistency (Cronbach's alpha = .67), even after omitting 40% of the original questions. A total of 69.1% of participants showed orthorectic tendencies. Orthorectic tendencies are associated with particular eating behavior features (dieting frequency, vegetarian and vegan diet). Education level did not influence ON tendency, and nutrition students did not show higher ON tendency compared to students from other disciplines. This study is the first attempt to translate and to evaluate the psychometric properties of a German version of the ORTO-15 questionnaire. The ORTO-9-GE questionnaire, however, is only a mediocre tool for assessing orthorectic tendencies in individuals and shows moderate reliability and internal consistency. Our research suggests that future studies are needed to provide more reliable and valid assessment tools to investigate orthorexia nervosa.

  9. When Eating Right, Is Measured Wrong! A Validation and Critical Examination of the ORTO-15 Questionnaire in German.

    Science.gov (United States)

    Missbach, Benjamin; Hinterbuchinger, Barbara; Dreiseitl, Verena; Zellhofer, Silvia; Kurz, Carina; König, Jürgen

    2015-01-01

    The characteristic trait of individuals developing a pathological obsession and preoccupation with healthy foods and a restrictive and avoidant eating behavior is described as orthorexia nervosa (ON). For ON, neither universal diagnostic criteria nor valid tools for large-scale epidemiologic assessment are available in the literature. The aim of the current study is to analyze the psychometric properties of a translated German version of the ORTO-15 questionnaire. The German version of the ORTO-15, an eating behavior and dieting habits questionnaire, was completed by 1029 German-speaking participants (74.6% female) aged between 19 and 70 years (M = 31.21 ± 10.43 years). Our results showed that, after confirmatory factor analysis, the best fitting model of the original version is a single-factor structure (9-item shortened version: ORTO-9-GE). The final model showed only moderate internal consistency (Cronbach's alpha = .67), even after omitting 40% of the original questions. A total of 69.1% of participants showed orthorectic tendencies. Orthorectic tendencies are associated with particular eating behavior features (dieting frequency, vegetarian and vegan diet). Education level did not influence ON tendency, and nutritional students did not show higher ON tendency compared to students from other disciplines. This study is the first attempt to translate and evaluate the psychometric properties of a German version of the ORTO-15 questionnaire. The ORTO-9-GE questionnaire, however, is only a mediocre tool for assessing orthorectic tendencies in individuals and shows moderate reliability and internal consistency. Our research suggests that future studies are needed to provide more reliable and valid assessment tools to investigate orthorexia nervosa.
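
    As background to the internal-consistency figure quoted above (Cronbach's alpha = .67 for the 9-item ORTO-9-GE), the standard Cronbach's alpha computation can be sketched in a few lines. This is a minimal illustration with hypothetical Likert-scale responses, not the study's data:

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                          # number of items
        item_vars = items.var(axis=0, ddof=1)       # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of the sum score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical example: 6 respondents answering a 9-item scale (1-4 Likert).
    items = np.array([
        [1, 2, 2, 3, 2, 1, 2, 3, 2],
        [4, 3, 4, 4, 3, 4, 3, 4, 4],
        [2, 2, 3, 2, 2, 2, 3, 2, 2],
        [3, 3, 3, 4, 3, 3, 4, 3, 3],
        [1, 1, 2, 1, 2, 1, 1, 2, 1],
        [3, 4, 3, 3, 4, 3, 3, 3, 4],
    ])
    print(f"alpha = {cronbach_alpha(items):.2f}")
    ```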

  10. Heat and mass transfer in air-fed pressurised suits

    International Nuclear Information System (INIS)

    Tesch, K.; Collins, M.W.; Karayiannis, T.G.; Atherton, M.A.; Edwards, P.

    2009-01-01

    Air-fed pressurised suits are used to protect workers against contamination and hazardous environments. The specific application here is the necessity for regular clean-up maintenance within the torus chamber of fusion reactors. The current design of suiting has been developed empirically. It is, therefore, very desirable to formulate a thermo-fluids model, which will be able to define optimum designs and operating parameters. Two factors indicate that the modelling should be as comprehensive as possible. Firstly, the overall thermo-fluids problem is three-dimensional and includes mass as well as heat transfer. The fluid field is complex, bounded on one side by the human body and on the other by what may be distensible, porous and multi-layer clothing. In this paper, we report firstly the modelling necessary for the additional mass and heat transport processes. This involves the use of Fick's and Fourier's laws and conjugate heat transfer. The results of an initial validation study are presented. Temperatures at the outlet of the suits were obtained experimentally and compared with those predicted by the overall CFD model. Realistic three-dimensional geometries were used for the suit and human body. Calculations were for turbulent flow with single- and two-component (species) models.
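
    The transport laws mentioned (Fick's and Fourier's) take the following standard forms in such a conjugate model; this is textbook background rather than the paper's specific closure:

    ```latex
    % Fourier's law (heat conduction) and Fick's law (vapour diffusion):
    \[
      \mathbf{q} = -k\,\nabla T, \qquad \mathbf{j} = -\rho D\,\nabla Y
    \]
    % q: heat flux, k: thermal conductivity, T: temperature;
    % j: species mass flux, D: diffusivity, Y: vapour mass fraction.
    ```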

  11. Evaluating Suit Fit Using Performance Degradation

    Science.gov (United States)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2011-01-01

    The Mark III suit has multiple sizes of suit components (arm, leg, and gloves) as well as sizing inserts to tailor the fit of the suit to an individual. This study sought to determine a way to identify the point at which an ideal suit fit transforms into a bad fit, and how to quantify this breakdown using mobility-based physical performance data. This study examined the changes in human physical performance via degradation of the elbow and wrist range of motion of the planetary suit prototype (Mark III) with respect to changes in sizing, as well as how to apply that knowledge to suit sizing options and improvements in suit fit. The methods implemented in this study focused on changes in elbow and wrist mobility due to incremental suit sizing modifications. This incremental sizing was within a range that included both optimum and poor fit. Suited range of motion data were collected using a motion analysis system for nine isolated and functional tasks encompassing the elbow and wrist joints. A total of four subjects were tested, with motions involving both arms simultaneously as well as the right arm only. The results were then compared across sizing configurations. The results of this study indicate that range of motion may be used as a viable parameter to quantify at what stage suit sizing causes a detriment in performance; however, the human performance decrement appeared to be based on the interaction of multiple joints along a limb, not a single joint angle. The study was able to identify a preliminary method to quantify the impact of size on performance and to develop a means to gauge tolerances around optimal size. More work is needed to improve the assessment of optimal fit and to compensate for multiple joint interactions.

  12. Clinical and preclinical validation of the serum free light chain assay: identification of the critical difference for optimized clinical use.

    Science.gov (United States)

    Hansen, Charlotte T; Münster, Anna-Marie; Nielsen, Lars; Pedersen, Per; Abildgaard, Niels

    2012-12-01

    The use of the assay for the measurement of free light chains in serum (sFLCs) is increasing. However, there are technical limitations that potentially affect its use in serial measurements. We need further knowledge on the standards of analytical precision, the utility of conventional population-based reference values, and the critical difference (CD) between serial results required for significance. To answer these questions, the biological variation must be known. We determined the biological variation in healthy individuals and patients with plasma cell dyscrasia (PCD). We assessed the imprecision of the analysis in use, from FreeLite™. We determined the reference interval (RI) in 170 healthy individuals. The biological variation is identical for healthy individuals and patients with PCD. The imprecision of the sFLC analysis cannot fulfil the desirable performance standards for a laboratory test, but is within the manufacturer's ±20% variation for quality control samples. The RI showed a significant increase for κ FLC and the κ/λ ratio with age, but not for λ. The critical difference was calculated to be 24% and 23% for κ and λ, respectively. We suggest the use of an age-dependent RI. When monitoring patients with PCD, their own former results are the best reference, and knowledge of the CD is a valuable tool, which we describe for the first time. It also challenges the recently proposed International Myeloma Working Group 'paraprotein relapse criteria', which recommend an increase of more than 25% in the involved FLC to indicate the need for initiation of retreatment. We recommend revision of this criterion. © 2012 John Wiley & Sons A/S.
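
    The critical difference reported above is conventionally a reference change value combining analytical and within-subject biological variation. As standard background (not necessarily the authors' exact derivation), for two serial results at 95% confidence:

    ```latex
    % Critical difference (reference change value) between two serial results:
    \[
      \mathrm{CD} \;=\; \sqrt{2}\; Z \sqrt{CV_A^{2} + CV_I^{2}},
      \qquad Z = 1.96 \text{ at 95\% confidence}
    \]
    % CV_A: analytical coefficient of variation of the assay;
    % CV_I: within-subject biological coefficient of variation.
    ```

    A calculated CD of 24% then means that only changes between consecutive κ results exceeding 24% should be regarded as significant.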

  13. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Criticality Experiments

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine-Brazilian Agreement (COBEN), included, among many other projects, “Validation and Verification of Calculation Methods Used for Research and Experimental Reactors”. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA, and those implemented in MCNP by CNEA and IPEN. The data needed for these validations would correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01 located in São Paulo, Brazil. On the Argentine side, the staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor, which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were provided extensively to CNEA. In this paper results for critical configurations are shown. (author)

  14. Psychometric validation of the behavioral indicators of pain scale for the assessment of pain in mechanically ventilated and unable to self-report critical care patients.

    Science.gov (United States)

    Latorre-Marco, I; Acevedo-Nuevo, M; Solís-Muñoz, M; Hernández-Sánchez, L; López-López, C; Sánchez-Sánchez, M M; Wojtysiak-Wojcicka, M; de Las Pozas-Abril, J; Robleda-Font, G; Frade-Mera, M J; De Blas-García, R; Górgolas-Ortiz, C; De la Figuera-Bayón, J; Cavia-García, C

    2016-11-01

    To assess the psychometric properties of the behavioral indicators of pain scale (ESCID) when applied to a wide range of medical and surgical critical patients. A multicentre, prospective observational study was designed to validate the scale as a measuring instrument. Twenty intensive care units of 14 hospitals belonging to the Spanish National Health System. A total of 286 mechanically ventilated, critically ill medical and surgical adult patients unable to self-report pain. Pain levels were measured by two independent evaluators simultaneously, using two scales: ESCID and the behavioral pain scale (BPS). Pain was observed before, during, and after two painful procedures (turning, tracheal suctioning) and one non-painful procedure. ESCID reliability was measured on the basis of internal consistency using the Cronbach-α coefficient. Inter-rater and intra-rater agreement were measured. The Spearman correlation coefficient was used to assess the correlation between ESCID and BPS. A total of 4386 observations were made in 286 patients (62% medical and 38% surgical). High correlation was found between ESCID and BPS (r=0.94-0.99; p<0.001), together with high intra-rater and inter-rater concordance. ESCID was internally reliable, with a Cronbach-α value of 0.85 (95%CI 0.81-0.88). Cronbach-α coefficients for the ESCID domains were high: facial expression 0.87 (95%CI 0.84-0.89), calmness 0.84 (95%CI 0.81-0.87), muscle tone 0.80 (95%CI 0.75-0.84), compliance with mechanical ventilation 0.70 (95%CI 0.63-0.75) and consolability 0.85 (95%CI 0.81-0.88). ESCID is valid and reliable for measuring pain in mechanically ventilated medical and surgical critical care patients unable to self-report. CLINICALTRIALS.GOV: NCT01744717. Copyright © 2016 The Authors. Published by Elsevier España, S.L.U. All rights reserved.
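
    The headline agreement statistic in this record is a rank correlation between two observational scales. A minimal sketch of that computation with scipy, on hypothetical paired scores rather than the study's observations:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical paired observations: ESCID and BPS scores recorded by two
    # evaluators during the same procedures (illustration only, not study data).
    escid = np.array([0, 2, 5, 7, 3, 1, 8, 4])
    bps   = np.array([3, 4, 7, 9, 5, 3, 10, 6])

    rho, p = spearmanr(escid, bps)
    print(f"Spearman r = {rho:.2f}, p = {p:.4f}")
    ```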

  15. Z-1 Prototype Space Suit Testing Summary

    Science.gov (United States)

    Ross, Amy

    2013-01-01

    The Advanced Space Suit team of the NASA Johnson Space Center performed a series of tests with the Z-1 prototype space suit in 2012. This paper discusses, at a summary level, the tests performed and the results from those tests. The purpose of the tests was twofold: 1) to characterize the suit performance so that the data could be used in the downselection of components for the Z-2 space suit, and 2) to develop interfaces with the suitport and exploration vehicles through pressurized suit evaluations. Tests performed included isolated and functional range of motion data capture, Z-1 waist and hip testing, joint torque testing, CO2 washout testing, fit checks and subject familiarizations, an exploration vehicle aft deck and suitport controls interface evaluation, delta-pressure suitport tests including pressurized suit don and doff, and gross mobility and suitport ingress and egress demonstrations in reduced gravity. Lessons learned specific to the Z-1 prototype and to suit testing techniques are presented.

  16. Validation of the Society for Vascular Surgery's objective performance goals for critical limb ischemia in everyday vascular surgery practice.

    Science.gov (United States)

    Goodney, Philip P; Schanzer, Andres; Demartino, Randall R; Nolan, Brian W; Hevelone, Nathanael D; Conte, Michael S; Powell, Richard J; Cronenwett, Jack L

    2011-07-01

    To develop standardized metrics for expected outcomes in lower extremity revascularization for critical limb ischemia (CLI), the Society for Vascular Surgery (SVS) has developed objective performance goals (OPGs) based on aggregate data from randomized trials of lower extremity bypass (LEB). It remains unknown, however, whether these targets can be achieved in everyday vascular surgery practice. We applied SVS OPG criteria to 1039 patients undergoing 1039 LEB operations for CLI with autogenous vein (excluding patients on dialysis) within the Vascular Study Group of New England (VSGNE). Each of the individual OPGs was calculated within the VSGNE dataset, along with its surrounding 95% confidence intervals (CIs), and compared to published SVS OPGs using χ² comparisons and survival analysis. Across most risk strata, patients in the VSGNE and SVS OPG cohorts were similar (clinical high-risk [age >80 years and tissue loss]: 15.3% VSGNE; 16.2% SVS OPG; P = .58; anatomic high-risk [infrapopliteal target artery]: 57.8% VSGNE; 60.2% SVS OPG; P = .32). However, the proportion of VSGNE patients designated as conduit high-risk (lack of single-segment great saphenous vein) was lower (10.2% VSGNE; 26.9% SVS OPG; P < […]). […] © Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
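
    The cohort comparisons above (e.g. 15.3% vs. 16.2% clinical high-risk, P = .58) are comparisons of proportions, typically done with a χ² test. A small sketch; the SVS OPG cohort size and the back-calculated counts below are assumptions for illustration:

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 table: high-risk vs. not, in the VSGNE and SVS OPG cohorts.
    #                 high-risk   not high-risk
    table = np.array([[159,        880],      # VSGNE   (~15.3% of 1039)
                      [ 65,        336]])     # SVS OPG (~16.2% of an assumed 401)

    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(f"chi2 = {chi2:.3f}, p = {p:.3f}")
    ```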

  17. Oracle SOA Suite 11g performance cookbook

    CERN Document Server

    Brasier, Matthew; Wright, Nicholas

    2013-01-01

    This is a cookbook with interesting, hands-on recipes, giving detailed descriptions and lots of practical walkthroughs for boosting the performance of your Oracle SOA Suite. This book is for Oracle SOA Suite 11g administrators, developers, and architects who want to understand how they can maximise the performance of their SOA Suite infrastructure. The recipes contain easy-to-follow step-by-step instructions and include many helpful and practical tips. It is suitable for anyone with basic operating system and application server administration experience.

  18. Analysis of student’s scientific attitude behaviour change effects blended learning supported by I-spring Suite 8 application

    Science.gov (United States)

    Budiharti, Rini; Waras, N. S.

    2018-05-01

    This article aims to describe changes in students' scientific attitude as an effect of Blended Learning supported by the I-Spring Suite 8 application, applied to the topic of equilibrium and rotational dynamics. Blended Learning is a learning strategy that integrates face-to-face learning and online learning through a combination of various media. A Blended Learning setting supported by I-Spring Suite 8 media can make learning interactive: students are guided to interact actively with the media as well as with other students, discussing the phenomena or facts presented in order to grasp the concept. Scientific attitude is the natural attitude of students in the learning process, and in interactive learning it is especially needed. The research was conducted using a Lesson Study model consisting of the stages Plan-Do-Check-Act (PDCA) and applied to students of class XI MIPA 2 of Senior High School 6 Surakarta. The validity of the data was established by triangulation of observation, interviews and document review. Based on the discussion, it can be concluded that the use of Blended Learning supported by I-Spring Suite 8 media produces changes in student behaviour on all dimensions of scientific attitude: inquisitiveness, respect for data or facts, critical thinking, discovery and creativity, open-mindedness and cooperation, and perseverance. The e-learning display, supported by student worksheets, keeps students enthusiastic from the beginning through the core to the end of learning.

  19. HPC Benchmark Suite NMx, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In the phase II effort, Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for...

  20. Coupled Human-Space Suit Mobility Studies

    Data.gov (United States)

    National Aeronautics and Space Administration — Current EVA mobility studies only allow for comparisons of how the suit moves when actuated by a human and how the human moves when unsuited. There are now new...

  1. Strauss: Der Rosenkavalier - Suite / Michael Kennedy

    Index Scriptorium Estoniae

    Kennedy, Michael

    1990-01-01

    Uuest heliplaadist "Strauss: Der Rosenkavalier - Suite, Salome-Dance of the seven veils, Capriccio-Prelude, Intermezzo, Morgen Mittag um elf! Felicity Lott, Scottish National Orchestra, Neeme Järvi" Chandos ABRD 1397. ABTD 1397. CHAN 8758

  2. Modeling and validation of a mechanistic tool (MEFISTO) for the prediction of critical power in BWR fuel assemblies

    International Nuclear Information System (INIS)

    Adamsson, Carl; Le Corre, Jean-Marie

    2011-01-01

    Highlights: → The MEFISTO code efficiently and accurately predicts the dryout event in a BWR fuel bundle, using a mechanistic model. → A hybrid approach between a fast and robust sub-channel analysis and a three-field two-phase analysis is adopted. → MEFISTO modeling approach, calibration, CPU usage, sensitivity, trend analysis and performance evaluation are presented. → The calibration parameters and process were carefully selected to preserve the mechanistic nature of the code. → The code dryout prediction performance is near the level of fuel-specific empirical dryout correlations. - Abstract: Westinghouse is currently developing the MEFISTO code with the main goal to achieve fast, robust, practical and reliable prediction of steady-state dryout Critical Power in Boiling Water Reactor (BWR) fuel bundle based on a mechanistic approach. A computationally efficient simulation scheme was used to achieve this goal, where the code resolves all relevant field (drop, steam and multi-film) mass balance equations, within the annular flow region, at the sub-channel level while relying on a fast and robust two-phase (liquid/steam) sub-channel solution to provide the cross-flow information. The MEFISTO code can hence provide highly detailed solution of the multi-film flow in BWR fuel bundle while enhancing flexibility and reducing the computer time by an order of magnitude as compared to a standard three-field sub-channel analysis approach. Models for the numerical computation of the one-dimensional field flowrate distributions in an open channel (e.g. a sub-channel), including the numerical treatment of field cross-flows, part-length rods, spacers grids and post-dryout conditions are presented in this paper. The MEFISTO code is then applied to dryout prediction in BWR fuel bundle using VIPRE-W as a fast and robust two-phase sub-channel driver code. The dryout power is numerically predicted by iterating on the bundle power so that the minimum film flowrate in the
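
    The multi-film approach described above tracks a liquid-film flowrate along each sub-channel and declares dryout where that flowrate vanishes. Purely as an illustration of the idea (not the MEFISTO closures; the constant deposition, entrainment and evaporation rates below are hypothetical), a one-dimensional film mass balance can be integrated axially:

    ```python
    import numpy as np

    def film_flowrate(z, w_film0, deposition, entrainment, evaporation):
        """Integrate dW_f/dz = D - E - G_evap along the channel (explicit Euler).

        W_f: film mass flowrate [kg/s]; D, E, G_evap: deposition, entrainment
        and evaporation rates per unit length [kg/(m s)] - constants here,
        although in a real code they are local functions of the flow solution.
        """
        w = np.empty_like(z)
        w[0] = w_film0
        for i in range(1, len(z)):
            dz = z[i] - z[i - 1]
            w[i] = max(w[i - 1] + (deposition - entrainment - evaporation) * dz, 0.0)
        return w

    z = np.linspace(0.0, 3.7, 371)                  # heated length [m]
    w = film_flowrate(z, w_film0=0.05, deposition=0.004,
                      entrainment=0.006, evaporation=0.012)
    dry = z[np.argmax(w <= 0.0)] if (w <= 0.0).any() else None
    print(f"film dries out at z = {dry:.2f} m" if dry is not None else "no dryout")
    ```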

  3. Interoperative efficiency in minimally invasive surgery suites.

    Science.gov (United States)

    van Det, M J; Meijerink, W J H J; Hoff, C; Pierie, J P E N

    2009-10-01

    Performing minimally invasive surgery (MIS) in a conventional operating room (OR) requires additional specialized equipment otherwise stored outside the OR. Before the procedure, the OR team must collect, prepare, and connect the equipment, then take it away afterward. These extra tasks pose a threat to OR efficiency and may lengthen turnover times. The dedicated MIS suite has permanently installed laparoscopic equipment that is operational on demand. This study presents two experiments that quantify the superior efficiency of the MIS suite in the interoperative period. Preoperative setup and postoperative breakdown times in the conventional OR and the MIS suite were analyzed in an experimental setting and in daily practice. In the experimental setting, randomly chosen OR teams simulated the setup and breakdown for a standard laparoscopic cholecystectomy (LC) and a complex laparoscopic sigmoid resection (LS). In the clinical setting, the interoperative periods for 66 LCs randomly assigned to the conventional OR or the MIS suite were analyzed. In the experimental setting, the setup and breakdown times were significantly shorter in the MIS suite. The difference between the two types of OR increased for the complex procedure: 2:41 min for the LC (p < 0.001) and 10:47 min for the LS (p < 0.001). In the clinical setting, the setup and breakdown times as a whole were not reduced in the MIS suite. Laparoscopic setup and breakdown times were significantly shorter in the MIS suite (mean difference, 5:39 min; p < 0.001). Efficiency during the interoperative period is significantly improved in the MIS suite. The OR nurses' tasks are relieved, which may reduce mental and physical workload and improve job satisfaction and patient safety. Due to the simultaneous tasks of other disciplines, an overall turnover time reduction could not be achieved.

  4. Exploration Space Suit Architecture: Destination Environmental-Based Technology Development

    Science.gov (United States)

    Hill, Terry R.

    2010-01-01

    This paper picks up where EVA Space Suit Architecture: Low Earth Orbit Vs. Moon Vs. Mars (Hill, Johnson, IEEEAC paper #1209) left off in the development of a space suit architecture that is modular in design and interfaces and can be reconfigured before or during any given mission depending on the tasks or destination. This paper walks through the continued development of a space suit system architecture and how it should evolve to meet the future exploration EVA needs of the United States space program. In looking forward to future US space exploration and determining how the work performed to date in the CxP would map to a future space suit architecture with maximum re-use of technology and functionality, a series of thought exercises and analyses have provided a strong indication that the CxP space suit architecture is well postured to provide a viable solution for future exploration missions. Through the destination environmental analysis presented in this paper, the modular architecture approach provides the lowest mass and lowest mission cost for the protection of the crew, given any human mission outside of low Earth orbit. Some of the studies presented here provide a look at, and validation of, the non-environmental design drivers that will become ever-increasingly important the further away from Earth humans venture and the longer they are away. Additionally, the analysis demonstrates a logical clustering of design environments that allows a very focused approach to technology prioritization, development and design that will maximize the return on investment independent of any particular program, and provide architecture and design solutions for space suit systems in time for, or ahead of, being required by any particular manned flight program in the future. Through the new approach to space suit design and interface definition, the discussion shows how the architecture is very adaptable to programmatic and funding changes with

  5. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis" or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas and raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft; orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  6. Criticality calculations on pebble-bed HTR-PROTEUS configuration as a validation for the pseudo-scattering tracking method implemented in the MORET 5 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Forestier, Benoit; Miss, Joachim; Bernard, Franck; Dorval, Aurelien [Institut de Radioprotection et Surete Nucleaire, Fontenay aux Roses (France); Jacquet, Olivier [Independent consultant (France); Verboomen, Bernard [Belgian Nuclear Research Center - SCK-CEN (Belgium)

    2008-07-01

    The MORET code is a three-dimensional Monte Carlo criticality code. It is designed to calculate the effective multiplication factor (k{sub eff}) of any geometrical configuration, as well as the reaction rates in the various volumes and the neutron leakage out of the system. A recent development for the MORET code consists of the implementation of an alternative neutron tracking method, known as the pseudo-scattering tracking method. This method has been successfully implemented in the MORET code and its performance has been tested by means of an extensive parametric study on very simple geometrical configurations. In this context, the goal of the present work is to validate the pseudo-scattering method against realistic configurations. In this perspective, pebble-bed cores are particularly well-adapted cases to model, as they exhibit a large number of volumes stochastically arranged on two different levels (the pebbles in the core and the TRISO particles inside each pebble). This paper will introduce the techniques and methods used to model pebble-bed cores in a realistic way. The results of the criticality calculations, as well as the pseudo-scattering tracking method's performance in terms of computation time, will also be presented. (authors)
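
    Pseudo-scattering, also known as Woodcock or delta-tracking, samples flight lengths against a majorant cross section and treats the difference between the majorant and the local total cross section as a fictitious scattering event, which removes the need to compute distances to the many surfaces of a stochastic geometry. A minimal one-dimensional sketch with hypothetical cross sections:

    ```python
    import math
    import random

    def delta_track(x0, sigma_t_of_x, sigma_maj, rng=random.random):
        """Distance to the next *real* collision via Woodcock (delta) tracking, 1-D.

        sigma_t_of_x : local total macroscopic cross section [1/cm] at position x
        sigma_maj    : majorant cross section, >= sigma_t everywhere
        """
        x = x0
        while True:
            x -= math.log(1.0 - rng()) / sigma_maj   # flight to a tentative collision
            # Accept as real with probability sigma_t/sigma_maj; otherwise this is
            # a fictitious (pseudo-scattering) event and the flight simply continues.
            if rng() < sigma_t_of_x(x) / sigma_maj:
                return x

    # Hypothetical two-region lattice: pebble regions (1.2 /cm) alternating with
    # near-void gaps (0.1 /cm), 1 cm each -- purely illustrative cross sections.
    sigma_t = lambda x: 1.2 if int(x) % 2 == 0 else 0.1
    print([round(delta_track(0.0, sigma_t, sigma_maj=1.2), 3) for _ in range(5)])
    ```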

  7. Verification of MCNP6.2 for Nuclear Criticality Safety Applications

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-10

    Several suites of verification/validation benchmark problems were run in early 2017 to verify that the new production release of MCNP6.2 performs correctly for nuclear criticality safety applications (NCS). MCNP6.2 results for several NCS validation suites were compared to the results from MCNP6.1 [1] and MCNP6.1.1 [2]. MCNP6.1 is the production version of MCNP® released in 2013, and MCNP6.1.1 is the update released in 2014. MCNP6.2 includes all of the standard features for NCS calculations that have been available for the past 15 years, along with new features for sensitivity-uncertainty based methods for NCS validation [3]. Results from the benchmark suites were compared with results from previous verification testing [4-8]. Criticality safety analysts should consider testing MCNP6.2 on their particular problems and validation suites. No further development of MCNP5 is planned. MCNP6.1 is now 4 years old, and MCNP6.1.1 is now 3 years old. In general, released versions of MCNP are supported only for about 5 years, due to resource limitations. All future MCNP improvements, bug fixes, user support, and new capabilities are targeted only to MCNP6.2 and beyond.
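
    Version-to-version comparisons on a criticality validation suite usually reduce to checking that each calculated k-eff agrees with its benchmark within combined uncertainties. A compact sketch of that bookkeeping; the case names echo well-known fast benchmarks, but the numbers are hypothetical:

    ```python
    import math

    # Hypothetical suite entries: (case, k_calc, sigma_calc, k_bench, sigma_bench)
    suite = [
        ("godiva",  0.99981, 0.00009, 1.0000, 0.0010),
        ("jezebel", 1.00025, 0.00010, 1.0000, 0.0020),
        ("flattop", 0.99870, 0.00012, 1.0000, 0.0030),
    ]

    for case, kc, sc, kb, sb in suite:
        z = (kc - kb) / math.sqrt(sc**2 + sb**2)   # combined-uncertainty z-score
        flag = "ok" if abs(z) < 3.0 else "INVESTIGATE"
        print(f"{case:8s}  C-E = {kc - kb:+.5f}  z = {z:+5.2f}  {flag}")
    ```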

  8. Suites of dwarfs around Nearby giant galaxies

    International Nuclear Information System (INIS)

    Karachentsev, Igor D.; Kaisina, Elena I.; Makarov, Dmitry I.

    2014-01-01

    The Updated Nearby Galaxy Catalog (UNGC) contains the most comprehensive summary of distances, radial velocities, and luminosities for 800 galaxies located within 11 Mpc from us. The high density of observables in the UNGC makes this sample indispensable for checking results of N-body simulations of cosmic structures on a ∼1 Mpc scale. The environment of each galaxy in the UNGC was characterized by a tidal index Θ₁, depending on the separation and mass of the galaxy's main disturber (MD). We grouped UNGC galaxies with a common MD in suites, and ranked suite members according to their Θ₁. All suite members with positive Θ₁ are assumed to be physical companions of the MD. About 58% of the sample are members of physical groups. The distribution of suites by the number of members, n, follows a relation N(n) ∼ n⁻². The 20 most populated suites contain 468 galaxies, i.e., 59% of the UNGC sample. The fraction of MDs among the brightest galaxies is almost 100% and drops to 50% at M_B = −18ᵐ. We discuss various properties of MDs, as well as galaxies belonging to their suites. The suite abundance practically does not depend on the morphological type, linear diameter, or hydrogen mass of the MD, the tightest correlation being with the MD dynamical mass. Dwarf galaxies around MDs exhibit well-known segregation effects: the population of the outskirts has later morphological types, richer H I contents, and higher rates of star formation activity. Nevertheless, there are some intriguing cases where dwarf spheroidal galaxies occur at the far periphery of the suites, as well as some late-type dwarfs residing close to MDs. Comparing simulation results with galaxy groups, most studies assume the Local Group is fairly typical. However, we recognize that the nearby groups significantly differ from each other and there is considerable variation in their properties. The suites of companions around the Milky Way and M31, consisting of the Local Group, do not
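
    For context, the tidal index used above is commonly defined (following Karachentsev and collaborators) from the neighbour that maximises the tidal-force estimate; the exact normalisation constant is omitted here since it is survey-specific:

    ```latex
    % Tidal index of galaxy i; the "main disturber" is the neighbour k that
    % maximises the tidal-force estimate M_k / D_{ik}^3:
    \[
      \Theta_1 = \max_{k}\left[\log_{10}\!\left(\frac{M_k}{D_{ik}^{3}}\right)\right] + C
    \]
    % M_k: mass of neighbour k; D_ik: its 3-D separation from galaxy i;
    % C: normalisation constant chosen so that Theta_1 = 0 marks the boundary
    % between bound companions (Theta_1 > 0) and field galaxies.
    ```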

  9. Disease-specific questionnaire for quality of life in patients with peripheral arterial occlusive disease in the stage of critical ischemia (FLeQKI) - methodical development of a specific measuring instrument and psychometric evaluation of its validity and reliability. Pt. 1

    International Nuclear Information System (INIS)

    Wohlgemuth, W.A.; Bohndorf, K.; Kirchhof, K.; Olbricht, W.; Klarmann, S.; Engelhardt, M.; Freitag, M.H.; Woelfle, K.

    2007-01-01

    well suited for determining the specific impairments of life quality in patients with peripheral arterial occlusive disease at the stage of critical ischemia. Its psychometric scores for validity and reliability corresponded to those of the SF-36. (orig.)

  10. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  11. Z-2 Prototype Space Suit Development

    Science.gov (United States)

    Ross, Amy; Rhodes, Richard; Graziosi, David; Jones, Bobby; Lee, Ryan; Haque, Bazle Z.; Gillespie, John W., Jr.

    2014-01-01

    NASA's Z-2 prototype space suit is the highest fidelity pressure garment, from both hardware and systems design perspectives, since the Space Shuttle Extravehicular Mobility Unit (EMU) was developed in the late 1970s. Upon completion, the Z-2 will be tested in the 11-foot human-rated vacuum chamber and the Neutral Buoyancy Laboratory (NBL) at the NASA Johnson Space Center to assess the design and to determine the applicability of the configuration to micro-, low- (asteroid), and planetary- (surface) gravity missions. This paper discusses the 'firsts' that the Z-2 represents. For example, the Z-2 sizes to the smallest suit scye bearing plane distance of at least the last 25 years and is being designed with the most intensive use of human models together with the suit model.

  12. Advanced EVA Suit Camera System Development Project

    Science.gov (United States)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera currently being tested, in order to make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Designer to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the options for the camera placement may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary, and these comparisons will be used as further progress is made on the overall suit design. This prototype will not be finished in time for the scheduled Z2 suit testing, so my time was

  13. Extreme-Environment Silicon-Carbide (SiC) Wireless Sensor Suite

    Science.gov (United States)

    Yang, Jie

    2015-01-01

    Phase II objectives: Develop an integrated silicon-carbide wireless sensor suite capable of in situ measurements of critical characteristics of an NTP engine. The silicon-carbide wireless sensor suite comprises: extreme-environment sensors; dedicated high-temperature (450 deg C) silicon-carbide electronics that provide power and signal conditioning capabilities as well as radio frequency modulation and wireless data transmission capabilities; and an onboard energy harvesting system as a power source.

  14. A Molecular Host Response Assay to Discriminate Between Sepsis and Infection-Negative Systemic Inflammation in Critically Ill Patients: Discovery and Validation in Independent Cohorts.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2015-12-01

    Full Text Available Systemic inflammation is a whole-body reaction having an infection-positive (i.e., sepsis) or infection-negative origin. It is important to distinguish between these two etiologies early and accurately because this has significant therapeutic implications for critically ill patients. We hypothesized that a molecular classifier based on peripheral blood RNAs could be discovered that would (1) determine which patients with systemic inflammation had sepsis, (2) be robust across independent patient cohorts, (3) be insensitive to disease severity, and (4) provide diagnostic utility. The goal of this study was to identify and validate such a molecular classifier. We conducted an observational, non-interventional study of adult patients recruited from tertiary intensive care units (ICUs). Biomarker discovery utilized an Australian cohort (n = 105) consisting of 74 cases (sepsis patients) and 31 controls (post-surgical patients with infection-negative systemic inflammation) recruited at five tertiary care settings in Brisbane, Australia, from June 3, 2008, to December 22, 2011. A four-gene classifier combining CEACAM4, LAMP1, PLA2G7, and PLAC8 RNA biomarkers was identified. This classifier, designated SeptiCyte Lab, was validated using reverse transcription quantitative PCR and receiver operating characteristic (ROC) curve analysis in five cohorts (n = 345) from the Netherlands. Patients for validation were selected from the Molecular Diagnosis and Risk Stratification of Sepsis study (ClinicalTrials.gov, NCT01905033), which recruited ICU patients from the Academic Medical Center in Amsterdam and the University Medical Center Utrecht. Patients recruited from November 30, 2012, to August 5, 2013, were eligible for inclusion in the present study. Validation cohort 1 (n = 59) consisted entirely of unambiguous cases and controls; SeptiCyte Lab gave an area under the curve (AUC) of 0.95 (95% CI 0.91-1.00) in this cohort. ROC curve analysis of an independent, more heterogeneous
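
    The headline AUC of such a classifier is the probability that a randomly chosen case scores above a randomly chosen control (the Mann-Whitney identity). A small sketch on hypothetical scores and labels:

    ```python
    import numpy as np

    def roc_auc(scores: np.ndarray, labels: np.ndarray) -> float:
        """AUC via the rank-sum identity: the probability that a randomly
        chosen case scores higher than a randomly chosen control."""
        pos, neg = scores[labels == 1], scores[labels == 0]
        diffs = pos[:, None] - neg[None, :]
        # Correctly ordered case/control pairs; ties count one half.
        return ((diffs > 0).sum() + 0.5 * (diffs == 0).sum()) / diffs.size

    rng = np.random.default_rng(1)
    labels = rng.integers(0, 2, 200)
    scores = labels * 1.5 + rng.normal(size=200)   # hypothetical classifier scores
    print(f"AUC = {roc_auc(scores, labels):.2f}")
    ```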

  15. Immersion Suit Usage Within the RAAF

    Science.gov (United States)

    1992-01-01

    IMMERSION SUIT USED: UVIC QDIS. HOLDINGS: 202, in 12 sizes, held by ALSS 492SQN. REQUIREMENTS: No comment. USAGE POLICY REFERENCE: DI(AF) AAP 7215.004-1 (P3C... held by ALSS 492SQN. REQUIREMENTS: No comment. USAGE POLICY REFERENCE: DI(AF) AAP 7215.004-1 (P3C Flight Manual) RAAF Supplement No 92. USAGE POLICY: UVIC... TYPE: P3C. REFERENCE: Telecon FLTLT Toft 11 SQN/RESO AVMED dated 22 Mar 91. IMMERSION SUIT USED: UVIC QDIS. HOLDINGS: No comment. REQUIREMENTS: No comment. USAGE

  16. [Prognostic estimation in critical patients. Validation of a new and very simple system of prognostic estimation of survival in an intensive care unit].

    Science.gov (United States)

    Abizanda, R; Padron, A; Vidal, B; Mas, S; Belenguer, A; Madero, J; Heras, A

    2006-04-01

    To validate a new system for the prognostic estimation of survival in critical patients (EPEC) seen in a multidisciplinary intensive care unit (ICU). Prospective analysis of a patient cohort seen in the ICU of a multidisciplinary Intensive Medicine Service of a reference teaching hospital with 19 beds. Four hundred and eighty-four patients admitted consecutively over 6 months in 2003. Data collection of a basic minimum data set that includes patient identification data (gender, age), reason for admission and origin, and prognostic estimation of survival by EPEC, MPM II 0 and SAPS II (the latter two considered as gold standards). Mortality was evaluated on hospital discharge. EPEC validation was done by analysis of its discriminating capacity (ROC curve), calibration of its prognostic capacity (Hosmer-Lemeshow C test), and resolution of 2 x 2 contingency tables around different probability values (20%, 50%, 70% and the mean value of the prognostic estimation). The standardized mortality rate (SMR) for each of the methods was calculated. Linear regression of the EPEC against MPM II 0 and SAPS II was established, and concordance analyses (Bland-Altman test) of the prediction of mortality by the three systems were done. In spite of an apparently good linear correlation, similar accuracy of prediction and discrimination capacity, EPEC is not well calibrated (no likelihood of death greater than 50%), and the concordance analyses show that more than 10% of the pairs were outside the 95% confidence interval. In spite of its ease of application and calculation, and of incorporating delay of admission to the ICU as a variable, EPEC does not offer any predictive advantage over MPM II 0 or SAPS II, and its predictions fit reality worse.
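
    Calibration tests of the Hosmer-Lemeshow type compare observed and expected deaths across deciles of predicted risk, and the standardized mortality ratio is simply observed over expected deaths. A compact sketch on simulated predictions (illustration only):

    ```python
    import numpy as np
    from scipy.stats import chi2

    def hosmer_lemeshow(p, y, groups=10):
        """Hosmer-Lemeshow C statistic over deciles of predicted risk."""
        order = np.argsort(p)
        p, y = p[order], y[order]
        h = 0.0
        for idx in np.array_split(np.arange(len(p)), groups):
            exp, obs, n = p[idx].sum(), y[idx].sum(), len(idx)
            h += (obs - exp) ** 2 / (exp * (1.0 - exp / n) + 1e-12)
        return h, chi2.sf(h, groups - 2)

    rng = np.random.default_rng(0)
    p = rng.uniform(0.02, 0.8, 500)                 # hypothetical predicted risks
    y = (rng.uniform(size=500) < p).astype(int)     # simulated outcomes
    smr = y.sum() / p.sum()                         # standardized mortality ratio
    h, pval = hosmer_lemeshow(p, y)
    print(f"SMR = {smr:.2f}, HL C = {h:.1f}, p = {pval:.2f}")
    ```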

  17. Advanced Sensor Platform to Evaluate Manloads for Exploration Suit Architectures

    Data.gov (United States)

    National Aeronautics and Space Administration — Space suit manloads are defined as the outer bounds of force that the human occupant of a suit is able to exert onto the suit during motion. They are defined on a...

  18. Investigation into the Impacts of Migration to Emergent NSA Suite B Encryption Standards

    Science.gov (United States)

    2009-06-01

    that ECC really does offer as much strength as advertised. However, there is some evidence that the use of special elliptic curves, which provide...been commonly advertised. (Gueneysu & Paar & Pelzl, 2007) F. KEY MEASURES OF EFFECTIVENESS/PERFORMANCE (MOE/MOP) 1. Key Efficiency Elliptic curve...of use as Suite B incrementally supplants older methods. 3. RSA Critical Path Analysis We will be using the Critical Path Method (CPM) or Critical

  19. What's New with MS Office Suites

    Science.gov (United States)

    Goldsborough, Reid

    2012-01-01

    If one buys a new PC, laptop, or netbook computer today, it probably comes preloaded with Microsoft Office 2010 Starter Edition. This is a significantly limited, advertising-laden version of Microsoft's suite of productivity programs, Microsoft Office. This continues the trend of PC makers providing ever more crippled versions of Microsoft's…

  20. Antigravity Suits For Studies Of Weightlessness

    Science.gov (United States)

    Kravik, Stein E.; Greenleaf, John

    1992-01-01

    Report presents results of research on use of "antigravity" suit, one applying positive pressure to lower body to simulate some effects of microgravity. Research suggests lower-body positive pressure is alternative to bed rest or immersion in water in terrestrial studies of cardioregulatory, renal, electrolyte, and hormonal changes induced in humans by microgravity.

  1. Prokofiev. "Romeo and Juliet" - Suites / Ivan March

    Index Scriptorium Estoniae

    March, Ivan

    1991-01-01

    Uuest heliplaadist "Prokofiev. "Romeo and Juliet" - Suites: N 1 Op. 64 bis a; N 2 Op. 64 ter b; N 3 Op. 101 c. Royal Scottish National Orchestra /Neeme Järvi" Chandos cassette ABTD 1536; CD CHAN 8940 (78 minutes) etc

  2. A Suite of Tools for Technology Assessment

    Science.gov (United States)

    2007-09-01

    Sadin, Povinelli & Rosen, 1989). • This was a significant change in emphasis on the part of NASA, where technology had previously been viewed as merely...Cost Analysis Symposium, April 13, 2005. A Suite of Tools for Technology Assessment. Bibliography - continued: • Sadin, Stanley T.; Povinelli

  3. 28 CFR 36.501 - Private suits.

    Science.gov (United States)

    2010-07-01

    ... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Enforcement § 36.501 Private suits. (a) General. Any person who is... order. Upon timely application, the court may, in its discretion, permit the Attorney General to... general public importance. Upon application by the complainant and in such circumstances as the court may...

  4. Open architecture of smart sensor suites

    Science.gov (United States)

    Müller, Wilmuth; Kuwertz, Achim; Grönwall, Christina; Petersson, Henrik; Dekker, Rob; Reinert, Frank; Ditzel, Maarten

    2017-10-01

    Experiences from recent conflicts show the strong need for smart sensor suites comprising different multi-spectral imaging sensors as core elements as well as additional non-imaging sensors. Smart sensor suites should be part of a smart sensor network - a network of sensors, databases, evaluation stations and user terminals. Its goal is to optimize the use of various information sources for military operations such as situation assessment, intelligence, surveillance, reconnaissance, target recognition and tracking. Such a smart sensor network will enable commanders to achieve higher levels of situational awareness. Within the study at hand, an open system architecture was developed in order to increase the efficiency of sensor suites. The open system architecture for smart sensor suites, based on a system-of-systems approach, enables combining different sensors in multiple physical configurations, such as distributed sensors, co-located sensors combined in a single package, tower-mounted sensors, sensors integrated in a mobile platform, and trigger sensors. The architecture was derived from a set of system requirements and relevant scenarios. Its mode of operation is adaptable to a series of scenarios with respect to relevant objects of interest, activities to be observed, available transmission bandwidth, etc. The presented open architecture is designed in accordance with the NATO Architecture Framework (NAF). The architecture allows smart sensor suites to be part of a surveillance network, linked e.g. to a sensor planning system and a C4ISR center, and to be used in combination with future RPAS (Remotely Piloted Aircraft Systems) for supporting a more flexible dynamic configuration of RPAS payloads.

  5. Development and Validation of NODAL-LAMBDA Program for the Calculation of the Sub-criticality of LAMDA MODES By Nodal Methods in BWR reactors

    International Nuclear Information System (INIS)

    Munoz-Cobo, J. L.; Merino, R.; Escriva, A.; Melara, J.; Concejal, A.

    2014-01-01

    We have developed a 3D code with two energy groups and diffusion theory that is capable of calculating the lambda eigenvalues of a BWR reactor using nodal methods, with albedo boundary conditions that the NODAL-LAMBDA code itself calculates from the properties of the reflector. The code calculates the sub-criticality of the first harmonic, which is involved in the stability of the reactor against out-of-phase oscillations, and which is needed for calculating the decay ratio of out-of-phase oscillations. The code is very fast and in a few seconds is able to calculate the first eigenvalues and eigenvectors of the discretized problem, whose matrices are sparse (most elements zero). The code uses the LAPACK and ARPACK libraries. It was necessary to modify the LAPACK library to perform various operations with five non-diagonal matrices simultaneously, in order to reduce the number of library calls and simplify the procedure for calculating the matrices in the compressed sparse row (CSR) format. The code is validated by comparing it with SIMULATE results for different cases and by performing the IAEA 3D benchmark. (Author)
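
    After discretization, the lambda modes satisfy the generalized eigenvalue problem A·φ = (1/λ)·F·φ, so the largest eigenvalues of A⁻¹F are k-eff and its harmonics; this is exactly the kind of problem ARPACK's Arnoldi solver handles. A toy one-group, one-dimensional sketch using scipy's ARPACK bindings (not the actual NODAL-LAMBDA matrices):

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import LinearOperator, eigs, splu

    # Toy 1-D, one-group diffusion operator on n nodes (zero-flux boundaries).
    n, h = 200, 1.0                         # nodes, node width [cm]
    D, sig_a, nusig_f = 1.3, 0.010, 0.011   # hypothetical constants
    A = sp.diags([(2 * D / h**2 + sig_a) * np.ones(n),
                  -D / h**2 * np.ones(n - 1),
                  -D / h**2 * np.ones(n - 1)], [0, 1, -1], format="csc")

    lu = splu(A)                                        # factor A once
    op = LinearOperator((n, n), matvec=lambda v: lu.solve(nusig_f * v))  # A^-1 F v

    # Two largest eigenvalues of A^-1 F: k_eff and the first harmonic, whose
    # separation (sub-criticality) governs out-of-phase oscillation stability.
    vals = np.sort(eigs(op, k=2, which="LM", return_eigenvectors=False).real)[::-1]
    print(f"k_eff = {vals[0]:.5f}, first harmonic = {vals[1]:.5f}, "
          f"sub-criticality = {vals[0] - vals[1]:.5f}")
    ```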

  6. Validation of MCNP and WIMS-AECL/DRAGON/RFSP for ACR-1000 applications

    International Nuclear Information System (INIS)

    Bromley, Blair P.; Adams, Fred P.; Zeller, Michael B.; Watts, David G.; Shukhman, Boris V.; Pencer, Jeremy

    2008-01-01

    This paper gives a summary of the validation of the reactor physics codes WIMS-AECL, DRAGON, RFSP and MCNP5, which are being used in the design, operation, and safety analysis of the ACR-1000®. The standards and guidelines being followed for code validation of the suite are established in CSA Standard N286.7-99 and ANS Standard ANS-19.3-2005. These codes are being validated for the calculation of key output parameters associated with various reactor physics phenomena of importance during normal operations and postulated accident conditions in an ACR-1000 reactor. Experimental data from a variety of sources are being used for validation. The bulk of the validation data is from critical experiments in the ZED-2 research reactor with ACR-type lattices. To supplement and complement ZED-2 data, qualified and applicable data are being taken from other power and research reactors, such as existing CANDU® units, FUGEN, the NRU and SPERT research reactors, and the DCA critical facility. MCNP simulations of the ACR-1000 are also being used for validating WIMS-AECL/DRAGON/RFSP, which involves extending the validation results for MCNP through the assistance of TSUNAMI analyses. Code validation against commissioning data in the first-build ACR-1000 will be confirmatory. The code validation is establishing the biases and uncertainties in the calculations of the WIMS-AECL/DRAGON/RFSP suite for the evaluation of various key parameters of importance in the reactor physics analysis of the ACR-1000. (authors)

  7. Surgical Critical Care Initiative

    Data.gov (United States)

    Federal Laboratory Consortium — The Surgical Critical Care Initiative (SC2i) is a USU research program established in October 2013 to develop, translate, and validate biology-driven critical care....

  8. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    Directory of Open Access Journals (Sweden)

    Birsen Çileroğlu

    2014-06-01

    Full Text Available The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between men's and women's ski attire of the time consisted of a knee-length skirt worn over golf trousers. Subsequent to the First World War, skiing suit models were influenced by the period's uniforms, and producers reflected fashion trends in their ski clothing. In conformance with the prevailing trends, ski trousers were designed and produced for women, thus reducing gender differences. Increases in ski tourism and the holding of the first Winter Olympics in 1924 resulted in variations in ski attire, development of design characteristics, growth in user numbers, and enlargement of production capacities. Designers emphasized the combined presence of elegance and practicality in their skiing-attire collections. In the 1930s, ski suits influenced by pilots' uniforms included characteristics permitting freedom of motion, and the design elements exhibited changes in terms of style, material and aerodynamics. In time, ski attire showed varying design features distinguishing professionals from amateurs. While protective functionality was the primary consideration for amateurs, for professionals aerodynamic design was also a leading factor. Eventually, the increased differences in design characteristics were exhibited in ski suit collections; world-renowned brands were formed, and production and sales volumes rose significantly. During the 20th century, ski suits influenced by fashion trends acquired unique styles and reached a position of dominance impacting current fashion trends; apart from sports attire, they became a style determinant in the clothing of cold climates. Ski suits

  9. Miniature Sensor Probe for O2, CO2, and H2O Monitoring in Space Suits, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced space suits require lightweight, low-power, durable sensors for monitoring critical life support materials. No current compact sensors have the tolerance...

  10. Extending and Enhancing SAS (Static Analysis Suite)

    CERN Document Server

    Ho, David

    2016-01-01

    The Static Analysis Suite (SAS) is an open-source software package used to perform static analysis on C and C++ code, helping to ensure safety, readability and maintainability. In this Summer Student project, SAS was enhanced to improve ease of use and user customisation. A straightforward method of integrating static analysis into a project at compilation time was provided using the automated build tool CMake. The process of adding checkers to the suite was streamlined and simplified by developing an automatic code generator. To make SAS more suitable for continuous integration, a reporting mechanism summarising results was added. This suitability has been demonstrated by inclusion of SAS in the Future Circular Collider Software nightly build system. Scalability of the improved package was demonstrated by using the tool to analyse the ROOT code base.

  11. Enhancements to the opera-3d suite

    International Nuclear Information System (INIS)

    Riley, C.P.

    1997-01-01

    The OPERA-3D suite of programs has been enhanced to include two additional three-dimensional finite-element-based solvers, with complementary features in the pre- and postprocessing. SOPRANO computes electromagnetic fields at high frequency, including displacement current effects. It has two modules: a deterministic solution at a user-defined frequency, and an eigenvalue solution for modal analysis. It is suitable for designing microwave structures and cavities found in particle accelerators. SCALA computes electrostatic fields in the presence of space charge from charged particle beams. The user may define the emission characteristics of electrodes or plasma surfaces and compute the resultant space-charge-limited beams, including the presence of magnetic fields. Typical applications in particle accelerators are electron guns and ion sources. Other enhancements to the suite include additional capabilities in TOSCA and ELEKTRA, the static and dynamic solvers. copyright 1997 American Institute of Physics

  12. The BTeV Software Tutorial Suite

    International Nuclear Information System (INIS)

    Kutschke, Robert K.

    2004-01-01

    The BTeV Collaboration is starting to develop its C++-based offline software suite, an integral part of which is a series of tutorials. These tutorials are targeted at a diverse audience, including new graduate students, experienced physicists with little or no C++ experience, those with just enough C++ to be dangerous, and experts who need only an overview of the available tools. The tutorials must both teach C++ in general and the BTeV-specific tools in particular. Finally, they must teach physicists how to find and use the detailed documentation. This report will review the status of the BTeV experiment, give an overview of the plans for and the state of the software, and will then describe the plans for the tutorial suite.

  13. A new device for the inflation of the antigravity suit.

    Science.gov (United States)

    Brodrick, P M

    1986-02-01

    The 'Schuco' orthopaedic tourniquet inflator can be simply converted into a suitable device for inflating an antigravity suit (G-suit). The antigravity suit may be used on neurosurgical patients undergoing procedures in the sitting position to help prevent hypotension and air embolism. The availability of this device may encourage the more widespread use of an antigravity suit in neuro-anaesthetic practice.

  14. Development and validation of a new scoring system to predict wound healing after endovascular therapy in critical limb ischemia with tissue loss.

    Science.gov (United States)

    Kobayashi, Norihiro; Hirano, Keisuke; Nakano, Masatsugu; Muramatsu, Toshiya; Tsukahara, Reiko; Ito, Yoshiaki; Ishimori, Hiroshi; Yamawaki, Masahiro; Araki, Motoharu; Takimura, Hideyuki; Sakamoto, Yasunari

    2015-02-01

    To develop a scoring system to predict wound healing in critical limb ischemia (CLI) patients treated with endovascular therapy (EVT). Between July 2007 and January 2013, 184 patients (118 men; mean age 73.0 years) with CLI (217 limbs) and tissue loss underwent EVT. From this cohort, 236 separate wounds were divided into development (n = 118) and validation (n = 118) groups. Predictors of wound healing were identified using multivariable analysis. Each predictor was assigned a score based on its regression coefficient, and total scores were calculated, ranging from 0 to 1 for low risk up to ≥ 4 for high risk of a nonhealing wound. The performance of the scoring system in the prediction of wound healing was evaluated by calculating the area under the receiver operating characteristic (ROC) curve. By multivariable analysis, a University of Texas grade ≥ 2 (HR 0.524, 95% CI 0.288-0.951, p = 0.034), an infected wound (HR 0.497, 95% CI 0.276-0.894, p = 0.020), dependence on hemodialysis (HR 0.459, 95% CI 0.259-0.814, p = 0.008), no visible blood flow to the wound (HR 0.343, 95% CI 0.146-0.802, p = 0.014), and major tissue loss (HR 0.322, 95% CI 0.165-0.630, p = 0.001) predicted a nonhealing wound. The 1-year rates of wound healing in the low-, intermediate-, and high-risk groups were 94.6%, 67.6%, and 9.1%, respectively, in the development group (p < 0.001). The scoring system predicts wound healing in CLI patients after endovascular revascularization and is potentially helpful in deciding if additional adjuncts or revascularization should be considered. © The Author(s) 2015.
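
    The record gives the five predictors and their hazard ratios but not the exact point value assigned to each; purely as an illustration of how such a total-score classifier works, the sketch below assumes one point per predictor and the stated cutoffs (0-1 low, ≥ 4 high), which are not the published coefficients:

```python
# Illustrative risk-score calculator for a wound-healing model of this kind.
# The five predictors come from the record above; the one-point-per-predictor
# weighting and the 0-1 / 2-3 / >=4 group boundaries are assumptions.
PREDICTORS = [
    "texas_grade_ge_2",
    "infected_wound",
    "hemodialysis",
    "no_visible_blood_flow",
    "major_tissue_loss",
]

def risk_group(wound: dict) -> str:
    score = sum(1 for p in PREDICTORS if wound.get(p, False))
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"

print(risk_group({"infected_wound": True, "hemodialysis": True}))  # intermediate
```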

  15. Space suit bioenergetics: framework and analysis of unsuited and suited activity.

    Science.gov (United States)

    Carr, Christopher E; Newman, Dava J

    2007-11-01

    Metabolic costs limit the duration and intensity of extravehicular activity (EVA), an essential component of future human missions to the Moon and Mars. Energetics Framework: We present a framework for comparison of energetics data across and between studies. This framework, applied to locomotion, differentiates between muscle efficiency and energy recovery, two concepts often confused in the literature. The human run-walk transition in Earth gravity occurs at the point for which energy recovery is approximately the same for walking and running, suggesting a possible role for recovery in gait transitions. Muscular Energetics: Muscle physiology limits the overall efficiency by which chemical energy is converted through metabolism to useful work. Unsuited Locomotion: Walking and running use different methods of energy storage and release. These differences contribute to the relative changes in the metabolic cost of walking and running as gravity is varied, with the metabolic cost of locomoting at a given velocity changing in proportion to gravity for running and less than in proportion for walking. Space Suits: Major factors affecting the energetic cost of suited movement include suit pressurization, gravity, velocity, surface slope, and space suit configuration. Apollo lunar surface EVA traverse metabolic rates, while unexpectedly low, were higher than other activity categories. The Lunar Roving Vehicle facilitated even lower metabolic rates, thus longer duration EVAs. Muscles and tendons act like springs during running; similarly, longitudinal pressure forces in gas pressure space suits allow spring-like storage and release of energy when suits are self-supporting.

  16. Experience with Wolsong-1 Phase-B pre-simulations using WIMS/DRAGON/RFSP-IST code suite

    International Nuclear Information System (INIS)

    Chung, D-H.; Kim, B-G.; Kim, S-M.; Suh, H-B.; Kim, H-S.; Kim, H-J.

    2010-01-01

    The Wolsong-1 Phase-B pre-simulations have been carried out with the exclusive use of the WIMS/DRAGON/RFSP-IST code suite, replacing the previous PPV/MULTICELL/RFSP code system, in preparation for tests scheduled for December 2010 after the refurbishment. A comprehensive simulation package has been undertaken, starting from the approach to first criticality through the flux measurements and scans. In order to secure the validity of the results, the simulations are performed using both the Uniform and SCM fuel tables. Considerable effort was invested in the work, given the lack of prior experience with WIMS/SCM fuel tables and with incremental cross sections generated by DRAGON-IST. The overall assessment of the simulation results indicates that the newly adopted WIMS/DRAGON/RFSP-IST code suite could be used in place of PPV/MULTICELL/RFSP for verification against the Phase-B test results. (author)

  17. NIH bows to part of Rifkin suit.

    Science.gov (United States)

    Sun, M

    1984-11-30

    Having lost a round in its legal battle with Jeremy Rifkin over field tests of genetically engineered bacteria, the National Institutes of Health will conduct the simpler of two ecological analyses required by the National Environmental Policy Act on three proposed experiments. In May 1984 a federal district court ruling halted a University of California field test pending a decision on Rifkin's 1983 suit, which alleged that NIH had violated the Act by approving experiments without studying the ecological consequences. Still to be decided by the U.S. Court of Appeals is whether NIH must also issue full-scale environmental impact statements.

  18. Geophysical characterization from Itu intrusive suite

    International Nuclear Information System (INIS)

    Pascholati, M.E.

    1989-01-01

    The integrated use of geophysical, geological, geochemical, petrographical and remote sensing data resulted in a substantial increase in the knowledge of the Itu Intrusive Suite. The main geophysical method was gamma-ray spectrometry, together with fluorimetry and autoradiography. Three methods were used for the calculation of laboratory gamma-ray spectrometry data. For U, the regression method was the best one. For K and Th, the equation-system and absolute-calibration methods presented the best results. Surface gamma-ray spectrometry allowed comparison with laboratory data and made an important contribution to the study of environmental radiation. (author)

  19. ANALYSIS OF DESIGN ELEMENTS IN SKI SUITS

    OpenAIRE

    Çileroğlu, Birsen; Kelleci Özeren, Figen; Kıvılcımlar, İnci Seda

    2015-01-01

    The popularity of ski sport in the 19th century necessitated a new perspective on protective skiing clothing against mountain climates and excessive cold. Winter clothing was the basis of ski attire during this period. By the beginning of the 20th century, lining cloth was used to minimize the wind effect. The difference between the men's and women's ski attire of the time consisted of knee-length skirts worn over golf trousers. Subsequent to the First World War, skiing suit models were influe...

  20. DEALed : A tool suite for distributed real-time systems development

    NARCIS (Netherlands)

    Bolshakov, K.; Karpov, Y.; Sintotski, A.; Malyshkin, V.

    1999-01-01

    DEALed is a tool suite for the development of distributed systems using the DEAL language. DEAL is being developed at Eindhoven University of Technology as part of the DEDOS project. The area of application of DEALed is the development of distributed real-time safety-critical control systems.

  1. Durable Suit Bladder with Improved Water Permeability for Pressure and Environment Suits

    Science.gov (United States)

    Bue, Grant C.; Kuznetz, Larry; Orndoff, Evelyne; Tang, Henry; Aitchison, Lindsay; Ross, Amy

    2009-01-01

    Water vapor permeability is shown to be useful in rejecting heat and managing moisture accumulation in launch-and-entry pressure suits. Currently this is accomplished through a porous Gore-Tex layer in the Advanced Crew and Escape Suit (ACES) and in the baseline design of the Constellation Suit System Element (CSSE) Suit 1. Non-porous dense monolithic membranes (DMMs) that are available offer potential improvements in water vapor permeability with reduced gas leak. Accordingly, three different pressure bladder materials were investigated for water vapor permeability and oxygen leak: Elasthane(TM) 80A (thermoplastic polyether urethane) provided from stock polymer material, and two custom thermoplastic polyether urethanes. Water vapor, carbon dioxide and oxygen permeability of the DMMs was measured in a 0.13 mm thick stand-alone layer, and in a 0.08 mm and a 0.05 mm thick layer, each bonded to two different nylon and polyester woven reinforcing materials. Additional water vapor permeability and mechanical compression measurements were made with the reinforced 0.05 mm thick layers, further bonded with a polyester wicking layer and overlaid with moistened polyester fleece thermal underwear. This simulated the pressure from a supine crew person. The 0.05 mm thick nylon-reinforced sample with polyester wicking layer was further mechanically tested for wear and abrasion. Concepts for incorporating these materials in launch/entry and extravehicular activity pressure suits are presented.

  2. Integrated Instrument Simulator Suites for Earth Science

    Science.gov (United States)

    Tanelli, Simone; Tao, Wei-Kuo; Matsui, Toshihisa; Hostetler, Chris; Hair, John; Butler, Carolyn; Kuo, Kwo-Sen; Niamsuwan, Noppasin; Johnson, Michael P.; Jacob, Joseph C.; hide

    2012-01-01

    The NASA Earth Observing System Simulators Suite (NEOS3) is a modular framework of forward simulation tools for remote sensing of the Earth's atmosphere from space. It was initiated as the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) under the NASA Advanced Information Systems Technology (AIST) program of the Earth Science Technology Office (ESTO) to enable science users to perform simulations based on advanced atmospheric and simple land surface models, and to rapidly integrate in a broad framework any experimental or innovative tools that they may have developed in this context. The name was changed to NEOS3 when the project was expanded to include more advanced modeling tools for the surface contributions, accounting for scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers). NEOS3 relies on a web-based graphical user interface and a three-stage processing strategy to generate simulated measurements. The user has full control over a wide range of customizations, both in terms of a priori assumptions and in terms of specific solvers or models used to calculate the measured signals. This presentation will demonstrate the general architecture and the configuration procedures, and illustrate some sample products and the fundamental interface requirements for modules that are candidates for integration.

  3. Study of the suit inflation effect on crew safety during landing using a full-pressure IVA suit for new-generation reentry space vehicles

    Science.gov (United States)

    Wataru, Suzuki

    Recently, manned space capsules have again been recognized as beneficial and practical human space vehicles. The Dragon capsule has already achieved several significant successes. The Orion capsule is going to be sent to a high-apogee orbit without crew for experimental purposes in September 2014. For such human-rated space capsules, the study of acceleration impacts on the human body during splashdown is essential to ensure the safety of crews. Moreover, it is known that wearing a full-pressure rescue suit significantly increases the safety of a crew compared to wearing a partial-pressure suit, mainly because it enables the use of a personal life support system independent of the one installed in the space vehicle. However, it is unclear how the inflation of the full-pressure suit due to pressurization affects crew safety during splashdown, especially in the case of new-generation manned space vehicles. Therefore, the purpose of this work is to investigate the effect of suit inflation on crew safety against acceleration impact during splashdown. To this end, the displacements of the safety harness in relation to the suit, a human surrogate, and the crew seat are measured while the suit is pressurized, in order to determine whether the safety and survivability of a crew can be improved by wearing a full-pressure suit. For these tests, the DL/H-1 full-pressure IVA suit, developed by Pablo de Leon and Gary L. Harris, will be used. These tests use image analysis techniques to determine the displacements. It is expected, as a result of these tests, that wearing a full-pressure suit will help to mitigate the impacts and will increase the safety and survivability of a crew during landing, since it works as a buffer to mitigate impact forces during splashdown. This work also proposes a future plan for sled test experiments, using a facility such as the one in use by the Civil Aerospace Medical Institute (CAMI), for experimental validation.

  4. MODEL: A software suite for data acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Sendall, D M; Boissat, C; Bozzoli, W; Burkimsher, P; Jones, R; Matheys, J P; Mornacchi, G; Nguyen, T; Vyvre, P vande; Vascotto, A; Weaver, D [European Organization for Nuclear Research, Geneva (Switzerland). DD Div.

    1989-12-01

    MODEL is a new suite of modular data-acquisition software. It is aimed at the needs of LEP experiments, and is also general enough to be more widely used. It can accommodate a variety of user styles. It runs on a set of loosely coupled processors, and makes use of the remote procedure call technique. Implemented originally for the VAX family, some of its services have already been extended to other systems, including embedded microprocessors. The software modules available include facilities for data-flow management, a framework for monitoring programs, a window-oriented human interface, an error message utility, a process control utility and a run control scheme. It is already in use in a variety of experiments, and is still under development in the light of user experience. (orig.).

  5. UniPOPS: Unified data reduction suite

    Science.gov (United States)

    Maddalena, Ronald J.; Garwood, Robert W.; Salter, Christopher J.; Stobie, Elizabeth B.; Cram, Thomas R.; Morgan, Lorrie; Vance, Bob; Hudson, Jerome

    2015-03-01

    UniPOPS, a suite of programs and utilities developed at the National Radio Astronomy Observatory (NRAO), reduced data from the observatory's single-dish telescopes: the Tucson 12-m, the Green Bank 140-ft, and archived data from the Green Bank 300-ft. The primary reduction programs, 'line' (for spectral-line reduction) and 'condar' (for continuum reduction), used the People-Oriented Parsing Service (POPS) as the command line interpreter. UniPOPS unified previous analysis packages and provided new capabilities; development of UniPOPS continued within the NRAO until 2004 when the 12-m was turned over to the Arizona Radio Observatory (ARO). The submitted code is version 3.5 from 2004, the last supported by the NRAO.

  6. Vadose zone flow convergence test suite

    Energy Technology Data Exchange (ETDEWEB)

    Butcher, B. T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-05

    Performance Assessment (PA) simulations for engineered disposal systems at the Savannah River Site involve highly contrasting materials and moisture conditions at and near saturation. These conditions cause severe convergence difficulties that typically result in unacceptable convergence, long simulation times, or excessive analyst effort. Adequate convergence is usually achieved in a trial-and-error manner by applying under-relaxation to the Saturation or Pressure variable, in a series of ever-decreasing relaxation values. SRNL would like a more efficient scheme implemented inside PORFLOW to achieve flow convergence in a more reliable and efficient manner. To this end, a suite of test problems that illustrate these convergence problems is provided to facilitate diagnosis and development of an improved convergence strategy. The attached files describe the test problems and the proposed resolution.
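
    The under-relaxation workaround described above has a simple general form; the following sketch is generic (the function being iterated, the relaxation schedule, and the tolerances are placeholders, not PORFLOW internals) and shows a damped fixed-point update with the relaxation factor reduced whenever the iteration fails to converge:

```python
# Generic under-relaxed fixed-point iteration illustrating the ever-decreasing
# relaxation sequence described above; g(), the omega schedule, and tolerances
# are placeholders, not PORFLOW internals.
import math

def relaxed_solve(g, x0, omegas=(0.9, 0.5, 0.2, 0.1), tol=1e-10, max_iter=500):
    for omega in omegas:                      # try successively stronger damping
        x = x0
        for _ in range(max_iter):
            x_new = x + omega * (g(x) - x)    # damped (under-relaxed) update
            if abs(x_new - x) < tol:
                return x_new, omega
            x = x_new
    raise RuntimeError("no relaxation value achieved convergence")

# Example: the fixed point of x = cos(x), near 0.739.
root, omega_used = relaxed_solve(math.cos, 0.0)
print(root, omega_used)
```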

  7. Metallogenic aspects of Itu intrusive suite

    International Nuclear Information System (INIS)

    Amaral, G.; Pascholati, E.M.

    1990-01-01

    The integrated use of geological, geochemical, geophysical and remote sensing data is providing interesting new information on the metallogenic characteristics of the Itu Intrusive Suite. During World War II, and up to 1959, a wolframite deposit was mined near the border of the northernmost body (Itupeva Granite). This deposit is formed by greisen veins associated with cassiterite and topaz, clearly linked with later phases of magmatic differentiation. Generally those veins are related to hydrothermal alteration of the granites and the above-mentioned shear zone. U, Th and K determinations by field and laboratory gamma-spectrometry were used for analysis of the regional distribution of those elements and their ratios, and for calculation of radioactive heat production. In this respect, the Itupeva Granite is the hottest and presents several anomalies in the Th/U ratio, indicative of late- or post-magmatic oxidation processes. (author)

  8. The European space suit, a design for productivity and crew safety

    Science.gov (United States)

    Skoog, A. Ingemar; Berthier, S.; Ollivier, Y.

    In order to fulfil the two major mission objectives, i.e. support planned and unplanned external servicing of the COLUMBUS FFL and support the HERMES vehicle for safety-critical operations and emergencies, the European Space Suit System baseline configuration incorporates a number of design features which shall enhance the productivity and the crew safety of EVA astronauts. The work in EVA is today, and will be for several years, manual work. Consequently, to improve productivity, the first challenge is to design a suit enclosure which minimizes movement restrictions and crew fatigue. This is covered by the ergonomic aspect of the suit design. Furthermore, it is also necessary to help the EVA crewmember in his work by giving him the right information at the right time. Many solutions exist in this field of man-machine interface, from very simple systems based on cuff checklists up to advanced systems including head-up displays. The design concept for improved productivity encompasses the following features: easy donning/doffing through rear entry; suit ergonomy optimisation; display of operational information in alphanumerical and graphical form; and voice processing for operations and safety-critical information. Concerning crew safety, the major design features are: a lower R-factor for emergency EVA operations through increased suit pressure; zero-prebreathe conditions for normal operations; visual and voice processing of all safety-critical functions; and an autonomous life support system to permit unrestricted operations around HERMES and the CFFL. The paper analyses crew safety and productivity criteria and describes how these features are being built into the design of the European Space Suit System.

  9. Instrumented Suit Hard Upper Torso (HUT) for Ergonomic Assessment

    Data.gov (United States)

    National Aeronautics and Space Administration — It is well known that the EVA suit (EMU) has the potential to cause crew injury and decreased performance. Engineering data on the suit interaction of the human...

  10. Variable Vector Countermeasure Suit for Space Habitation and Exploration

    Data.gov (United States)

    National Aeronautics and Space Administration — The "Variable Vector Countermeasure Suit (V2Suit) for Space Habitation and Exploration" is a visionary system concept that will revolutionize space missions by...

  11. Validación de puntos críticos de la producción de Surfacen; Validation of critical features of Surfacen production

    Directory of Open Access Journals (Sweden)

    Yamilka Riverón Alemán

    2009-08-01

    of the available regulations, it was demonstrated that pig breeding, the sacrifice process, and the materials used to obtain lung lavages allow a defined and controlled microbiological quality to be achieved. The production process was able to eliminate the microbial load present in the lung lavages; this, together with the preparation of the remaining sterile materials, their transfer and use, the cleaning and disinfection of the clean areas, the aseptic filling, and the staff, guaranteed the sterility of the end product. All these results allow us to conclude that the critical processes of Surfacen® production are validated and guarantee that the product is sterile, pyrogen-free, and free of toxic residues, demonstrating its safety, reproducibility and consistency.

  12. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
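
    A verification benchmark of the kind described ultimately reduces to comparing code output against an exact solution on successively refined grids and checking the observed order of accuracy; a minimal generic sketch of that core computation (the error model below is a stand-in, not one of the tri-laboratory problems):

```python
# Generic observed-order-of-accuracy check, the core computation behind a
# verification benchmark: compare errors on successively refined grids.
import math

def error_norm(h):
    # Placeholder for |numerical - exact| at grid spacing h; a real study would
    # run the simulation code and difference it against the exact solution.
    return 2.0 * h + 0.01 * h ** 2

h_coarse, h_fine = 0.1, 0.05
e_coarse, e_fine = error_norm(h_coarse), error_norm(h_fine)
p_observed = math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)
print(f"observed order of accuracy: {p_observed:.3f}")  # ~1 for this error model
```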

  13. Validation philosophy

    International Nuclear Information System (INIS)

    Vornehm, D.

    1994-01-01

    To determine when a set of calculations falls within the umbrella of existing validation documentation, it is necessary to generate a quantitative definition of range of applicability (our definition is only qualitative) for two reasons: (1) the current trend in our regulatory environment will soon make it impossible to support the legitimacy of a validation without quantitative guidelines; and (2) in my opinion, the lack of support by DOE for further critical experiment work is directly tied to our inability to draw a quantitative 'line in the sand' beyond which we will not use computer-generated values.

  14. Equation-of-State Test Suite for the DYNA3D Code

    Energy Technology Data Exchange (ETDEWEB)

    Benjamin, Russell D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-11-05

    This document describes the creation and implementation of a test suite for the equation-of-state models in the DYNA3D code. A customized input deck has been created for each model, as well as a script that extracts the relevant data from the high-speed edit file created by DYNA3D. Each equation-of-state model is broken apart and individual elements of the model are tested, as well as the entire model. The input deck for each model is described and the results of the tests are discussed. The intent of this work is to add this test suite to the validation suite presently used for DYNA3D.
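
    The record does not reproduce the input decks or the extraction script; as an illustration of what "testing individual elements of a model" means in practice, here is a self-contained check of a simple analytic equation of state (an ideal gas, used here as a stand-in, not one of DYNA3D's models):

```python
# Sketch of an equation-of-state element test: evaluate the model at known
# states and compare against analytic reference values. The ideal-gas EOS is a
# stand-in; DYNA3D's models and edit-file extraction are not reproduced here.
def ideal_gas_pressure(rho, e, gamma=1.4):
    # p = (gamma - 1) * rho * e, from density and specific internal energy
    return (gamma - 1.0) * rho * e

REFERENCE_STATES = [
    # (rho [kg/m^3], e [J/kg], expected p [Pa])
    (1.0, 2.5e5, 1.0e5),
    (2.0, 5.0e5, 4.0e5),
]

for rho, e, p_expected in REFERENCE_STATES:
    p = ideal_gas_pressure(rho, e)
    assert abs(p - p_expected) < 1e-6 * p_expected, (rho, e, p)
print("all EOS reference states reproduced")
```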

  15. Improved airline-type supplied-air plastic suit

    International Nuclear Information System (INIS)

    Jolley, L. Jr.; Zippler, D.B.; Cofer, C.H.; Harper, J.A.

    1978-06-01

    Two-piece supplied-air plastic suits are used extensively at the Savannah River Plant for personnel protection against inhalation of airborne plutonium and tritium. Worker comfort and noise level problems gave impetus to the development of an improved suit and air distribution system. The resulting plastic suit and the development work are discussed. The plastic suit unit cost is less than $20, the hearing-zone noise level is less than 75 dBA, protection factors exceed 10,000, and user comfort is improved. This suit is expected to meet performance requirements for unrestricted use.

  16. [Antigravity suit used for neurosurgical operations in sitting position].

    Science.gov (United States)

    Szpiro-Zurkowska, A; Milczarek, Z; Marchel, A; Jagielski, J

    1996-01-01

    The aviator's antigravity suit (G-suit) was used in 40 operations on neurosurgical patients operated on in the sitting position. The G-suit was filled with air to a pressure of 0.2 atmosphere (20 kPa) in 26 cases, and 0.3 atm (30 kPa) in 14 cases. In all cases, G-suit filling was followed by a rise in central venous pressure and in mean arterial pressure. Venous air embolism was found in 5 (12.5%) patients. No other complications connected with the use of the G-suit were observed.

  17. The Inelastic Instrument suite at the SNS

    International Nuclear Information System (INIS)

    Granroth, Garrett E; Abernathy, Douglas L; Ehlers, Georg; Hagen, Mark E; Herwig, Kenneth W; Mamontov, Eugene; Ohl, Michael E; Wildgruber, Christoph U

    2008-01-01

    The instruments in the extensive suite of spectrometers at the SNS are in various stages of installation and commissioning. The Backscattering Spectrometer (BASIS) is installed and in commissioning. Its near-backscattering analyzer crystals provide the 3 µeV resolution expected. BASIS will enter the user program in the fall of 2007. The ARCS wide-angular-range thermal-to-epithermal neutron spectrometer will come on line in the fall of 2007, followed shortly by the Cold Neutron Chopper Spectrometer. These two direct-geometry instruments provide moderate resolution and the ability to trade resolution for flux. In addition, both instruments have detector coverage out to 140° to provide a large Q range. The SEQUOIA spectrometer, to be completed in 2008, is the direct-geometry instrument that will provide fine resolution in the thermal-to-epithermal range. The Spin-Echo spectrometer, to be completed on a similar time scale, will provide the finest energy resolution worldwide. The HYSPEC spectrometer, available no later than 2011, will provide polarized capabilities and optimized flux in the thermal energy range. Finally, the Vision chemical spectrometer will use crystal analyzers to study energy transfers into the epithermal range.

  18. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  19. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate-resolution, good-quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  20. The Variable Vector Countermeasure Suit (V2Suit) for Space Habitation and Exploration

    Directory of Open Access Journals (Sweden)

    Kevin R Duda

    2015-04-01

    The Variable Vector Countermeasure Suit (V2Suit) for Space Habitation and Exploration is a novel system concept that provides a platform for integrating sensors and actuators with daily astronaut intravehicular activities to improve health and performance, while reducing the mass and volume of the physiologic adaptation countermeasure systems, as well as the required exercise time, during long-duration space exploration missions. The V2Suit system leverages wearable kinematic monitoring technology and uses inertial measurement units (IMUs) and control moment gyroscopes (CMGs) within miniaturized modules placed on body segments to provide a viscous resistance during movements against a specified direction of 'down', initially as a countermeasure to the sensorimotor adaptation performance decrements that manifest themselves while living and working in microgravity and during gravitational transitions during long-duration spaceflight, including post-flight recovery and rehabilitation. Several aspects of the V2Suit system concept were explored and simulated prior to developing a brassboard prototype for technology demonstration. This included a system architecture for identifying the key components and their interconnects, initial identification of key human-system integration challenges, development of a simulation architecture for CMG selection and parameter sizing, and the detailed mechanical design and fabrication of a module. The brassboard prototype demonstrates closed-loop control from 'down' initialization through CMG actuation, and provides a research platform for human performance evaluations to mitigate sensorimotor adaptation, as well as a tool for determining the performance requirements when used as a musculoskeletal deconditioning countermeasure. This type of countermeasure system also has Earth benefits, particularly in gait or movement stabilization and rehabilitation.
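
    The "viscous resistance against a specified direction of down" amounts to opposing the component of segment velocity along the tracked down vector; a toy sketch of that resistance law follows, with the damping coefficient and vectors invented for illustration (the V2Suit flight algorithms are not given in the record):

```python
# Toy model of a viscous resistance law of the kind described: oppose only the
# velocity component along the tracked "down" direction. The coefficient and
# example vectors are invented for illustration.
import numpy as np

def resistive_force(velocity, down, b=0.8):
    v_down = np.dot(velocity, down) * down   # velocity component along "down"
    return -b * v_down                       # oppose only that component

down = np.array([0.0, 0.0, -1.0])            # initialized "down" unit vector
v = np.array([0.1, 0.0, -0.5])               # segment velocity [m/s]
print(resistive_force(v, down))              # ~[0, 0, 0.4]: resists the downward motion
```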

  1. An integrative variant analysis suite for whole exome next-generation sequencing data

    Directory of Open Access Journals (Sweden)

    Challis Danny

    2012-01-01

    Background: Whole exome capture sequencing allows researchers to cost-effectively sequence the coding regions of the genome. Although exome capture sequencing methods have become routine and well established, there is currently a lack of tools specialized for variant calling in this type of data. Results: Using statistical models trained on validated whole-exome capture sequencing data, the Atlas2 Suite is an integrative variant analysis pipeline optimized for variant discovery on all three of the widely used next-generation sequencing platforms (SOLiD, Illumina, and Roche 454). The suite employs logistic regression models in conjunction with user-adjustable cutoffs to accurately separate true SNPs and INDELs from sequencing and mapping errors with high sensitivity (96.7%). Conclusion: We have implemented the Atlas2 Suite and applied it to 92 whole exome samples from the 1000 Genomes Project. The Atlas2 Suite is available for download at http://sourceforge.net/projects/atlas2/. In addition to a command line version, the suite has been integrated into the Genboree Workbench, allowing biomedical scientists with minimal informatics expertise to remotely call, view, and further analyze variants through a simple web interface. The existing genomic databases displayed via the Genboree browser also streamline the process from variant discovery to functional genomics analysis, resulting in an off-the-shelf toolkit for the broader community.
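
    The decision rule described, logistic regression plus a user-adjustable cutoff, has a simple generic form; the sketch below illustrates it with invented feature names and weights (Atlas2's trained models are not shown in the record):

```python
# Generic logistic-regression variant filter of the kind described: compute a
# quality probability from site features and keep calls above a user cutoff.
# Feature names and weights are invented, not Atlas2's trained parameters.
import math

WEIGHTS = {"base_quality": 0.08, "read_depth": 0.05, "strand_bias": -1.2}
BIAS = -3.0

def variant_probability(features):
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))       # logistic link

def keep(features, cutoff=0.5):             # cutoff is the user-adjustable knob
    return variant_probability(features) >= cutoff

site = {"base_quality": 35, "read_depth": 20, "strand_bias": 0.1}
print(variant_probability(site), keep(site))
```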

  2. Arguments for the Normative Validity of Human Rights. Philosophical Predecessors and Contemporary Criticisms of the 1789 French Declaration of Human and Civic Rights

    Directory of Open Access Journals (Sweden)

    Esther Oluffa Pedersen

    2016-11-01

    The paper highlights clashes between different conceptions of right, law and justice crystallizing in the French Declaration of Human and Civic Rights of 1789 and the criticisms it aroused. Hobbes' Leviathan (1651) and Rousseau's Social Contract (1762) are discussed as important predecessors. The philosophical conceptions of law, justice and right stated by Hobbes and Rousseau and in the Declaration are discussed in connection with two seminal criticisms. By excluding women from politics, Olympe de Gouges objected, the Declaration contradicted the universal understanding of human rights. Jeremy Bentham protested against the Declaration's core idea of inalienable human rights.

  3. OCAMS: The OSIRIS-REx Camera Suite

    Science.gov (United States)

    Rizk, B.; Drouet d'Aubigny, C.; Golish, D.; Fellows, C.; Merrill, C.; Smith, P.; Walker, M. S.; Hendershot, J. E.; Hancock, J.; Bailey, S. H.; DellaGiustina, D. N.; Lauretta, D. S.; Tanner, R.; Williams, M.; Harshman, K.; Fitzgibbon, M.; Verts, W.; Chen, J.; Connors, T.; Hamara, D.; Dowd, A.; Lowman, A.; Dubin, M.; Burt, R.; Whiteley, M.; Watson, M.; McMahon, T.; Ward, M.; Booher, D.; Read, M.; Williams, B.; Hunten, M.; Little, E.; Saltzman, T.; Alfred, D.; O'Dougherty, S.; Walthall, M.; Kenagy, K.; Peterson, S.; Crowther, B.; Perry, M. L.; See, C.; Selznick, S.; Sauve, C.; Beiser, M.; Black, W.; Pfisterer, R. N.; Lancaster, A.; Oliver, S.; Oquest, C.; Crowley, D.; Morgan, C.; Castle, C.; Dominguez, R.; Sullivan, M.

    2018-02-01

    The OSIRIS-REx Camera Suite (OCAMS) will acquire images essential to collecting a sample from the surface of Bennu. During proximity operations, these images will document the presence of satellites and plumes, record spin state, enable an accurate model of the asteroid's shape, and identify any surface hazards. They will confirm the presence of sampleable regolith on the surface, observe the sampling event itself, and image the sample head in order to verify its readiness to be stowed. They will document Bennu's history as an example of early solar system material, as a microgravity body with a planetesimal size-scale, and as a carbonaceous object. OCAMS is fitted with three cameras. The MapCam will record color images of Bennu as a point source on approach to the asteroid in order to connect Bennu's ground-based point-source observational record to later higher-resolution surface spectral imaging. The SamCam will document the sample site before, during, and after it is disturbed by the sample mechanism. The PolyCam, using its focus mechanism, will observe the sample site at sub-centimeter resolutions, revealing surface texture and morphology. While their imaging requirements divide naturally between the three cameras, they preserve a strong degree of functional overlap. OCAMS and the other spacecraft instruments will allow the OSIRIS-REx mission to collect a sample from a microgravity body on the same visit during which it was first optically acquired from long range, a useful capability as humanity reaches out to explore near-Earth, Main-Belt and Jupiter Trojan asteroids.

  4. Advanced Sensor Platform to Evaluate Manloads For Exploration Suit Architectures

    Science.gov (United States)

    McFarland, Shane; Pierce, Gregory

    2016-01-01

    Space suit manloads are defined as the outer bounds of force that the human occupant of a suit is able to exert onto the suit during motion. They are defined on a suit-component basis as a unit of maximum force that the suit component in question must withstand without failure. Existing legacy manloads requirements are specific to the suit architecture of the EMU and were developed in an iterative fashion; however, future exploration needs dictate a new suit architecture with bearings, load paths, and entry capability not previously used in any flight suit. No capability currently exists to easily evaluate manloads imparted by a suited occupant, which would be required to develop requirements for a flight-rated design. However, sensor technology has now progressed to the point where an easily deployable, repeatable and flexible manloads measuring technique could be developed, leveraging recent advances in sensor technology. INNOVATION: This development positively impacts the schedule, cost and safety risk associated with new suit exploration architectures. For a final flight design, a comprehensive and accurate manloads requirements set must be communicated to the contractor; failing that, a suit design which does not meet the necessary manloads limits is prone to failure during testing or, worse, during an EVA, which could cause catastrophic failure of the pressure garment, posing risk to the crew. This work facilitates a viable means of developing manloads requirements using a range of human sizes and strengths. OUTCOME / RESULTS: Performed sensor market research. Highlighted three viable options (primary, secondary, and flexible packaging option). Designed/fabricated a custom bracket to evaluate the primary option on a single suit axial. Manned suited manloads testing completed and general approach verified.

  5. A wearable exoskeleton suit for motion assistance to paralysed patients.

    Science.gov (United States)

    Chen, Bing; Zhong, Chun-Hao; Zhao, Xuan; Ma, Hao; Guan, Xiao; Li, Xi; Liang, Feng-Yan; Cheng, Jack Chun Yiu; Qin, Ling; Law, Sheung-Wai; Liao, Wei-Hsin

    2017-10-01

    The number of patients paralysed due to stroke, spinal cord injury, or other related diseases is increasing. In order to improve the physical and mental health of these patients, robotic devices that can help them to regain the mobility to stand and walk are highly desirable. The aim of this study is to develop a wearable exoskeleton suit to help paralysed patients regain the ability to stand up/sit down (STS) and walk. A lower extremity exoskeleton named CUHK-EXO was developed with considerations of ergonomics, user-friendly interface, safety, and comfort. The mechanical structure, human-machine interface, reference trajectories of the exoskeleton hip and knee joints, and control architecture of CUHK-EXO were designed. Clinical trials with a paralysed patient were performed to validate the effectiveness of the whole system design. With the assistance provided by CUHK-EXO, the paralysed patient was able to STS and walk. As designed, the actual joint angles of the exoskeleton well followed the designed reference trajectories, and assistive torques generated from the exoskeleton actuators were able to support the patient's STS and walking motions. The whole system design of CUHK-EXO is effective and can be optimised for clinical application. The exoskeleton can provide proper assistance in enabling paralysed patients to STS and walk.
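
    The record mentions reference joint trajectories and assistive torques but not the control law itself; a common baseline for this kind of system is PD tracking of the reference trajectory, sketched below with invented gains, trajectory, and timestep (not CUHK-EXO's published design):

```python
# Baseline PD trajectory-tracking loop of the kind an exoskeleton controller
# might use; gains, trajectory, and timestep are invented for illustration.
# Plant dynamics are omitted: the loop only computes the torque command.
import math

KP, KD, DT = 60.0, 4.0, 0.01   # gains [Nm/rad, Nm*s/rad], control period [s]

def reference(t):
    return 0.4 * math.sin(2.0 * math.pi * 0.5 * t)   # hip angle [rad], 0.5 Hz gait

theta, omega = 0.0, 0.0        # measured joint angle and velocity (held fixed here)
for step in range(5):
    t = step * DT
    theta_ref = reference(t)
    omega_ref = (reference(t + DT) - reference(t)) / DT  # finite-difference velocity
    tau = KP * (theta_ref - theta) + KD * (omega_ref - omega)
    print(f"t={t:.2f} s  assistive torque = {tau:.2f} Nm")
```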

  6. Innovative technology summary report: Sealed-seam sack suits

    International Nuclear Information System (INIS)

    1998-09-01

    Sealed-seam sack suits are an improved/innovative safety and industrial hygiene technology designed to protect workers from dermal exposure to contamination. Most of these disposable, synthetic-fabric suits are more protective than cotton suits, and are also water-resistant and gas permeable. Some fabrics provide a filter to aerosols, which is important to protection against contamination, while allowing air to pass, increasing comfort level of workers. It is easier to detect body-moisture breakthrough with the disposable suits than with cotton, which is also important to protecting workers from contamination. These suits present a safe and cost-effective (6% to 17% less expensive than the baseline) alternative to traditional protective clothing. This report covers the period from October 1996 to August 1997. During that time, sealed-seam sack suits were demonstrated during daily activities under normal working conditions at the C Reactor and under environmentally controlled conditions at the Los Alamos National Laboratory (LANL)

  7. Alterations in MAST suit pressure with changes in ambient temperature.

    Science.gov (United States)

    Sanders, A B; Meislin, H W; Daub, E

    1983-01-01

    A study was undertaken to test the hypothesis that change in ambient air temperature has an effect on MAST suit pressure according to the ideal gas law. Two different MAST suits were tested on Resusci-Annie dummies. The MAST suits were applied in a cold room at 4.4 degrees C and warmed to 44 degrees C. Positive linear correlations were found in nine trials, but the two suits differed in their rate of increase in pressure. Three trials using humans were conducted showing increased pressure with temperature but at a lesser rate than with dummies. A correlation of 0.5 to 1.0 mm Hg increase in MAST suit pressure for each 1.0 degrees C increase in ambient temperature was found. Implications are discussed for the use of the MAST suit in environmental conditions where the temperature changes.
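
    The expected scaling follows from the ideal gas law at (approximately) constant volume; with illustrative numbers that are assumed here rather than taken from the study:

```latex
% Gay-Lussac's law (constant volume): pressure varies in proportion to absolute
% temperature. Assumed numbers: a suit at about 860 mmHg absolute (100 mmHg
% gauge) sealed at 4.4 C (277.6 K) and warmed to 44 C (317.6 K).
\[
  \frac{P_1}{T_1} = \frac{P_2}{T_2}
  \quad\Rightarrow\quad
  P_2 = P_1\,\frac{T_2}{T_1}
      = 860~\mathrm{mmHg}\times\frac{317.6~\mathrm{K}}{277.6~\mathrm{K}}
      \approx 984~\mathrm{mmHg}.
\]
```

    For these assumed numbers the rigid-volume idealization gives roughly 3 mmHg per °C, several times the 0.5 to 1.0 mmHg/°C reported, which is consistent with the suit-limb system not being truly constant-volume.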

  8. Problem of Office Suite Training at the University

    Directory of Open Access Journals (Sweden)

    Natalia A. Nastashchuk

    2013-01-01

    The paper considers the problem of office suite applications training, caused by the rapid change of application versions, the variety of software developers, and the rapid development of software and hardware platforms. The content of office suite applications training, based on a system of office suite notions, their basic functionality, and standards of information technology development (OpenDocument Format standard, ISO 26300-2006), is presented.

  9. Miniature Flexible Humidity Sensitive Patches for Space Suits, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced space suit technologies demand improved, simplified, long-life regenerative sensing technologies, including humidity sensors, that exceed the performance of...

  10. Leveraging Active Knit Technologies for Aerospace Pressure Suit Applications

    Data.gov (United States)

    National Aeronautics and Space Administration — Anti-Gravity Suits (AGS) are garments used in astronautics to prevent crew from experiencing orthostatic intolerance (OI) and consequential blackouts while...

  11. CriticalEd

    DEFF Research Database (Denmark)

    Kjellberg, Caspar Mølholt; Meredith, David

    2014-01-01

    The best text method is commonly applied among music scholars engaged in producing critical editions. In this method, a comment list is compiled, consisting of variant readings and editorial emendations. This list is maintained by inserting the comments into a document as the changes are made. Since the comments are not input sequentially, with regard to position, but in arbitrary order, the list must be sorted by copy/pasting the rows into place, an error-prone and time-consuming process. Scholars who produce critical editions typically use off-the-shelf music notation software. A solution was therefore developed consisting of a Sibelius plug-in, a cross-platform application called CriticalEd, and a REST-based solution which handles data storage/retrieval. A prototype has been tested at the Danish Centre for Music Publication, and the results suggest that the system could greatly improve the efficiency of the critical editing process.

  12. Development and validation of a risk model for identification of non-neutropenic, critically ill adult patients at high risk of invasive Candida infection: the Fungal Infection Risk Evaluation (FIRE) Study.

    Science.gov (United States)

    Harrison, D; Muskett, H; Harvey, S; Grieve, R; Shahin, J; Patel, K; Sadique, Z; Allen, E; Dybowski, R; Jit, M; Edgeworth, J; Kibbler, C; Barnes, R; Soni, N; Rowan, K

    2013-02-01

    There is increasing evidence that invasive fungal disease (IFD) is more likely to occur in non-neutropenic patients in critical care units. A number of randomised controlled trials (RCTs) have evaluated antifungal prophylaxis in non-neutropenic, critically ill patients, demonstrating a reduction in the risk of proven IFD and suggesting a reduction in mortality. It is necessary to establish a method to identify and target antifungal prophylaxis at those patients at highest risk of IFD, who stand to benefit most from any antifungal prophylaxis strategy. Objectives: To develop and validate risk models to identify non-neutropenic, critically ill adult patients at high risk of invasive Candida infection, who would benefit from antifungal prophylaxis, and to assess the cost-effectiveness of targeting antifungal prophylaxis to high-risk patients based on these models. Design: Systematic review, prospective data collection, statistical modelling, economic decision modelling and value of information analysis. Setting: Ninety-six UK adult general critical care units. Participants: Consecutive admissions to participating critical care units. Interventions: None. Main outcome measures: Invasive fungal disease, defined as a blood culture or sample from a normally sterile site showing yeast/mould cells in a microbiological or histopathological report. For statistical and economic modelling, the primary outcome was invasive Candida infection, defined as IFD positive for Candida species. Results: Systematic review: Thirteen articles exploring risk factors, risk models or clinical decision rules for IFD in critically ill adult patients were identified. Risk factors reported to be significantly associated with IFD were included in the final data set for the prospective data collection. Data were collected on 60,778 admissions between July 2009 and March 2011. Overall, 383 patients (0.6%) were admitted with or developed IFD. The majority of IFD patients (94%) were positive for Candida species. The most common site of infection was blood (55%). The incidence of IFD

  13. Using Piezoelectric Ceramics for Dust Mitigation of Space Suits

    Science.gov (United States)

    Angel, Heather K.

    2004-01-01

    The particles that make up moon dust and Mars soil can be hazardous to an astronaut's health if not handled properly. In the near future, while exploring outer space, astronauts plan to wander the surfaces of unknown planets. During these explorations, dust and soil will cling to their space suits and become embedded in the fabric. The astronauts will track moon dust and Mars soil back into their living quarters. This not only will create a mess, with millions of tiny airborne particles floating around, but will also be dangerous in the case that the fine particles are breathed in and become trapped in an astronaut's lungs. Researchers at the center are investigating ways to remove these particles from space suits. This problem is very difficult due to the nature of the particles: they are extremely small and have jagged edges which can easily latch onto the fibers of the fabric. For the past summer, I have been involved in researching the potential problems, investigating ways to remove the particles, and conducting experiments to validate the techniques. The current technique under investigation uses piezoelectric ceramics embedded in the fabric that vibrate and shake the particles free. The particles will be left on the planet's surface or collected with a vacuum to be disposed of later. The ceramics vibrate when connected to an AC voltage supply and create a small-scale motion similar to what people use at the beach to shake sand off of a beach towel. Because the particles are so small, similar to volcanic ash, caution must be taken to make sure that this technique does not further embed them in the fabric and make removal more difficult. Only a very precise range of frequency and voltage will produce a suitable vibration. My summer project involved many experiments to determine the correct range. Analysis involved hands-on experience with oscilloscopes, amplifiers, piezoelectrics, a high-speed camera, microscopes and computers. perfect this technology. Someday, vibration to

  14. Morphing: A Novel Approach to Astronaut Suit Sizing

    Science.gov (United States)

    Margerum, Sarah; Clowers, Kurt; Rajulu, Sudhakar

    2006-01-01

    The fitting of a spacesuit to an astronaut is an iterative process consisting of two parts. The first uses anthropometric data to provide an approximation of the suit components that will fit the astronaut. The second part is the subjective fitting, where small adjustments are made based on the astronaut's preference. By providing a better approximation of the correct suit components, the entire fit process time can be reduced significantly. The goals of this project are twofold: (1) to evaluate the effectiveness of the existing sizing algorithm for the Mark III Hybrid suit and (2) to determine what additional components are needed in order to provide adequate sizing for the existing astronaut population. A single subject was scanned using a 3D whole-body scanner (VITUS 3D) in the Mark III suit in eight different poses, and four subjects in minimal clothing were also scanned in similar poses. The 3D external body scans of the suit and the subject are overlaid and visually aligned in a customized MATLAB program. The suit components were contracted or expanded linearly along the subject's limbs to match the subject's segmental lengths. Two independent measures were obtained from the morphing program on four subjects and compared with the existing sizing information. Two of the four subjects were in correspondence with the sizing algorithm and morphing results. The morphing outcome for a third subject, incompatible with the suit, suggested that an additional arm element at least 6 inches smaller than the existing smallest suit component would need to be acquired. The morphing result of the fourth subject, deemed incompatible with the suit using the sizing algorithm, indicated a different suit configuration which would be compatible. This configuration matched the existing suit fit check data.
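
    The component-matching step described, scaling suit elements along the limbs and comparing against segmental lengths, reduces to picking the component whose length best matches the subject; a toy sketch with an invented inventory (the real algorithm uses full anthropometric data):

```python
# Toy version of the component-matching step: pick the suit arm element whose
# length is closest to the subject's measured segment length. Inventory sizes
# and the tolerance are invented for illustration.
ARM_ELEMENTS = {"small": 56.0, "medium": 61.0, "large": 66.0}  # lengths [cm]

def best_fit(segment_length_cm, inventory=ARM_ELEMENTS, tolerance_cm=3.0):
    name, length = min(inventory.items(),
                       key=lambda kv: abs(kv[1] - segment_length_cm))
    if abs(length - segment_length_cm) > tolerance_cm:
        return None   # no existing component fits; a new size would be needed
    return name

print(best_fit(59.0))   # -> 'medium'
print(best_fit(49.0))   # -> None: smaller element than any in inventory needed
```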

  15. Clinical Validation of Therapeutic Drug Monitoring of Imipenem in Spent Effluent in Critically Ill Patients Receiving Continuous Renal Replacement Therapy: A Pilot Study.

    Science.gov (United States)

    Wen, Aiping; Li, Zhe; Yu, Junxian; Li, Ren; Cheng, Sheng; Duan, Meili; Bai, Jing

    2016-01-01

    The primary objective of this pilot study was to investigate whether therapeutic drug monitoring of imipenem could be performed with spent effluent instead of blood sampling collected from critically ill patients under continuous renal replacement therapy. A prospective open-label study was conducted in a real clinical setting. Both blood and effluent samples were collected pairwise before imipenem administration and 0.5, 1, 1.5, 2, 3, 4, 6, and 8 h after imipenem administration. Plasma and effluent imipenem concentrations were determined by reversed-phase high-performance liquid chromatography with ultraviolet detection. Pharmacokinetic and pharmacodynamic parameters of blood and effluent samples were calculated. Eighty-three paired plasma and effluent samples were obtained from 10 patients. The Pearson correlation coefficient of the imipenem concentrations in plasma and effluent was 0.950 (p < 0.001). The plasma-to-effluent imipenem concentration ratio was 1.044 (95% confidence interval, 0.975 to 1.114) by Bland-Altman analysis. No statistically significant difference was found in the pharmacokinetic and pharmacodynamic parameters tested in paired plasma and effluent samples with the Wilcoxon test. Spent effluent of continuous renal replacement therapy could be used for therapeutic drug monitoring of imipenem instead of blood sampling in critically ill patients.
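
    The agreement statistics reported, a Pearson correlation and a Bland-Altman-style mean ratio with its confidence interval, are straightforward to compute from paired concentrations; a generic sketch with made-up data (the study's 83 sample pairs are not reproduced in the record):

```python
# Generic agreement analysis for paired plasma/effluent concentrations:
# Pearson correlation plus a Bland-Altman-style mean ratio with a 95% CI.
# The data below are invented for illustration.
import numpy as np

plasma   = np.array([12.1, 8.4, 5.2, 3.1, 20.5, 15.0])   # mg/L
effluent = np.array([11.5, 8.9, 4.8, 3.3, 19.8, 14.2])   # mg/L

r = np.corrcoef(plasma, effluent)[0, 1]
ratios = plasma / effluent
mean, sd = ratios.mean(), ratios.std(ddof=1)
half_width = 1.96 * sd / np.sqrt(len(ratios))
print(f"Pearson r = {r:.3f}; mean ratio = {mean:.3f}, "
      f"95% CI {mean - half_width:.3f} to {mean + half_width:.3f}")
```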

  16. Use of deterministic methods in survey calculations for criticality problems

    International Nuclear Information System (INIS)

    Hutton, J.L.; Phenix, J.; Course, A.F.

    1991-01-01

    A code package using deterministic methods for solving the Boltzmann transport equation is the WIMS suite. This has been very successful for a range of situations. In particular, it has been used with great success to analyse trends in reactivity with a range of changes in state. The WIMS suite of codes has a range of methods and is very flexible in the way they can be combined. A wide variety of situations can be modelled, ranging through all the current thermal reactor variants to storage systems and items of chemical plant. These methods have recently been enhanced by the introduction of the CACTUS method. This is based on a characteristics technique for solving the transport equation and has the advantage that complex geometrical situations can be treated. In this paper the basis of the method is outlined and examples of its use are illustrated. In parallel with these developments, the validation for out-of-pile situations has been extended to include experiments with relevance to criticality situations. The paper summarises this evidence and shows how these results point to a partial re-adoption of deterministic methods for some areas of criticality. The paper also presents results to illustrate the use of WIMS in criticality situations, and in particular shows how it can complement codes such as MONK when used for surveying the reactivity effect due to changes in geometry or materials. (Author)

  17. The BRITNeY Suite: A Platform for Experiments

    DEFF Research Database (Denmark)

    Westergaard, Michael

    2006-01-01

    This paper describes a platform, the BRITNeY Suite, for experimenting with Coloured Petri nets. The BRITNeY Suite provides access to data-structures and a simulator for Coloured Petri nets via a powerful scripting language and plug-in-mechanism, thereby making it easy to perform customized...

  18. Rapid evaluation of the neutron dose following a criticality accident by measurement of ²⁴Na activity; Evaluation rapide de la dose de neutrons a la suite d'un accident de criticite par mesure de l'activite de ²⁴Na

    Energy Technology Data Exchange (ETDEWEB)

    Estournel, R [Centre de Production de Plutonium de Marcoule, Service de Protection contre les Rayonnements, 30 (France); Henry, Ph [Centre de Production de Plutonium de Marcoule, Section Medicale et Sociale, 30 (France); Beau, P; Ergas, A [Commissariat a l' Energie Atomique, Service d' Hygiene Atomique, Dept. de la Protection Sanitaire, Chusclan, (France)

    1966-07-01

    By external measurement of the gamma activity of {sup 24}Na induced in the human organism by a neutron flux during a criticality accident, it is possible to evaluate the personal dose received. Detectors designed for everyday use in health physics can be applied to these measurements, as described in the first part of this work, and the response of a number of detectors to a known induced activity is presented. The induced activity-dose relationship is studied theoretically in the second part, taking into account the neutron spectrum to which the individual has been exposed. The characteristic spectra of three possible types of accident were used to derive this relationship. The results obtained show that the method is sufficiently sensitive for its purpose. The accuracy with which the dose received during a criticality accident can be calculated by this method is discussed. (authors)

  19. Criticality safety enhancements for SCALE 6.2 and beyond

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Bekar, Kursat B.; Celik, Cihangir; Clarno, Kevin T.; Dunn, Michael E.; Hart, Shane W.; Ibrahim, Ahmad M.; Johnson, Seth R.; Langley, Brandon R.; Lefebvre, Jordan P.; Lefebvre, Robert A.; Marshall, William J.; Mertyurek, Ugur; Mueller, Don; Peplow, Douglas E.; Perfetti, Christopher M.; Petrie Jr, Lester M.; Thompson, Adam B.; Wiarda, Dorothea; Wieselquist, William A.; Williams, Mark L.

    2015-01-01

    SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. Since 1980, regulators, industry, and research institutions around the world have relied on SCALE for nuclear safety analysis and design. SCALE 6.2 provides several new capabilities and significant improvements in many existing features for criticality safety analysis. Enhancements are realized for nuclear data; multigroup resonance self-shielding; continuous-energy Monte Carlo analysis for sensitivity/uncertainty analysis, radiation shielding, and depletion; and graphical user interfaces. An overview of these capabilities is provided in this paper, and additional details are provided in several companion papers.

  20. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    Science.gov (United States)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that using solid models of planetary suit hard segments as a performance design tool is feasible. From a general trend perspective

  1. Russian nuclear criticality experiments. Status and prospects

    International Nuclear Information System (INIS)

    Gagarinski, A.Yu.

    2003-01-01

    After nuclear criticality was first reached in the Soviet Union on a uranium-graphite assembly on December 25, 1946, by I.V. Kurchatov and his team (1), critical conditions in a great variety of multiplying media have been realized several thousand times in the Kurchatov Institute alone. Even the first Russian critical experiments carried out by Igor Kurchatov confirmed the unique merits of zero-power reactors: the most practically convenient range of kinetic response parameters for variation of critical conditions, as well as invariance of the most important functionals of the neutron flux over a wide range of reactor power. Neutron physics experiments have become a necessary stage in the creation and improvement of nuclear reactors. Most critical experiments were performed as a necessary stage of reactor design in the 1960s and 1970s, the reactor 'golden age', when most of the more than one thousand nuclear reactors of various types and purposes were created worldwide. Though the ways of conducting critical measurements were very diverse, there are two main types of experiments. The first is the so-called mock-up or prototype experiment, in which an exact (to the extent possible) simulation of the core is constructed to minimize the error in forecasting the operating reactor characteristics. Such experiments, which often represent quality control of core manufacturing and adjustment of core parameters to the design requirements, were carried out in Russia on critical assemblies of several plants, in design institutions (OKBM, Nizhni Novgorod; Electrostal and others), as well as in research centers (RRC 'Kurchatov Institute', etc.). Their results, which prevail today in the criticality database, are not well suited for new applications, even taking into account the capabilities provided by present-day calculation codes. It is hard to expect that the error resulting from inevitable idealization of

  2. Correction factors for assessing immersion suits under harsh conditions.

    Science.gov (United States)

    Power, Jonathan; Tikuisis, Peter; Ré, António Simões; Barwood, Martin; Tipton, Michael

    2016-03-01

    Many immersion suit standards require testing of thermal protective properties in calm, circulating water, while these suits are typically used in harsher environments where they often underperform. Yet it can be expensive and logistically challenging to test immersion suits in realistic conditions. The goal of this work was to develop a set of correction factors that would allow suits to be tested in calm water yet ensure they will offer sufficient protection in harsher conditions. Two immersion studies, one dry and the other with 500 mL of water within the suit, were conducted in wind and waves to measure the change in suit insulation. In both studies, wind and waves resulted in a significantly lower immersed insulation value compared to calm water. The minimum required thermal insulation for maintaining heat balance can be calculated for a given mean skin temperature, metabolic heat production, and water temperature. Combining the physiological limits of sustainable cold water immersion and actual suit insulation, correction factors can be deduced for harsh conditions relative to calm water. The minimum in-situ suit insulation to maintain thermal balance in a dry, calm condition is 1.553 - 0.0624·T_W + 0.00018·T_W^2, where T_W is the water temperature. Multiplicative correction factors to the above equation are 1.37, 1.25, and 1.72 for wind + waves, 500 mL suit wetness, and both combined, respectively. Calm water certification tests of suit insulation should meet or exceed the minimum in-situ requirements to maintain thermal balance, and correction factors should be applied for a more realistic determination of minimum insulation for harsh conditions. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
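
    The fitted minimum-insulation relationship and the correction factors quoted above can be applied directly; the Python sketch below encodes them. The example water temperature is arbitrary, and the insulation units follow the source (the abstract does not state them; immersion-suit insulation is conventionally reported in clo).

        def min_insulation_calm_dry(tw):
            """Minimum in-situ suit insulation to maintain thermal balance in
            calm, dry water of temperature tw (deg C); quadratic fit quoted
            in the abstract above."""
            return 1.553 - 0.0624 * tw + 0.00018 * tw**2

        # Multiplicative correction factors from the abstract
        CORRECTIONS = {"wind + waves": 1.37, "500 mL suit wetness": 1.25, "combined": 1.72}

        tw = 5.0  # example water temperature (deg C)
        base = min_insulation_calm_dry(tw)
        for condition, factor in CORRECTIONS.items():
            print(f"{condition}: {base * factor:.2f}")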

  3. Criticality and safety parameter studies for upgrading 3 MW TRIGA MARK II research reactor and validation of generated cross section library and computational method

    International Nuclear Information System (INIS)

    Bhuiyan, S.I.; Mondal, M.A.W.; Sarker, M.M.; Rahman, M.; Shahdatullah, M.S.; Huda, M.Q.; Chakrroborty, T.K.; Khan, M.J.H.

    2000-01-01

    This study deals with the neutronic and thermal-hydraulic analysis of the 3 MW TRIGA MARK II research reactor in order to upgrade it to a higher flux. The upgrade will require major reshuffling and reconfiguration of the current core. To reshuffle the current core configuration, the chain of codes NJOY94.10 - WIMSD-5A - CITATION - PARET - MCNP4B2 has been used for the overall analysis. The computational methods, tools and techniques, the customisation of cross-section libraries, various models for cells and super cells, and many associated utilities have been standardised and validated for the overall core analysis. Analyses using the 4-group and 7-group macroscopic cross-section libraries generated from the 69-group WIMSD-5 library showed that a 7-group structure is more suitable for TRIGA calculations, considering its LEU fuel composition. The MCNP calculations established that the CITATION calculations and the generated cross-section library are reasonably good for neutronic analysis of TRIGA reactors. Results obtained from PARET demonstrated that the flux upgrade will not cause the temperature limit on the fuel to be exceeded. Also, the maximum power density remains, by a substantial margin, below the level at which departure from nucleate boiling could occur. A possible core with two additional irradiation channels around the central thimble (CT) is projected, in which thermal fluxes almost identical to those in the CT are obtained. The reconfigured core also shows a 7.25% thermal flux increase in the Lazy Susan. (author)
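
    The few-group macroscopic constants mentioned above come from flux-weighted condensation of a fine-group library. The Python sketch below shows that condensation step in its simplest form, with toy numbers; it makes no claim to reproduce the actual WIMSD-5A algorithm.

        import numpy as np

        def collapse(sigma_fine, flux_fine, group_map):
            """Flux-weighted condensation: each broad-group cross section
            preserves the reaction rate sum(sigma * phi) of its fine groups."""
            sigma_fine = np.asarray(sigma_fine, float)
            flux_fine = np.asarray(flux_fine, float)
            group_map = np.asarray(group_map)
            sigma_broad = np.zeros(group_map.max() + 1)
            for g in range(sigma_broad.size):
                m = group_map == g
                sigma_broad[g] = (sigma_fine[m] * flux_fine[m]).sum() / flux_fine[m].sum()
            return sigma_broad

        # Toy example: six fine groups collapsed to two broad groups
        print(collapse([1.0, 1.2, 1.1, 8.0, 9.0, 10.0],
                       [5.0, 4.0, 3.0, 2.0, 1.0, 0.5],
                       [0, 0, 0, 1, 1, 1]))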

  4. A critical evaluation of validity and utility of translational imaging in pain and analgesia: Utilizing functional imaging to enhance the process.

    Science.gov (United States)

    Upadhyay, Jaymin; Geber, Christian; Hargreaves, Richard; Birklein, Frank; Borsook, David

    2018-01-01

    Assessing clinical pain and metrics related to function or quality of life predominantly relies on patient reported subjective measures. These outcome measures are generally not applicable to the preclinical setting where early signs pointing to analgesic value of a therapy are sought, thus introducing difficulties in animal to human translation in pain research. Evaluating brain function in patients and respective animal model(s) has the potential to characterize mechanisms associated with pain or pain-related phenotypes and thereby provide a means of laboratory to clinic translation. This review summarizes the progress made towards understanding of brain function in clinical and preclinical pain states elucidated using an imaging approach as well as the current level of validity of translational pain imaging. We hypothesize that neuroimaging can describe the central representation of pain or pain phenotypes and yields a basis for the development and selection of clinically relevant animal assays. This approach may increase the probability of finding meaningful new analgesics that can help satisfy the significant unmet medical needs of patients. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Validation of the code ETOBOX/BOXER for UO2 LWR lattices based on the experiments TRX, BAPL-UO2 and other critical experiments

    International Nuclear Information System (INIS)

    Paratte, J.M.

    1985-07-01

    The EIR code system for LWR arrays is based on cross sections extracted from ENDF/B-4 and ENDF/B-5 by the code ETOBOX. The array calculation method (code BOXER) and the cross sections were applied to the CSEWG benchmark experiments TRX-1 to 4 and BAPL-UO2-1 to 3. The results are compared to the measured values and to calculations from other institutions. This demonstrates that the deviations of the parameters calculated by BOXER are typical for the cross sections used. A large number of critical experiments were calculated using the measured material bucklings in order to bring to light possible trends in the calculation of the multiplication factor k_eff. It first emerged that the error bounds of B_m^2 evaluated in the measurements are often optimistic. Two-dimensional calculations improved the results of the cell calculations. With a mean scatter of 4 to 5 mk in the normal arrays, the multiplication factors calculated by BOXER are satisfactory. However, one has to take into account a slight tendency of k_eff to grow with the moderator-to-fuel ratio and the enrichment. (author)
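
    For orientation, the link between a measured material buckling and the computed multiplication factor can be written in simple one-group form (an illustrative relation; the abstract does not specify BOXER's actual leakage treatment):

        $$ k_{\mathrm{eff}} = \frac{k_\infty}{1 + M^2 B_m^2} $$

    where $k_\infty$ is the infinite-lattice multiplication factor, $M^2$ the migration area, and $B_m^2$ the measured material buckling. For an exactly critical experiment $k_{\mathrm{eff}}$ should come out as unity, so systematic deviations across many experiments expose biases in the underlying cross sections, which is precisely the kind of trend the study looks for.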

  6. Hybrid Enhanced Epidermal SpaceSuit Design Approaches

    Science.gov (United States)

    Jessup, Joseph M.

    A space suit that does not rely on gas pressurization poses a multi-faceted design problem that requires major stability controls to be incorporated during design and construction. The Hybrid Enhanced Epidermal (HEE) space suit concept integrates evolved human anthropomorphic and physiological adaptations into its functionality, using commercially available bio-medical technologies to address the shortcomings of conventional gas pressure suits and the impracticalities of mechanical counter-pressure (MCP) suits. The prototype HEE space suit explored integumentary homeostasis, thermal control, and mobility using advanced bio-medical materials technology and construction concepts. The goal was a space suit that functions as an enhanced, multi-functional bio-mimic of the human epidermal layer and works in attunement with the wearer rather than as a separate system. In addressing human physiological requirements for the design and construction of the HEE suit, testing regimes were devised and integrated into the prototype, which was then subjected to a series of detailed tests using both anatomical reproduction methods and human subjects.

  7. Enabling interoperability in Geoscience with GI-suite

    Science.gov (United States)

    Boldrini, Enrico; Papeschi, Fabrizio; Santoro, Mattia; Nativi, Stefano

    2015-04-01

    GI-suite is a brokering framework targeting interoperability of heterogeneous systems in the Geoscience domain. The framework is composed of different brokers, each one focusing on a specific functionality: discovery, access, and semantics (i.e. GI-cat, GI-axe, GI-sem). The brokering takes place between a set of heterogeneous publishing services and a set of heterogeneous consumer applications: the brokering target is represented by resources (e.g. coverages, features, or metadata information) required to flow seamlessly from the providers to the consumers. Different international and community standards are now supported by GI-suite, making possible the successful deployment of GI-suite in many international projects and initiatives (such as GEOSS, NSF BCube and several EU funded projects). On the publisher side, more than 40 standards and implementations are supported (e.g. Dublin Core, OAI-PMH, OGC W*S, Geonetwork, THREDDS Data Server, Hyrax Server, etc.); support for each individual standard is provided by means of specific GI-suite components, called accessors. On the consumer application side, more than 15 standards and implementations are supported (e.g. ESRI ArcGIS, Openlayers, OGC W*S, OAI-PMH clients, etc.); support for each individual standard is provided by means of specific profiler components (see the sketch below). GI-suite can be used in different scenarios by different actors:
    - A data provider with a pre-existing data repository can deploy and configure GI-suite to broker it, thus making its data resources available through different protocols to many different users (e.g. for data discovery and/or data access).
    - A data consumer can use GI-suite to discover and/or access resources from a variety of publishing services that already publish data according to well-known standards.
    - A community can deploy and configure GI-suite to build a community (or project-specific) broker: GI-suite can broker a set of community related repositories and
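
    The mediation pattern described above, publishers adapted by accessors on one side and consumers served by profilers on the other, can be sketched as follows. All class and method names are hypothetical illustrations, not the actual GI-suite API.

        from typing import Protocol

        class Accessor(Protocol):
            """Adapts one publisher protocol (OAI-PMH, THREDDS, ...) to an
            internal, harmonized record model."""
            def discover(self, query: str) -> list[dict]: ...

        class Profiler(Protocol):
            """Re-encodes harmonized records in one consumer protocol."""
            def render(self, records: list[dict]) -> str: ...

        class DiscoveryBroker:
            def __init__(self, accessors: list[Accessor]):
                self.accessors = accessors

            def search(self, query: str, profiler: Profiler) -> str:
                # Fan the query out to every brokered source, merge the
                # harmonized records, then re-encode for the consumer.
                records = [r for a in self.accessors for r in a.discover(query)]
                return profiler.render(records)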

  8. A Secure Communication Suite for Underwater Acoustic Sensor Networks

    Directory of Open Access Journals (Sweden)

    Angelica Lo Duca

    2012-11-01

    In this paper we describe a security suite for Underwater Acoustic Sensor Networks comprising both fixed and mobile nodes. The security suite is composed of a secure routing protocol and a set of cryptographic primitives aimed at protecting the confidentiality and the integrity of underwater communication while taking into account the unique characteristics and constraints of the acoustic channel. By means of experiments and simulations based on real data, we show that the suite is suitable for an underwater networking environment as it introduces limited, and sometimes negligible, communication and power consumption overhead.

  9. Development on smart suit for dairy work assistance.

    Science.gov (United States)

    Nara, Hiroyuki; Kusaka, Takashi; Tanaka, Takayuki; Yamagishi, Takayuki; Ogura, Shotaroh

    2013-01-01

    Our purpose in this study is to enable independent living and social involvement for the elderly using KEIROKA (fatigue-reduction) technology, which improves the quality of household and occupational work by removing excessive strain and tiredness. The authors have developed a power-assist suit named the "smart suit" and evaluated its effectiveness for dairy work assistance by measuring the surface EMG of workers with and without the suit.

  10. Inertial motion capture system for biomechanical analysis in pressure suits

    Science.gov (United States)

    Di Capua, Massimiliano

    A non-invasive system has been developed at the University of Maryland Space System Laboratory with the goal of providing a new capability for quantifying the motion of the human inside a space suit. Based on an array of six microprocessors and eighteen microelectromechanical (MEMS) inertial measurement units (IMUs), the Body Pose Measurement System (BPMS) allows the monitoring of the kinematics of the suit occupant in an unobtrusive, self-contained, lightweight and compact fashion, without requiring any external equipment such as that necessary with modern optical motion capture systems. BPMS measures and stores the accelerations, angular rates and magnetic fields acting upon each IMU, which are mounted on the head, torso, and each segment of each limb. In order to convert the raw data into a more useful form, such as a set of body segment angles quantifying pose and motion, a series of geometrical models and a non-linear complementary filter were implemented. The first portion of this work focuses on assessing system performance, which was measured by comparing the BPMS filtered data against rigid body angles measured through an external VICON optical motion capture system. This type of system is the industry standard, and is used here for independent measurement of body pose angles. By comparing the two sets of data, performance metrics such as BPMS system operational conditions, accuracy, and drift were evaluated and correlated against VICON data. After the system and models were verified and their capabilities and limitations assessed, a series of pressure suit evaluations were conducted. Three different pressure suits were used to identify the relationship between usable range of motion and internal suit pressure. In addition to addressing range of motion, a series of exploration tasks were also performed, recorded, and analysed in order to identify different motion patterns and trajectories as suit pressure is increased and overall suit mobility is reduced.
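
    BPMS fuses the IMU signals with a non-linear complementary filter. As a simplified illustration of the underlying idea only (not the BPMS implementation), a first-order scalar complementary filter fusing a gyro rate with an accelerometer-derived tilt angle looks like this:

        import numpy as np

        def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
            """One filter step: trust the integrated gyro at short time scales
            (high-pass) and the accelerometer tilt at long time scales
            (low-pass); alpha sets the crossover."""
            return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

        # Illustrative use with synthetic data
        dt, angle = 0.01, 0.0
        gyro = np.random.normal(0.0, 0.02, 1000)    # rad/s, drifting rate signal
        accel = np.random.normal(0.0, 0.05, 1000)   # rad, noisy tilt estimates
        for w, a in zip(gyro, accel):
            angle = complementary_filter(angle, w, a, dt)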

  11. Test suite for the archiver of a SCADA system

    CERN Document Server

    Voitier, Axel

    2009-01-01

    Topic: The group responsible for providing the main control system applications for all machines at CERN has to validate that every piece of the control systems used will be reliable and fully functional when the LHC and its experiments begin colliding particles. CERN uses PVSS from ETM/Siemens for the SCADA part of its control systems. This software has a component dedicated to archiving, into a centralised Oracle database, the values and commands of tens of thousands of hardware devices. This component, named RDB, has to be tested and validated in terms of functionality and performance. The need is pressing because archiving is a critical part of the control systems: in case of an incident on one of the machines, it would be unacceptable to lose the archived machine context at that moment because of a bug in RDB. Bugs have to be spotted and reported to ETM. Results: The proposed solution is an extensible automatic tester able to evaluate currently around 160 cases of potential bugs. Since the ...

  12. Improvements to the APBS biomolecular solvation software suite: Improvements to the APBS Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    Jurrus, Elizabeth [Pacific Northwest National Laboratory, Richland Washington; Engel, Dave [Pacific Northwest National Laboratory, Richland Washington; Star, Keith [Pacific Northwest National Laboratory, Richland Washington; Monson, Kyle [Pacific Northwest National Laboratory, Richland Washington; Brandi, Juan [Pacific Northwest National Laboratory, Richland Washington; Felberg, Lisa E. [University of California, Berkeley California; Brookes, David H. [University of California, Berkeley California; Wilson, Leighton [University of Michigan, Ann Arbor Michigan; Chen, Jiahui [Southern Methodist University, Dallas Texas; Liles, Karina [Pacific Northwest National Laboratory, Richland Washington; Chun, Minju [Pacific Northwest National Laboratory, Richland Washington; Li, Peter [Pacific Northwest National Laboratory, Richland Washington; Gohara, David W. [St. Louis University, St. Louis Missouri; Dolinsky, Todd [FoodLogiQ, Durham North Carolina; Konecny, Robert [University of California San Diego, San Diego California; Koes, David R. [University of Pittsburgh, Pittsburgh Pennsylvania; Nielsen, Jens Erik [Protein Engineering, Novozymes A/S, Copenhagen Denmark; Head-Gordon, Teresa [University of California, Berkeley California; Geng, Weihua [Southern Methodist University, Dallas Texas; Krasny, Robert [University of Michigan, Ann Arbor Michigan; Wei, Guo-Wei [Michigan State University, East Lansing Michigan; Holst, Michael J. [University of California San Diego, San Diego California; McCammon, J. Andrew [University of California San Diego, San Diego California; Baker, Nathan A. [Pacific Northwest National Laboratory, Richland Washington; Brown University, Providence Rhode Island

    2017-10-24

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph-theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.

  13. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  14. Segane saksa-eesti kirjakeel ja eesti lauluraamat / Gustav Suits

    Index Scriptorium Estoniae

    Suits, Gustav, 1883-1956

    1999-01-01

    Previously published in: Suits, Gustav. Eesti kirjanduslugu I. Lund : Eesti Kirjanike Kooperatiiv, 1953. On the hymnal Neu Ehstnisches Gesangbuch (1656), published as part of Heinrich Stahl's handbook Hand-, Hausz- und Kirchenbuch (1654-1656).

  15. Touring the Tomato: A Suite of Chemistry Laboratory Experiments

    Science.gov (United States)

    Sarkar, Sayantani; Chatterjee, Subhasish; Medina, Nancy; Stark, Ruth E.

    2013-01-01

    An eight-session interdisciplinary laboratory curriculum has been designed using a suite of analytical chemistry techniques to study biomaterials derived from an inexpensive source such as the tomato fruit. A logical

  16. Arensky. Silhouettes (Suite N 2), Op. 23 / Jonathan Swain

    Index Scriptorium Estoniae

    Swain, Jonathan

    1991-01-01

    Uuest heliplaadist "Arensky. Silhouettes (Suite N 2), Op. 23. Scrjabin. Symphony N 3 in C minor, Op. 43 "Le divin poeme". Danish National Radio Symphony Orchestra. Neeme Järvi. Chandos cassette ABTD 1509; CD CHAN 8898 (66 minutes)

  17. 33 CFR 144.20-5 - Exposure suits.

    Science.gov (United States)

    2010-07-01

    ... light that is approved under 46 CFR 161.012. Each light must be securely attached to the front shoulder... lanyard coiled and stopped off. (f) No stowage container for exposure suits may be capable of being locked...

  18. EVA Physiology and Medical Considerations Working in the Suit

    Science.gov (United States)

    Parazynski, Scott

    2012-01-01

    This "EVA Physiology and Medical Considerations Working in the Suit" presentation covers several topics related to the medical implications and physiological effects of suited operations in space from the perspective of a physician with considerable first-hand Extravehicular Activity (EVA) experience. Key themes include EVA physiology working in a pressure suit in the vacuum of space, basic EVA life support and work support, Thermal Protection System (TPS) inspections and repairs, and discussions of the physical challenges of an EVA. Parazynski covers the common injuries and significant risks during EVAs, as well as physical training required to prepare for EVAs. He also shares overall suit physiological and medical knowledge with the next generation of Extravehicular Mobility Unit (EMU) system designers.

  19. Virtual Suit Fit Assessment Using Body Shape Model

    Data.gov (United States)

    National Aeronautics and Space Administration — Shoulder injury is one of the most serious risks for crewmembers in long-duration spaceflight. While suboptimal suit fit and contact pressures between the shoulder...

  20. Tchaikovsky, P.: Orchestral Suite no. 3 op. 55 / Terry Williams

    Index Scriptorium Estoniae

    Williams, Terry

    1996-01-01

    Uuest heliplaadist "Tchaikovsky, P.: Orchestral Suite no. 3 op. 55. Francesca di Rimini op. 32. Detroit Symphony Orchestra, Neeme Järvi". Chandos CHAN 9 419, distribution Media 7 (CD: 160F). TT: 1h 09'20"

  1. Prokofiev: War and Peace - Symphonic Suite (arr. Palmer) / Ivan March

    Index Scriptorium Estoniae

    March, Ivan

    1993-01-01

    Uuest heliplaadist "Prokofiev: War and Peace - Symphonic Suite (arr. Palmer), Summer Night, Op. 123. Russian Overture, Op. 72. Philharmonia Orchestra / Neeme Järvi. Chandos ABTD 1598 CHAN9096 (64 minutes:DDD) Igor - Polovtsian Dances

  2. Prokofiev: Romeo and Juliet - Suite N1 / Ivan March

    Index Scriptorium Estoniae

    March, Ivan

    1990-01-01

    Uuest heliplaadist "Prokofiev: Romeo and Juliet - Suite N1, Op.64b, N2, Op.64c. Philharmonia Orchestra, Barry Wordsworth" Collins Classics cassette 1116-4. CD. Võrreldud Neeme Järvi plaadistustega 1116-2

  3. Nonventing Thermal and Humidity Control for EVA Suits, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Future manned space exploration missions will require space suits with capabilities beyond the current state of the art. Portable Life Support Systems for these...

  4. U.S. Climate Normals Product Suite (1981-2010)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Climate Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations...

  5. Assuring Condition and Inventory Accountability of Chemical Protective Suits

    National Research Council Canada - National Science Library

    2000-01-01

    .... As part of the Defense Logistics Agency's efforts to consolidate depot operations and improve inventory accuracy, chemical protective suits were transferred to the Defense Depot, Albany, Georgia, during FY 1991.

  6. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    ... to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction, classification, time-series forecasting, and modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allows insight ... is critically necessary for the inclusion of the sampling errors incurred in all 'future' situations in which the validated model must perform. Logically, therefore, all one-data-set re-sampling approaches for validation, especially cross-validation and leverage-corrected validation, should be terminated...

  7. Corrections of the NIST Statistical Test Suite for Randomness

    OpenAIRE

    Kim, Song-Ju; Umeno, Ken; Hasegawa, Akio

    2004-01-01

    It is well known that the NIST statistical test suite was used for the evaluation of AES candidate algorithms. We have found that the test settings of the Discrete Fourier Transform test and the Lempel-Ziv test in this suite are wrong. We give four corrections of mistakes in the test settings. This suggests that the test results should be re-evaluated.
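
    For reference, the sketch below implements the spectral (DFT) test with the corrected threshold and variance along the lines the authors propose; the constants follow the commonly cited corrected form and should be checked against the paper before any serious use.

        import numpy as np
        from math import erfc, log, sqrt

        def dft_test(bits):
            """Spectral (DFT) randomness test; returns the P-value."""
            x = 2 * np.asarray(bits, float) - 1        # map {0,1} -> {-1,+1}
            n = x.size
            mags = np.abs(np.fft.fft(x))[: n // 2]     # one-sided spectrum
            T = sqrt(n * log(1.0 / 0.05))              # corrected 95% peak threshold
            N0 = 0.95 * n / 2                          # expected count below T
            N1 = np.count_nonzero(mags < T)            # observed count below T
            d = (N1 - N0) / sqrt(n * 0.95 * 0.05 / 4)  # corrected variance
            return erfc(abs(d) / sqrt(2))

        print(dft_test(np.random.randint(0, 2, 4096)))  # near-uniform P-values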

  8. Cosmonaut Sergei Krikalev receives assistance from suit technician

    Science.gov (United States)

    1994-01-01

    Sergei Krikalev, alternative mission specialist for STS-63, gets help from Dawn Mays, a Boeing suit technician. The cosmonaut was about to participate in a training session at JSC's Weightless Environment Training Facility (WETF). Wearing the training version of the extravehicular mobility unit (EMU) space suit, weighted to allow neutral buoyancy in the 25 feet deep WETF pool, Krikalev minutes later was underwater simulating a contingency spacewalk, or extravehicular activity (EVA).

  9. Results and Analysis from Space Suit Joint Torque Testing

    Science.gov (United States)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  10. STS-74 M.S. Jerry L. Ross suits up

    Science.gov (United States)

    1995-01-01

    Spaceflight veteran Jerry L. Ross, Mission Specialist 2 on Shuttle Mission STS-74, is assisted by a suit technician as he finishes getting into his launch/entry suit in the Operations and Checkout Building. Ross and four fellow astronauts will depart shortly for Launch Pad 39A, where the Space Shuttle Atlantis awaits a second liftoff attempt during a seven-minute window scheduled to open at approximately 7:30 a.m. EST, Nov. 12.

  11. An Ergonomic Evaluation of the Extravehicular Mobility Unit (EMU) Space Suit Hard Upper Torso (HUT) Size Effect on Metabolic, Mobility, and Strength Performance

    Science.gov (United States)

    Reid, Christopher; Harvill, Lauren; England, Scott; Young, Karen; Norcross, Jason; Rajulu, Sudhakar

    2014-01-01

    The objective of this project was to assess the performance differences between a nominally sized Extravehicular Mobility Unit (EMU) space suit and a nominal +1 (plus) sized EMU. Method: This study evaluated suit size conditions by using metabolic cost, arm mobility, and arm strength as performance metrics. Results: Differences between the suit sizes were found only in shoulder extension strength, which was 15.8% greater for the plus size. Discussion: While this study was able to identify motions and activities that were considered to be practically or statistically different, this does not mean that use of a plus-sized suit should be prohibited. Further testing would be required, either pertaining to a particular mission-critical task or better simulating the microgravity environment that the EMU suit was designed to work in.

  12. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of Abort Triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of Abort Triggers.

  13. Development of Advanced Suite of Deterministic Codes for VHTR Physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, J. Y.; Lee, K. H. (and others)

    2007-07-15

    An advanced suite of deterministic codes for VHTR physics analysis has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. The suite includes the conventional 2-step procedure, in which few-group constants are generated by a transport lattice calculation and the reactor physics analysis is performed by a 3-dimensional diffusion calculation, as well as a whole-core transport code that can model local heterogeneities directly at the core level. Particular modeling issues in physics analysis of gas-cooled VHTRs were resolved, including the double heterogeneity of the coated fuel particles, neutron streaming in the coolant channels, the strong core-reflector interaction, and large spectrum shifts due to changes in the surrounding environment, temperature, and burnup. The geometry-handling capability of the DeCART code was extended to deal with the hexagonal fuel elements of the VHTR core. The developed code suites were verified and validated by comparing the computational results with those of Monte Carlo calculations for the benchmark problems.
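
    As a minimal illustration of the second step of the 2-step procedure, the sketch below solves a one-group, one-dimensional slab diffusion eigenvalue problem by power iteration. In practice the few-group constants come from the lattice transport calculation; the numbers here are purely illustrative.

        import numpy as np

        def keff_slab(D=1.0, siga=0.08, nusigf=0.10, L=100.0, N=200):
            """k-eigenvalue of -D*phi'' + siga*phi = (1/k)*nusigf*phi on a slab
            of width L with zero-flux boundaries, by power iteration."""
            h = L / N
            A = np.diag(np.full(N, 2 * D / h**2 + siga))
            A += np.diag(np.full(N - 1, -D / h**2), 1)
            A += np.diag(np.full(N - 1, -D / h**2), -1)
            phi = np.ones(N) / N
            k = 1.0
            for _ in range(200):
                psi = np.linalg.solve(A, nusigf * phi)  # apply A^-1 * F
                k = psi.sum() / phi.sum()               # eigenvalue estimate
                phi = psi / psi.sum()                   # renormalize flux shape
            return k

        # One-group check: k ~ nusigf / (siga + D*(pi/L)^2) ~ 1.235
        print(keff_slab())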

  14. Functional Mobility Testing: A Novel Method to Create Suit Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.

    2008-01-01

    This study was performed to aid in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember through the use of an innovative methodology utilizing functional mobility. A novel method was utilized involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected based on relevance and criticality from a larger list of tasks that may be carried out by the crew. Kinematic data were processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.

  15. Reliability performance testing of totally encapsulating chemical protective suits

    International Nuclear Information System (INIS)

    Johnson, J.S.; Swearengen, P.M.

    1991-01-01

    The need to assure a high degree of reliability for totally encapsulating chemical protective (TECP) suits has been recognized by Lawrence Livermore National Laboratory's (LLNL) Hazards Control Department for some time. The following four tests were proposed as necessary to provide a complete evaluation of TECP suit performance: 1. Quantitative leak test (ASTM draft), 2. Worst-case chemical exposure test (conceptual), 3. Pressure leak-rate test (complete, ASTM F1057-87), and 4. Chemical leak-rate test (ASTM draft). This paper reports on these tests, which should be applied to measuring TECP suit performance in two stages: design qualification tests and field-use tests. Tests 1, 2, and 3 are used as design qualification tests, and tests 3 and 4 are used as field-use tests.

  16. NetSuite OneWorld Implementation 2011 R2

    CERN Document Server

    Foydel, Thomas

    2011-01-01

    This book is a focused, step-by step tutorial that shows you how to successfully implement NetSuite OneWorld into your organization. It is written in an easy-to-read style, with a strong emphasis on real-world, practical examples with step-by-step explanations. The book focuses on NetSuite OneWorld 2011 R1. If you are an application administrator, business analyst, project team member or business process owner who wants to implement NetSuite OneWorld into your organization, then this book is for you. This book might also be useful if you are a business manager considering a new system for your

  17. Ultraviolet Testing of Space Suit Materials for Mars

    Science.gov (United States)

    Larson, Kristine; Fries, Marc

    2017-01-01

    Human missions to Mars may require radical changes in the approach to extra-vehicular activity (EVA) suit design. A major challenge is building a suit robust enough to complete multiple EVAs under intense ultraviolet (UV) light exposure without losing mechanical strength or compromising the suit's mobility. To study how the materials degrade on Mars in situ, the Jet Propulsion Laboratory (JPL) invited the Advanced Space Suit team at NASA's Johnson Space Center (JSC) to place space suit materials on the calibration target of the Scanning Habitable Environments with Raman & Luminescence for Organics and Chemicals (SHERLOC) instrument on the Mars 2020 rover. In order to select materials for the rover and understand the effects of Mars-equivalent UV exposure, JSC conducted ground testing on both current and new space suit materials exposed to 2500 hours of Mars-mission-equivalent UV. To complete this testing, JSC partnered with NASA's Marshall Space Flight Center to utilize their UV vacuum chambers. Materials tested were Orthofabric, polycarbonate, Teflon, Dacron, Vectran, Spectra, bladder, nGimat-coated Teflon, and nGimat-coated Orthofabric. All samples were measured for mass, tensile strength, and chemical composition before and after irradiation. Mass loss was insignificant (less than 0.5%) among the materials. Most materials lost tensile strength after irradiation and became more brittle, with a loss of elongation. Changes in chemical composition were seen in all irradiated materials through spectral analysis. Results from this testing helped select the materials that will fly on the Mars 2020 rover. In addition, JSC can use these data to correlate the chemical changes after irradiation, which is what the rover will send back while on Mars, with mechanical changes such as tensile strength.

  18. Validation of Flight Critical Control Systems

    Science.gov (United States)

    1991-12-01


  19. Validation of a vortex ring wake model suited for aeroelastic simulations of floating wind turbines

    DEFF Research Database (Denmark)

    Vaal, J.B., de; Hansen, Martin Otto Laver; Moan, T.

    2014-01-01

    In order to evaluate aerodynamic loads on floating offshore wind turbines, advanced dynamic analysis tools are required. As a unified model that can represent both dynamic inflow and skewed inflow effects in its basic formulation, a wake model based on a vortex ring formulation is discussed. Such a model presents a good intermediate solution between computationally efficient but simple momentum balance methods and computationally expensive but complete computational fluid dynamics models. The model introduced is shown to be capable of modelling typical steady and unsteady test cases with reasonable accuracy.

  20. A suite of diagnostics to validate and optimize the prototype ITER neutral beam injector

    Science.gov (United States)

    Pasqualotto, R.; Agostini, M.; Barbisan, M.; Brombin, M.; Cavazzana, R.; Croci, G.; Dalla Palma, M.; Delogu, R. S.; De Muri, M.; Muraro, A.; Peruzzo, S.; Pimazzoni, A.; Pomaro, N.; Rebai, M.; Rizzolo, A.; Sartori, E.; Serianni, G.; Spagnolo, S.; Spolaore, M.; Tardocchi, M.; Zaniol, B.; Zaupa, M.

    2017-10-01

    The ITER project requires additional heating provided by two neutral beam injectors using 40 A negative deuterium ions accelerated at 1 MV. As the beam requirements have never been experimentally met, a test facility is under construction at Consorzio RFX, which hosts two experiments: SPIDER, full-size 100 kV ion source prototype, and MITICA, 1 MeV full-size ITER injector prototype. Since diagnostics in ITER injectors will be mainly limited to thermocouples, due to neutron and gamma radiation and to limited access, it is crucial to thoroughly investigate and characterize in more accessible experiments the key parameters of source plasma and beam, using several complementary diagnostics assisted by modelling. In SPIDER and MITICA the ion source parameters will be measured by optical emission spectroscopy, electrostatic probes, cavity ring down spectroscopy for H^- density and laser absorption spectroscopy for cesium density. Measurements over multiple lines-of-sight will provide the spatial distribution of the parameters over the source extension. The beam profile uniformity and its divergence are studied with beam emission spectroscopy, complemented by visible tomography and neutron imaging, which are novel techniques, while an instrumented calorimeter based on custom unidirectional carbon fiber composite tiles observed by infrared cameras will measure the beam footprint on short pulses with the highest spatial resolution. All heated components will be monitored with thermocouples: as these will likely be the only measurements available in ITER injectors, their capabilities will be investigated by comparison with other techniques. SPIDER and MITICA diagnostics are described in the present paper with a focus on their rationale, key solutions and most original and effective implementations.

  1. Validation of a vortex ring wake model suited for aeroelastic simulations of floating wind turbines

    International Nuclear Information System (INIS)

    Vaal, J B de; Moan, T; Hansen, M O L

    2014-01-01

    In order to evaluate aerodynamic loads on floating offshore wind turbines, advanced dynamic analysis tools are required. As a unified model that can represent both dynamic inflow and skewed inflow effects in its basic formulation, a wake model based on a vortex ring formulation is discussed. Such a model presents a good intermediate solution between computationally efficient but simple momentum balance methods and computationally expensive but complete computational fluid dynamics models. The model introduced is shown to be capable of modelling typical steady and unsteady test cases with reasonable accuracy.
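
    As a small illustration of the building block such a wake model superposes, the Biot-Savart result for the velocity induced on the axis of a single circular vortex ring is sketched below. Off-axis evaluation requires elliptic integrals and is omitted, and the model in the paper of course adds shedding, convection, and coupling to the rotor loads.

        def ring_axial_velocity(gamma, R, z):
            """Axial velocity induced on the axis of a circular vortex ring of
            circulation gamma and radius R, at axial distance z from its
            plane (Biot-Savart law)."""
            return gamma * R**2 / (2.0 * (R**2 + z**2) ** 1.5)

        # At the ring centre (z = 0) this reduces to gamma / (2 * R)
        print(ring_axial_velocity(gamma=1.0, R=2.0, z=0.0))  # 0.25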

  2. STS-82 Pilot Scott J. 'Doc' Horowitz Suit Up

    Science.gov (United States)

    1997-01-01

    STS-82 Pilot Scott J. 'Doc' Horowitz puts on a glove of his launch and entry suit with assistance from a suit technician in the Operations and Checkout Building. This is Horowitz's second space flight. He and the six other crew members will depart shortly for Launch Pad 39A, where the Space Shuttle Discovery awaits liftoff on a 10-day mission to service the orbiting Hubble Space Telescope (HST). This will be the second HST servicing mission. Four back-to-back spacewalks are planned.

  3. STS-87 Mission Specialist Winston E. Scott suits up

    Science.gov (United States)

    1997-01-01

    STS-87 Mission Specialist Winston Scott dons his launch and entry suit with the assistance of a suit technician in the Operations and Checkout Building. This is Scott's second space flight. He and the five other crew members will depart shortly for Launch Pad 39B, where the Space Shuttle Columbia awaits liftoff on a 16-day mission to perform microgravity and solar research. Scott is scheduled to perform an extravehicular activity spacewalk with Mission Specialist Takao Doi, Ph.D., of the National Space Development Agency of Japan, during STS-87. He also performed a spacewalk on STS-72.

  4. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Science.gov (United States)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as those defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through its specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context; the GI brokering suite was therefore conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for well-used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include: DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; and CKAN, a data management system for data distribution, particularly used by

  5. The french criticality handbook

    International Nuclear Information System (INIS)

    Maubert, L.; Puit, J.C.

    1987-01-01

    The French criticality handbook, published in 1978 by the ''Commissariat a l'Energie Atomique'', is presented along with the author's main aims and the principal choices made regarding fissile media, reflection conditions, and dilution curves. The validation of the critical values is presented as one of the most important aspects of this handbook, which is mainly intended, in the author's view, for specialists well versed in the field of criticality. The additions introduced since 1978 and those foreseen in the near future are also detailed. (author)

  6. Thinking Critically about Critical Thinking

    Science.gov (United States)

    Mulnix, Jennifer Wilson

    2012-01-01

    As a philosophy professor, one of my central goals is to teach students to think critically. However, one difficulty with determining whether critical thinking can be taught, or even measured, is that there is widespread disagreement over what critical thinking actually is. Here, I reflect on several conceptions of critical thinking, subjecting…

  7. PLEASE: The Python Low-energy Electron Analysis SuitE – Enabling Rapid Analysis of LEEM and LEED Data

    Directory of Open Access Journals (Sweden)

    Maxwell Grady

    2018-02-01

    PLEASE, the Python Low-energy Electron Analysis SuitE, provides an open-source, cross-platform graphical user interface (GUI) for rapid analysis and visualization of low energy electron microscopy (LEEM) data sets. LEEM and the associated technique, selected-area micro-spot low energy electron diffraction (μ-LEED), are powerful tools for analysis of the surface structure of many novel materials. Specifically, these tools are uniquely suited for the characterization of two-dimensional materials. PLEASE offers a user-friendly point-and-click method for extracting intensity-voltage curves from LEEM and LEED data sets. Analysis of these curves provides insight into the atomic structure of the target material surface with unparalleled resolution.
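
    The point-and-click curve extraction PLEASE provides amounts to sampling a stack of LEEM images at one pixel (or a small window around it) across electron energy. The sketch below shows that underlying operation on a synthetic stack; all names are illustrative and this is not the PLEASE API.

        import numpy as np

        def iv_curve(stack, energies, row, col, window=2):
            """Extract an intensity-voltage (I-V) curve from a LEEM stack of
            shape (n_energies, height, width), averaging a small window
            around the chosen pixel to reduce noise."""
            patch = stack[:, row - window : row + window + 1,
                             col - window : col + window + 1]
            return energies, patch.mean(axis=(1, 2))

        # Synthetic stack: 50 energies, 128 x 128 images
        stack = np.random.rand(50, 128, 128)
        energies = np.linspace(0.0, 10.0, 50)
        ev, iv = iv_curve(stack, energies, row=64, col=64)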

  8. The Los Alamos suite of relativistic atomic physics codes

    International Nuclear Information System (INIS)

    Fontes, C J; Zhang, H L; Jr, J Abdallah; Clark, R E H; Kilcrease, D P; Colgan, J; Cunningham, R T; Hakel, P; Magee, N H; Sherrill, M E

    2015-01-01

    The Los Alamos suite of relativistic atomic physics codes is a robust, mature platform that has been used to model highly charged ions in a variety of ways. The suite includes capabilities for calculating data related to fundamental atomic structure, as well as the processes of photoexcitation, electron-impact excitation and ionization, photoionization and autoionization within a consistent framework. These data can be of a basic nature, such as cross sections and collision strengths, which are useful in making predictions that can be compared with experiments to test fundamental theories of highly charged ions, such as quantum electrodynamics. The suite can also be used to generate detailed models of energy levels and rate coefficients, and to apply them in the collisional-radiative modeling of plasmas over a wide range of conditions. Such modeling is useful, for example, in the interpretation of spectra generated by a variety of plasmas. In this work, we provide a brief overview of the capabilities within the Los Alamos relativistic suite along with some examples of its application to the modeling of highly charged ions. (paper)
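
    As a minimal illustration of the collisional-radiative balance that such modeling generalizes to many levels and processes, the steady-state population ratio of a two-level ion (electron-impact excitation up; spontaneous emission plus collisional de-excitation down) is sketched below with illustrative rate values.

        def two_level_ratio(ne, A_ul, C_lu, C_ul):
            """Steady-state n_u/n_l for a two-level ion:
            n_u / n_l = ne*C_lu / (A_ul + ne*C_ul), with ne in cm^-3,
            rate coefficients C in cm^3/s, and A_ul in 1/s."""
            return ne * C_lu / (A_ul + ne * C_ul)

        # Coronal (low-density) limit: ratio -> ne*C_lu/A_ul, linear in ne
        print(two_level_ratio(ne=1e13, A_ul=1e8, C_lu=1e-9, C_ul=1e-10))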

  9. Measuring Test Case Similarity to Support Test Suite Understanding

    NARCIS (Netherlands)

    Greiler, M.S.; Van Deursen, A.; Zaidman, A.E.

    2012-01-01

    Preprint of paper published in: TOOLS 2012 - Proceedings of the 50th International Conference, Prague, Czech Republic, May 29-31, 2012; doi:10.1007/978-3-642-30561-0_8. In order to support test suite understanding, we investigate whether we can automatically derive relations between test cases. In

  10. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process within our industrial partner: Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  11. The Zoot Suit Riots: Exploring Social Issues in American History

    Science.gov (United States)

    Chiodo, John J.

    2013-01-01

    The Zoot Suit Riots provide students with a case study of social unrest in American history. The influx of Latinos into the Los Angeles area prior to World War II created high levels of social unrest between Mexican Americans, military servicemen, and local residents. With large numbers of soldiers stationed in the area during the Second World…

  12. 28 CFR 36.503 - Suit by the Attorney General.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Suit by the Attorney General. 36.503... discretion, the Attorney General may commence a civil action in any appropriate United States district court if the Attorney General has reasonable cause to believe that— (a) Any person or group of persons is...

  13. DYNA3D/ParaDyn Regression Test Suite Inventory

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jerry I. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-09-01

    The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of “feature” has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except problems involving features only available in serial mode. Many are strictly regression tests acting as a check that the codes continue to produce adequately repeatable results as development unfolds; compilers change and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcomed to submit documented problems for inclusion in the test suite, especially if they are heavily exercising, and dependent upon, features that are currently underrepresented.

  14. Safety Tips: Avoiding Negligence Suits in Chemistry Teaching.

    Science.gov (United States)

    Gerlovich, Jack A.

    1983-01-01

    Discusses various aspects related to negligence on the part of chemistry teachers. Areas addressed include negligence in tort law, avoiding negligence suits, proper instructions, proper supervision, equipment maintenance, and other considerations such as sovereign immunity, and contributory versus comparative negligence. (JN)

  15. Automated integration of lidar into the LANDFIRE product suite

    Science.gov (United States)

    Birgit Peterson; Kurtis J. Nelson; Carl Seielstad; Jason Stoker; W. Matt Jolly; Russell Parsons

    2015-01-01

    Accurate information about three-dimensional canopy structure and wildland fuel across the landscape is necessary for fire behaviour modelling system predictions. Remotely sensed data are invaluable for assessing these canopy characteristics over large areas; lidar data, in particular, are uniquely suited for quantifying three-dimensional canopy structure. Although...

  16. ASIM - an Instrument Suite for the International Space Station

    DEFF Research Database (Denmark)

    Neubert, Torsten; Crosby, B.; Huang, T.-Y.

    2009-01-01

    ASIM (Atmosphere-Space Interactions Monitor) is an instrument suite for studies of severe thunderstorms and their effects on the atmosphere and ionosphere. The instruments are designed to observe transient luminous events (TLEs)—sprites, blue jets and elves—and terrestrial gamma-ray flashes (TGFs...

  17. Rimsky-Korsakov: Symphony N2 (Symphonic Suite) / Warrack, John

    Index Scriptorium Estoniae

    Warrack, John

    1990-01-01

    Uuest heliplaadist "Rimsky-Korsakov: Symphony N2 (Symphonic Suite), Op. 9, "Antar" Russian Easter Festival Overture, Op.36. Philharmonia Orchestra, Evgeni Svetlanov. Hyperion KA 66399. CDA 66399. Teise sümfoonia esitust võrreldud Neeme Järvi plaadistusega

  18. Tailoring Vantage 5 (fuel) to suit each operator's need

    Energy Technology Data Exchange (ETDEWEB)

    Chapin, D L; Secker, J R [Westinghouse Electric Corp., Philadelphia, PA (USA)

    1990-03-01

    By the end of 1989, Westinghouse Vantage 5 fuel had been reloaded into 36 nuclear power plants. The fuel offers a number of features operators can choose from to suit their own particular needs. Experience so far has shown the fuel to have performed well, with coolant activity levels remaining low. (author).

  19. Nuclear criticality predictability

    International Nuclear Information System (INIS)

    Briggs, J.B.

    1999-01-01

    As a result of these extensive efforts, a large portion of the tedious and redundant research and processing of critical experiment data has been eliminated. The necessary step in criticality safety analyses of validating computer codes against benchmark critical data is greatly streamlined, and valuable criticality safety experimental data are preserved. Criticality safety personnel in 31 countries are now using the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Much has been accomplished by the work of the ICSBEP. However, evaluation and documentation represent only one element of a successful Nuclear Criticality Safety Predictability Program, and this element exists as a separate entity only because the work was not completed in conjunction with the experimentation process. I believe, however, that the work of the ICSBEP has also served to unify the other elements of nuclear criticality predictability. All elements are interrelated, but for a time communication between these elements was not adequate. The ICSBEP has highlighted gaps in data, retrieved lost data, helped to identify errors in cross-section processing codes, and helped bring the international criticality safety community together in a common cause as true friends and colleagues. It has been a privilege to associate with those who work so diligently to make the project a success. (J.P.N.)

  20. Critical Care

    Science.gov (United States)

    Critical care helps people with life-threatening injuries and illnesses. It might treat problems such as complications from surgery, ... attention by a team of specially-trained health care providers. Critical care usually takes place in an ...

  1. Constructed identities A Chronotopic reading of The Great Gatsby, The Man in the Gray Flannel Suit, and Mad Men

    OpenAIRE

    Yndestad, Ingrid Rivedal

    2016-01-01

    This thesis aims to explore how the past figures in F. Scott Fitzgerald's The Great Gatsby (1925), Sloan Wilson's The Man in the Gray Flannel Suit (1955), and in the AMC series Mad Men (2007-2015), written by Matthew Weiner. Focusing on the main protagonists in these works, namely Jay Gatsby, Thomas Rath, and Donald Draper, this thesis examines how the past makes itself valid in these characters' present lives and how it arguably affects their future lives. In do...

  2. Asteroid Redirect Crewed Mission Space Suit and EVA System Architecture Trade Study

    Science.gov (United States)

    Blanco, Raul A.; Bowie, Jonathan T.; Watson, Richard D.; Sipila, Stephanie A.

    2014-01-01

    the Advanced Crew Escape Suit (ACES), and the Exploration Z-suit. For this mission, the pressure garment that was selected is the Modified ACES (MACES) with EVA enhancements. Life support options that were considered included short closed-loop umbilicals, long open-loop umbilicals, the currently in-use ISS EMU Portable Life Support System (PLSS), and the currently in development Exploration PLSS. For this mission, the life support option that was selected is the Exploration PLSS. The greatest risk in the proposed architecture is viewed to be the comfort and mobility of the baseline MACES and the delicate balance between adding more mobility features while not compromising landing safety. Feasibility testing was accomplished in low fidelity analogs and in the JSC Neutral Buoyancy Laboratory (NBL) to validate the concept before a final recommendation on the architecture was made. The proposed architecture was found to meet the mission constraints, but much more work is required to determine the details of the required suit upgrades, the integration with the PLSS, and the rest of the tools and equipment required to accomplish the mission. This work and further definition of the remaining kits will be conducted in government fiscal year 14.

  3. MASH Suite: a user-friendly and versatile software interface for high-resolution mass spectrometry data interpretation and visualization.

    Science.gov (United States)

    Guner, Huseyin; Close, Patrick L; Cai, Wenxuan; Zhang, Han; Peng, Ying; Gregorich, Zachery R; Ge, Ying

    2014-03-01

    The rapid advancements in mass spectrometry (MS) instrumentation, particularly in Fourier transform (FT) MS, have made the acquisition of high-resolution and high-accuracy mass measurements routine. However, the software tools for the interpretation of high-resolution MS data are underdeveloped. Although several algorithms for the automatic processing of high-resolution MS data are available, there is still an urgent need for a user-friendly interface with functions that allow users to visualize and validate the computational output. Therefore, we have developed MASH Suite, a user-friendly and versatile software interface for processing high-resolution MS data. MASH Suite contains a wide range of features that allow users to easily navigate through data analysis, visualize complex high-resolution MS data, and manually validate automatically processed results. Furthermore, it provides easy, fast, and reliable interpretation of top-down, middle-down, and bottom-up MS data. MASH Suite is convenient, easily operated, and freely available. It can greatly facilitate the comprehensive interpretation and validation of high-resolution MS data with high accuracy and reliability.

  4. How Critical Is Critical Thinking?

    Science.gov (United States)

    Shaw, Ryan D.

    2014-01-01

    Recent educational discourse is full of references to the value of critical thinking as a 21st-century skill. In music education, critical thinking has been discussed in relation to problem solving and music listening, and some researchers suggest that training in critical thinking can improve students' responses to music. But what exactly is…

  5. Prokofieff: Krieg und Frieden (Sinfonische Suite), Die Verlobung im Kloster (Sommernacht-Suite), Russische Overtüre. Philharmonia Orchestra, Neeme Järvi / G. W.

    Index Scriptorium Estoniae

    G. W.

    1993-01-01

    Uuest heliplaadist "Prokofieff: Krieg und Frieden (Sinfonische Suite), Die Verlobung im Kloster (Sommernacht-Suite), Russische Overtüre. Philharmonia Orchestra, Neeme Järvi. (AD: 1991). Chandos/Koch CD 9096

  6. What is validation

    International Nuclear Information System (INIS)

    Clark, H.K.

    1985-01-01

    Criteria for establishing the validity of a computational method to be used in assessing nuclear criticality safety, as set forth in ''American Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors,'' ANSI/ANS-8.1-1983, are examined and discussed. Application of the criteria is illustrated by describing the procedures followed in deriving subcritical limits that have been incorporated in the Standard

  7. STS-95 Mission Specialist Pedro Duque suits up for launch

    Science.gov (United States)

    1998-01-01

    STS-95 Mission Specialist Pedro Duque of Spain, with the European Space Agency, is helped with his flight suit by suit tech Tommy McDonald in the Operations and Checkout Building. The final fitting takes place prior to the crew walkout and transport to Launch Pad 39B. Targeted for launch at 2 p.m. EST on Oct. 29, the mission is expected to last 8 days, 21 hours and 49 minutes, and return to KSC at 11:49 a.m. EST on Nov. 7. The STS-95 mission includes research payloads such as the Spartan solar-observing deployable spacecraft, the Hubble Space Telescope Orbital Systems Test Platform, the International Extreme Ultraviolet Hitchhiker, as well as the SPACEHAB single module with experiments on space flight and the aging process.

  8. Failure to exercise due diligence costs plaintiff her suit.

    Science.gov (United States)

    1997-11-28

    The Mississippi State Supreme Court affirmed a lower court ruling dismissing a last-minute suit filed by a plaintiff against United Blood Services of Mississippi and the American Association of Blood Banks. A woman known as D. Doe was a recipient of a tainted transfusion. She contracted HIV in 1983 and died of AIDS-related causes in 1991. Her daughter, the plaintiff, filed a contaminated blood transfusion lawsuit just five days before the statute of limitations ran out but failed to ascertain the correct identity of the blood bank. She named two blood banks in her suit because she was unable to determine the source of the blood. The Supreme Court ruled that waiting until five days before the statute elapsed indicated that the plaintiff did not exercise reasonable diligence within a specific time frame.

  9. STS-90 Pilot Scott Altman is suited up for launch

    Science.gov (United States)

    1998-01-01

    STS-90 Pilot Scott Altman is assisted during suit-up activities by Lockheed Suit Technician Valerie McNeil from Johnson Space Center in KSC's Operations and Checkout Building. Altman and the rest of the STS-90 crew will shortly depart for Launch Pad 39B, where the Space Shuttle Columbia awaits a second liftoff attempt at 2:19 p.m. EDT. His first trip into space, Altman is participating in a life sciences research flight that will focus on the most complex and least understood part of the human body - - the nervous system. Neurolab will examine the effects of spaceflight on the brain, spinal cord, peripheral nerves and sensory organs in the human body.

  10. Mission Specialist Scott Parazynski checks his flight suit

    Science.gov (United States)

    1998-01-01

    STS-95 Mission Specialist Scott E. Parazynski gets help with his flight suit in the Operations and Checkout Building from suit technician George Brittingham. The final fitting takes place prior to the crew walkout and transport to Launch Pad 39B. Targeted for launch at 2 p.m. EST on Oct. 29, the mission is expected to last 8 days, 21 hours and 49 minutes, and return to KSC at 11:49 a.m. EST on Nov. 7. The STS-95 mission includes research payloads such as the Spartan solar-observing deployable spacecraft, the Hubble Space Telescope Orbital Systems Test Platform, the International Extreme Ultraviolet Hitchhiker, as well as the SPACEHAB single module with experiments on space flight and the aging process.

  11. STS-76 Payload Cmdr Ronald Sega suits up

    Science.gov (United States)

    1996-01-01

    STS-76 Payload Commander Ronald M. Sega is donning his launch/entry suit in the Operations and Checkout Building with assistance from a suit technician. The third docking between the Russian Space Station Mir and the U.S. Space Shuttle marks the second trip into space for Sega, who recently served a five-month assignment in Russia as operations director for NASA activities there. Once suitup activities are completed the six-member STS-76 flight crew will depart for Launch Pad 39B, where the Space Shuttle Atlantis is undergoing final preparations for liftoff during an approximately seven-minute launch window opening around 3:13 a.m. EST, March 22.

  12. The IMBA suite: integrated modules for bioassay analysis

    Energy Technology Data Exchange (ETDEWEB)

    Birchall, A.; Jarvis, N.S.; Peace, M.S.; Riddell, A.E.; Battersby, W.P

    1998-07-01

    The increasing complexity of models representing the biokinetic behaviour of radionuclides in the body following intake poses problems for people who are required to implement these models. The problem is exacerbated by the current paucity of suitable software. In order to remedy this situation, a collaboration between British Nuclear Fuels, Westlakes Research Institute and the National Radiological Protection Board has started with the aim of producing a suite of modules for estimating intakes and doses from bioassay measurements using the new ICRP models. Each module will have a single purpose (e.g. to calculate respiratory tract deposition) and will interface with other software using data files. The elements to be implemented initially are plutonium, uranium, caesium, iodine and tritium. It is intended to make the software available to other parties under terms yet to be decided. This paper describes the proposed suite of integrated modules for bioassay analysis, IMBA. (author)
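
    A minimal sketch of the single-purpose, file-interfaced module pattern described above; the module names, file formats and numerical factors are illustrative assumptions, not the actual IMBA interfaces:

        # Each module does one job and talks to the others only via data files.
        import json
        from pathlib import Path

        def read(path):
            return json.loads(Path(path).read_text())

        def write(path, obj):
            Path(path).write_text(json.dumps(obj))

        def deposition_module(in_file, out_file):
            """Single purpose: deposited activity from an inhaled intake."""
            intake = read(in_file)
            # 0.3 is an assumed deposition fraction, for illustration only.
            write(out_file, {"deposit_bq": intake["intake_bq"] * 0.3})

        def dose_module(in_file, out_file):
            """Single purpose: committed dose from the deposited activity."""
            deposit = read(in_file)
            # 1e-8 Sv/Bq is an assumed dose coefficient, not an ICRP value.
            write(out_file, {"dose_sv": deposit["deposit_bq"] * 1.0e-8})

        write("intake.json", {"intake_bq": 5000.0})
        deposition_module("intake.json", "deposit.json")
        dose_module("deposit.json", "dose.json")
        print(read("dose.json"))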

  13. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  14. Non-Venting Thermal and Humidity Control for EVA Suits

    Science.gov (United States)

    Izenson, Mike; Chen, Weibo; Bue, Grant

    2011-01-01

    Future EVA suits need processes and systems to control internal temperature and humidity without venting water to the environment. This paper describes an absorption-based cooling and dehumidification system as well as laboratory demonstrations of the key processes. There are two main components in the system: an evaporation cooling and dehumidification garment (ECDG) that removes both sensible heat and latent heat from the pressure garment, and an absorber radiator that absorbs moisture and rejects heat to space by thermal radiation. This paper discusses the overall design of both components, and presents recent data demonstrating their operation. We developed a design and fabrication approach to produce prototypical heat/water absorbing elements for the ECDG, and demonstrated by test that these elements could absorb heat and moisture at a high flux. Proof-of-concept tests showed that an ECDG prototype absorbs heat and moisture at a rate of 85 W/ft² under conditions that simulate operation in an EVA suit. The heat absorption was primarily due to direct absorption of water vapor. It is possible to construct large, flexible, durable cooling patches that can be incorporated into a cooling garment with this system. The proof-of-concept test data was scaled to calculate area needed for full metabolic loads, thus showing that it is feasible to use this technology in an EVA suit. Full-scale, lightweight absorber/radiator modules have also been built and tested. They can reject heat at a flux of 33 W/ft² while maintaining ECDG operation at conditions that will provide a cool and dry environment inside the EVA suit.
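
    A worked-arithmetic sketch of the scaling step mentioned above, reading the reported fluxes as per-square-foot values; the full metabolic load used here is an assumed figure, not one taken from the paper:

        # Scale the demonstrated fluxes to an assumed full EVA metabolic load.
        metabolic_load_w = 300.0  # assumed full load in watts (illustrative)
        ecdg_flux = 85.0          # demonstrated ECDG absorption, W/ft^2
        radiator_flux = 33.0      # demonstrated radiator rejection, W/ft^2

        print(f"ECDG area needed:     {metabolic_load_w / ecdg_flux:.1f} ft^2")
        print(f"radiator area needed: {metabolic_load_w / radiator_flux:.1f} ft^2")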

  15. Compression under a mechanical counter pressure space suit glove

    Science.gov (United States)

    Waldie, James M A.; Tanaka, Kunihiko; Tourbier, Dietmar; Webb, Paul; Jarvis, Christine W.; Hargens, Alan R.

    2002-01-01

    Background: Current gas-pressurized space suits are bulky stiff shells severely limiting astronaut function and capability. A mechanical counter pressure (MCP) space suit in the form of a tight elastic garment could dramatically improve extravehicular activity (EVA) dexterity, but also be advantageous in safety, cost, mass and volume. The purpose of this study was to verify that a prototype MCP glove exerts the design compression of 200 mmHg, a pressure similar to the current NASA EVA suit. Methods: Seven male subjects donned a pressure measurement array and MCP glove on the right hand, which was placed into a partial vacuum chamber. Average compression was recorded on the palm, the bottom of the middle finger, the top of the middle finger and the dorsum of the hand at pressures of 760 (ambient), 660 and 580 mmHg. The vacuum chamber was used to simulate the pressure difference between the low breathing pressure of the current NASA space suits (approximately 200 mmHg) and an unprotected hand in space. Results: At ambient conditions, the MCP glove compressed the dorsum of the hand at 203.5 +/- 22.7 mmHg, the bottom of the middle finger at 179.4 +/- 16.0 mmHg, and the top of the middle finger at 183.8 +/- 22.6 mmHg. The palm compression was significantly lower (59.6 +/- 18.8 mmHg, p < 0.05), and there was no significant change in glove compression with the chamber pressure reductions. Conclusions: The MCP glove compressed the dorsum of the hand and middle finger at the design pressure.

  16. Astronaut Ronald Evans is suited up for EVA training

    Science.gov (United States)

    1972-01-01

    Astronaut Ronald E. Evans, command module pilot of the Apollo 17 lunar landing mission, is assisted by technicians in suiting up for extravehicular activity (EVA) training in a water tank in bldg 5 at the Manned Spacecraft Center (49970); Evans participates in EVA training in a water tank in bldg 5 at the Manned Spacecraft Center. The structure in the picture simulates the Scientific Instrument Module (SIM) bay of the Apollo 17 Service Module (49971).

  17. Apollo 11 astronaut Neil Armstrong suits up before launch

    Science.gov (United States)

    1969-01-01

    Apollo 11 Commander Neil Armstrong prepares to put on his helmet with the assistance of a spacesuit technician during suiting operations in the Manned Spacecraft Operations Building (MSOB) prior to the astronauts' departure to Launch Pad 39A. The three astronauts, Edwin E. Aldrin Jr., Neil A. Armstrong and Michael Collins, will then board the Saturn V launch vehicle, scheduled for a 9:32 a.m. EDT liftoff, for the first manned lunar landing mission.

  18. Critical Jostling

    Directory of Open Access Journals (Sweden)

    Pippin Barr

    2016-11-01

    Games can serve a critical function in many different ways, from serious games about real world subjects to self-reflexive commentaries on the nature of games themselves. In this essay we discuss critical possibilities stemming from the area of critical design, and more specifically Carl DiSalvo’s adversarial design and its concept of reconfiguring the remainder. To illustrate such an approach, we present the design and outcomes of two games, Jostle Bastard and Jostle Parent. We show how the games specifically engage with two previous games, Hotline Miami and Octodad: Dadliest Catch, reconfiguring elements of those games to create interactive critical experiences and extensions of the source material. Through the presentation of specific design concerns and decisions, we provide a grounded illustration of a particular critical function of videogames and hope to highlight this form as another valuable approach in the larger area of videogame criticism.

  19. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Christopher R.; Conrad, Pamela G.; Arvey, Robert; Bleacher, Lora; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; hide

    2012-01-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  20. Statutes of limitations: the special problem of DES suits

    International Nuclear Information System (INIS)

    Feigin, C.A.

    1981-01-01

    In 1971, medical studies determined that DES causes a rare type of vaginal cancer in a small number of daughters of mothers who took DES during pregnancy. Subsequently, medical studies determined that exposure to DES can cause other vaginal abnormalities in the daughters, some of which may be precancerous. As a result of these discoveries, many lawsuits have been filed by these daughters against DES manufacturers. Many DES suits may be barred by statutes of limitations, both because the number of years between the daughters' exposure to DES in utero and the discovery that DES can cause injuries exceeds the statutory period, and because the cancer or other injuries caused by DES may not develop for many additional years. This Note discusses two methods that DES plaintiffs may be able to use to overcome the potential statutes of limitations bar: the discovery rule, and state provisions which toll the statute of limitations for minors. The Note contends that courts should apply an expanded discovery rule to DES suits to avoid the unfair result of barring a claim before the plaintiff could have known that she had a cause of action. In addition, the Note argues that the injury which causes the statute of limitations to begin to run in DES suits should not be rigidly defined. Finally, the Note urges that courts allow eligible DES plaintiffs to take advantage of applicable state provisions that toll the statute of limitations for minors

  1. Critical Proximity

    OpenAIRE

    Simon, Jane

    2010-01-01

    This essay considers how written language frames visual objects. Drawing on Michel Foucault’s response to Raymond Roussel’s obsessive description, the essay proposes a model of criticism where description might press up against its objects. This critical closeness is then mapped across the conceptual art practice and art criticism of Ian Burn. Burn attends to the differences between seeing and reading, and considers the conditions which frame how we look at images, including how w...

  2. Explicating Validity

    Science.gov (United States)

    Kane, Michael T.

    2016-01-01

    How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…

  3. Criticality Model

    International Nuclear Information System (INIS)

    Alsaed, A.

    2004-01-01

    The ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, ''Models'', in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, ''Design Calculations and Analyses''. The ''Criticality Model'' is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of this analysis is to document the criticality computational method. The criticality
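
    As a hedged illustration of an LBTL of the kind described above, the sketch below applies a standard one-sided normal tolerance limit to a handful of invented benchmark k-eff results; both the statistical recipe and the numbers are assumptions for illustration, not the report's procedure or data:

        # One-sided lower tolerance limit on benchmark k-eff results.
        import numpy as np
        from scipy.stats import nct, norm

        k_eff = np.array([0.9981, 1.0012, 0.9975, 1.0003, 0.9990, 1.0008])
        n = len(k_eff)
        p, conf = 0.95, 0.95  # cover 95% of the population at 95% confidence

        # Tolerance factor from the noncentral t distribution.
        k_factor = nct.ppf(conf, df=n - 1, nc=norm.ppf(p) * np.sqrt(n)) / np.sqrt(n)
        lbtl = k_eff.mean() - k_factor * k_eff.std(ddof=1)
        print(f"LBTL = {lbtl:.4f}")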

  4. 46 CFR 199.214 - Immersion suits and thermal protective aids.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Immersion suits and thermal protective aids. 199.214... Passenger Vessels § 199.214 Immersion suits and thermal protective aids. (a) Each passenger vessel must... an immersion suit. (c) The immersion suits and thermal protective aids required under paragraphs (a...

  5. 33 CFR 150.518 - What are the inspection requirements for work vests and immersion suits?

    Science.gov (United States)

    2010-07-01

    ... requirements for work vests and immersion suits? 150.518 Section 150.518 Navigation and Navigable Waters COAST... vests and immersion suits? (a) All work vests and immersion suits must be inspected by the owner or... a work vest or immersion suit is inspected and is in serviceable condition, then it may remain in...

  6. Test suite for image-based motion estimation of the brain and tongue

    Science.gov (United States)

    Ramsey, Jordan; Prince, Jerry L.; Gomez, Arnold D.

    2017-03-01

    Noninvasive analysis of motion has important uses as qualitative markers for organ function and to validate biomechanical computer simulations relative to experimental observations. Tagged MRI is considered the gold standard for noninvasive tissue motion estimation in the heart, and this has inspired multiple studies focusing on other organs, including the brain under mild acceleration and the tongue during speech. As with other motion estimation approaches, using tagged MRI to measure 3D motion includes several preprocessing steps that affect the quality and accuracy of estimation. Benchmarks, or test suites, are datasets of known geometries and displacements that act as tools to tune tracking parameters or to compare different motion estimation approaches. Because motion estimation was originally developed to study the heart, existing test suites focus on cardiac motion. However, many fundamental differences exist between the heart and other organs, such that parameter tuning (or other optimization) with respect to a cardiac database may not be appropriate. Therefore, the objective of this research was to design and construct motion benchmarks by adopting an "image synthesis" test suite to study brain deformation due to mild rotational accelerations, and a benchmark to model motion of the tongue during speech. To obtain a realistic representation of mechanical behavior, kinematics were obtained from finite-element (FE) models. These results were combined with an approximation of the acquisition process of tagged MRI (including tag generation, slice thickness, and inconsistent motion repetition). To demonstrate an application of the presented methodology, the effect of motion inconsistency on synthetic measurements of head-brain rotation and deformation was evaluated. The results indicated that acquisition inconsistency is roughly proportional to head rotation estimation error. Furthermore, when evaluating non-rigid deformation, the results suggest that
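
    A minimal sketch of the "image synthesis" idea described above: modulate an image with a sinusoidal tag pattern, then deform it with a known displacement field so a tracking method's output can be scored against ground truth; all parameters are illustrative assumptions rather than the paper's settings:

        # Synthetic tagged image plus known deformation (ground truth).
        import numpy as np

        ny, nx = 128, 128
        y, x = np.mgrid[0:ny, 0:nx].astype(float)

        tag_period = 8.0  # pixels between tag lines (assumed)
        tagged = 100.0 * (1.0 - 0.7 * np.cos(2.0 * np.pi * x / tag_period))

        # Known small rotation about the image centre (mimics mild head rotation).
        theta, xc, yc = 0.02, nx / 2.0, ny / 2.0
        u = (x - xc) * np.cos(theta) - (y - yc) * np.sin(theta) + xc
        v = (x - xc) * np.sin(theta) + (y - yc) * np.cos(theta) + yc

        # Nearest-neighbour warp: the crudest possible resampling for a sketch.
        ui = np.clip(np.rint(u).astype(int), 0, nx - 1)
        vi = np.clip(np.rint(v).astype(int), 0, ny - 1)
        deformed = tagged[vi, ui]

        # A motion estimator run on (tagged, deformed) can now be scored
        # against the known displacement field (u - x, v - y).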

  7. MCNP and OMEGA criticality calculations

    International Nuclear Information System (INIS)

    Seifert, E.

    1998-04-01

    The reliability of OMEGA criticality calculations is shown by a comparison with calculations by the validated and widely used Monte Carlo code MCNP. The criticality of 16 assemblies with uranium as the fissionable material is calculated with the codes MCNP (Version 4A, ENDF/B-V cross sections), MCNP (Version 4B, ENDF/B-VI cross sections), and OMEGA. Identical calculation models are used for the three codes. The results are compared mutually and with the experimental criticality of the assemblies. (orig.)
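
    Such code-to-code and code-to-experiment comparisons are commonly summarized as calculated-to-experimental (C/E) ratios of k-eff; the sketch below shows the bookkeeping with placeholder values rather than the paper's 16 assemblies:

        # C/E summary for criticality calculations (values are placeholders).
        results = {
            # assembly: (k_eff MCNP, k_eff OMEGA, k_eff experiment)
            "assembly_1": (0.9992, 1.0004, 1.0000),
            "assembly_2": (1.0011, 0.9987, 1.0000),
        }

        for name, (k_mcnp, k_omega, k_exp) in results.items():
            print(f"{name}: C/E MCNP = {k_mcnp / k_exp:.4f}, "
                  f"C/E OMEGA = {k_omega / k_exp:.4f}, "
                  f"codes differ by {1e5 * (k_mcnp - k_omega):.0f} pcm")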

  8. Occupational-Specific Strength Predicts Astronaut-Related Task Performance in a Weighted Suit.

    Science.gov (United States)

    Taylor, Andrew; Kotarsky, Christopher J; Bond, Colin W; Hackney, Kyle J

    2018-01-01

    Future space missions beyond low Earth orbit will require deconditioned astronauts to perform occupationally relevant tasks within a planetary spacesuit. The prediction of time-to-completion (TTC) of astronaut tasks will be critical for crew safety, autonomous operations, and mission success. This exploratory study determined if the addition of task-specific strength testing to current standard lower body testing would enhance the prediction of TTC in a 1-G test battery. Eight healthy participants completed NASA lower body strength tests, occupationally specific strength tests, and performed six task simulations (hand drilling, construction wrenching, incline walking, collecting weighted samples, and dragging an unresponsive crewmember to safety) in a 48-kg weighted suit. The TTC for each task was recorded and summed to obtain a total TTC for the test battery. Linear regression was used to predict total TTC with two models: 1) NASA lower body strength tests; and 2) NASA lower body strength tests + occupationally specific strength tests. Total TTC of the test battery ranged from 20.2-44.5 min. The lower body strength test alone accounted for 61% of the variability in total TTC. The addition of hand drilling and wrenching strength tests accounted for 99% of the variability in total TTC. Adding occupationally specific strength tests (hand drilling and wrenching) to standard lower body strength tests successfully predicted total TTC in a performance test battery within a weighted suit. Future research should couple these strength tests with higher fidelity task simulations to determine the utility and efficacy of task performance prediction.Taylor A, Kotarsky CJ, Bond CW, Hackney KJ. Occupational-specific strength predicts astronaut-related task performance in a weighted suit. Aerosp Med Hum Perform. 2018; 89(1):58-62.
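
    As a sketch of the two-model comparison described above, the snippet below fits both regressions on synthetic data; the participant numbers, variable names, and coefficients are invented for illustration and are not the study's measurements:

        # Two-model comparison on synthetic data (not the study's data).
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        n = 8                                         # eight participants
        lower_body = rng.normal(100.0, 15.0, (n, 1))  # standard strength tests
        task_tests = rng.normal(50.0, 8.0, (n, 2))    # hand drilling, wrenching

        # Assumed relationship, chosen only so the example has structure.
        total_ttc = (45.0 - 0.05 * lower_body[:, 0]
                     - 0.10 * task_tests.sum(axis=1) + rng.normal(0.0, 1.0, n))

        model_1 = LinearRegression().fit(lower_body, total_ttc)
        X2 = np.hstack([lower_body, task_tests])
        model_2 = LinearRegression().fit(X2, total_ttc)

        print("R^2, lower body only:        ", round(model_1.score(lower_body, total_ttc), 3))
        print("R^2, plus occupational tests:", round(model_2.score(X2, total_ttc), 3))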

  9. Suited versus unsuited analog astronaut performance using the Aouda.X space suit simulator: the DELTA experiment of MARS2013.

    Science.gov (United States)

    Soucek, Alexander; Ostkamp, Lutz; Paternesi, Roberta

    2015-04-01

    Space suit simulators are used for extravehicular activities (EVAs) during Mars analog missions. Flight planning and EVA productivity require accurate time estimates of activities to be performed with such simulators, such as experiment execution or traverse walking. We present a benchmarking methodology for the Aouda.X space suit simulator of the Austrian Space Forum. By measuring and comparing the times needed to perform a set of 10 test activities with and without Aouda.X, an average time delay was derived in the form of a multiplicative factor. This statistical value (a second-over-second time ratio) is 1.30 and shows that operations in Aouda.X take on average a third longer than the same operations without the suit. We also show that activities predominantly requiring fine motor skills are associated with larger time delays (between 1.17 and 1.59) than those requiring short-distance locomotion or short-term muscle strain (between 1.10 and 1.16). The results of the DELTA experiment performed during the MARS2013 field mission increase analog mission planning reliability and thus EVA efficiency and productivity when using Aouda.X.
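
    The benchmarking statistic described above reduces to simple arithmetic; the sketch below computes the average suited/unsuited second-over-second ratio from invented task timings (the study's own data are not reproduced here):

        # Average suited/unsuited time ratio; the timings are invented.
        suited = [130.0, 310.0, 98.0, 242.0]    # seconds, with Aouda.X
        unsuited = [100.0, 240.0, 70.0, 200.0]  # seconds, same tasks unsuited

        ratios = [s / u for s, u in zip(suited, unsuited)]
        factor = sum(ratios) / len(ratios)
        print(f"average time-delay factor: {factor:.2f}")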

  10. FRHAM-TEX™ Cool Suit - OST reference No. 1854. Deactivation and decommissioning focus area

    International Nuclear Information System (INIS)

    1998-02-01

    This paper describes a demonstration project for the FRHAM-TEX Cool Suit™ manufactured by FRHAM Safety Products. It is a one-piece, disposable, breathable, waterproof coverall designed to permit moisture generated by the wearer to be transmitted outside the suit. The performance of this suit was compared to a Tyvek® suit as a baseline. The suit is proposed as safety wear for workers at decontamination and decommissioning projects

  11. Critical Review

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Olsen, Stig Irving

    2018-01-01

    Manipulation and mistakes in LCA studies are as old as the tool itself, and so is its critical review. Besides preventing misuse and unsupported claims, critical review may also help identifying mistakes and more justifiable assumptions as well as generally improve the quality of a study. It thus supports the robustness of an LCA and increases trust in its results and conclusions. The focus of this chapter is on understanding what a critical review is, how the international standards define it, what its main elements are, and what reviewer qualifications are required. It is not the objective of this chapter to learn how to conduct a critical review, neither from a reviewer nor from a practitioner perspective. The foundation of this chapter and the basis for any critical review of LCA studies are the International Standards ISO 14040:2006, ISO 14044:2006 and ISO TS 14071:2014.

  12. The VLF Wave and Particle Precipitation Mapper (VPM) Cubesat Payload Suite

    Science.gov (United States)

    Inan, U.; Linscott, I.; Marshall, R. A.; Lauben, D.; Starks, M. J.; Doolittle, J. H.

    2012-12-01

    The VLF Wave and Particle Precipitation Mapper (VPM) payload is under development at Stanford University for a Cubesat mission that is planned to fly in low-earth-orbit in 2015. The VPM payload suite includes a 2-meter electric-field dipole antenna; a single-axis magnetic search coil; and a two-channel relativistic electron detector, measuring both trapped and loss-cone electrons. VPM will measure waves and relativistic electrons with the following primary goals: i) develop an improved climatology of plasmaspheric hiss in the L-shell range 1 < L < 3 at all local times; ii) detect VLF waves launched by space-based VLF transmitters, as well as energetic electrons scattered by those in-situ injected waves; iii) develop an improved climatology of lightning-generated whistlers and lightning-induced electron precipitation; iv) measure waves and electron precipitation produced by ground-based VLF transmitters; and v) validate propagation and wave-particle interaction models. In this paper we outline these science objectives of the VPM payload instrument suite, and describe the payload instruments and data products that will meet these science goals.

  13. The PAUL Suit(©) : an experience of ageing.

    Science.gov (United States)

    Bennett, Paul; Moore, Malcolm; Wenham, John

    2016-04-01

    An ageing population worldwide makes it increasingly important that health students understand issues that elderly people face and can provide empathic care to them. This teaching department in an isolated rural setting developed an interprofessional learning session to assist health students to understand issues of functional loss and social isolation that can affect elderly people. The Premature Ageing Unisex Leisure (PAUL) Suit(©) was developed as part of a 1-day learning session for undergraduate health students - including students of medicine, nursing and allied health - attending clinical placement in far-west New South Wales. The suit was developed locally and can be adjusted to simulate a wide range of functional losses in the wearer. Students undertake a range of daily tasks in the community while wearing the suit in the company of a student 'carer'. Over the past 4 years, approximately 140 students have participated in the simulation. Post-simulation evaluations report that students gain a greater understanding of some functional issues associated with ageing, and of the social isolation that can be associated with these. The experiential nature of the activity leads to some powerful insights. This activity is an innovative, experiential tool to deepen students' understanding of issues relating to ageing. The interprofessional nature of the activity is an important factor in the success of the day, and produces a wide range of shared insights. The activity also enhances the partnerships between the university, the health service and the local community. Our experience supports the value of simulation in providing a deep learning opportunity in the area of ageing and disability. © 2015 John Wiley & Sons Ltd.

  14. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software for file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. "pcircle" builds on ubiquitous MPI in cluster computing environments and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
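
    As a shared-memory analogue of the parallel checksumming idea, for illustration only (pcircle itself builds on MPI and work stealing, which this sketch does not attempt to reproduce):

        # Checksum every file under a directory tree with a pool of workers.
        import hashlib
        from concurrent.futures import ProcessPoolExecutor
        from pathlib import Path

        def file_sha1(path):
            h = hashlib.sha1()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            return path, h.hexdigest()

        def checksum_tree(root):
            files = [str(p) for p in Path(root).rglob("*") if p.is_file()]
            with ProcessPoolExecutor() as pool:  # workers pull files as they free up
                return dict(pool.map(file_sha1, files))

        if __name__ == "__main__":
            for path, digest in sorted(checksum_tree(".").items()):
                print(digest, path)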

  15. The Chernobyl cloud: comments on a non-suit

    International Nuclear Information System (INIS)

    2011-01-01

    This article comments on the non-suit decision taken by a Paris court in favour of Pierre Pellerin after a trial over his statements concerning the Chernobyl cloud. It recalls the large number of radioactive contamination measurements performed in France at the time by the SCPRI, with Mr Pellerin at its head. It states that the French authorities behaved like other European authorities with respect to the contamination brought by the cloud, that no epidemiological study has ever revealed pathologies attributable to the cloud, and that an increase of cancers in Corsica has not been proved.

  16. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident in Shift's ability to provide reference results for CASL benchmarking.

  17. Critical Arts

    African Journals Online (AJOL)

    both formal and informal) in culture and social theory. CRITICAL ARTS aims to challenge and ... Book Review: Brian McNair, An Introduction to Political Communication (3rd edition), London: Routledge, 2003, ISBN 0415307082, 272pp. Phil Joffe ...

  18. Critical Proximity

    Directory of Open Access Journals (Sweden)

    Jane Simon

    2010-09-01

    This essay considers how written language frames visual objects. Drawing on Michel Foucault’s response to Raymond Roussel’s obsessive description, the essay proposes a model of criticism where description might press up against its objects. This critical closeness is then mapped across the conceptual art practice and art criticism of Ian Burn. Burn attends to the differences between seeing and reading, and considers the conditions which frame how we look at images, including how we look at, and through words. The essay goes on to consider Meaghan Morris’s writing on Lynn Silverman’s photographs. Both Morris and Burn offer an alternative to a parasitic model of criticism and enact a patient way of looking across and through visual landscapes.

  20. SEEKING A VALID THEORY OF MAGIC REALISM: A CRITICAL LITERATURE REVIEW / BUSCANDO UNA TEORÍA VALIDA DE REALISMO MÁGICO: UNA REVISIÓN BIBLIOGRÁFICA CRÍTICA

    Directory of Open Access Journals (Sweden)

    Milagro Asensio

    2017-11-01

    Although much has been researched and written about the literary genre magic realism, the conclusions at which most studies arrive vary and, in some cases, even contradict each other. The misconceptions associated with the term, on the one hand, and the attachment to a structuralist model, even in postcolonial and postmodern studies of the genre, on the other, have hindered the development of a conceptual framework that can be used for analysing magical realist works. This article addresses these issues by reviewing the critical literature on magic realism since the term was first coined. The most widely used criteria to define and characterise the genre are questioned in the light of both postcolonial and postmodern theories. This examination of the many contributions made by researchers from different literary approaches aims at outlining a well-grounded theoretical framework with sound criteria to define the genre, criteria that can be validated and improved in future research.

  1. Criticality safety

    International Nuclear Information System (INIS)

    Walker, G.

    1983-01-01

    When a sufficient quantity of fissile material is brought together, a self-sustaining neutron chain reaction will be started in it and will continue until some change occurs in the fissile material to stop the chain reaction. The quantity of fissile material required is the 'Critical Mass'. This is not a fixed quantity even for a given type of fissile material but varies between quite wide limits depending on a number of factors. In a nuclear reactor the critical mass of fissile material is assembled under well-defined conditions to produce a controllable chain reaction. The same materials have to be handled outside the reactor in all stages of fuel element manufacture, storage, transport and irradiated fuel reprocessing. At any stage it is possible (at least in principle) to assemble a critical mass and thus initiate an accidental and uncontrollable chain reaction. Avoiding this is what criticality safety is all about. A system is just critical when the rate of production of neutrons balances the rate of loss either by escape or by absorption. The factors affecting criticality are, therefore, those which affect neutron production and loss. The principal ones are: type of nuclide and enrichment (or isotopic composition), moderation, reflection, concentration (density), shape and interaction. Each factor is considered in detail. (author)
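
    The balance condition described here is conventionally expressed through the effective multiplication factor; in generic form (not specific to this article),

        k_{\mathrm{eff}} \;=\; \frac{\text{rate of neutron production}}{\text{rate of absorption} + \text{rate of leakage}}, \qquad k_{\mathrm{eff}} = 1 \ \text{at criticality}

    so each factor listed (nuclide and enrichment, moderation, reflection, concentration, shape, interaction) acts by shifting production, absorption or leakage, and criticality safety amounts to guaranteeing k_eff < 1 with margin.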

  2. Anesthesia and the pediatric cardiac catheterization suite: a review.

    Science.gov (United States)

    Lam, Jennifer E; Lin, Erica P; Alexy, Ryan; Aronson, Lori A

    2015-02-01

    Advances in technology over the last couple of decades have caused a shift in pediatric cardiac catheterization from a primary focus on diagnostics to innovative therapeutic interventions. These improvements allow patients a wider range of nonsurgical options for treatment of congenital heart disease. However, these therapeutic modalities can entail higher risk in an already complex patient population, compounded by the added challenges inherent to the environment of the cardiac catheterization suite. Anesthesiologists caring for children with congenital heart disease must understand not only the pathophysiology of the disease but also the effects the anesthetics and interventions have on the patient in order to provide a safe perioperative course. The aim of this article is to review the latest catheterization modalities offered to patients with congenital heart disease, describe the unique challenges presented in the cardiac catheterization suite, list the most common complications encountered during catheterization and, finally, review the literature regarding different anesthetic drugs used in the catheterization lab. © 2014 John Wiley & Sons Ltd.

  3. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  4. STS-93 Pilot Ashby suits up before launch

    Science.gov (United States)

    1999-01-01

    In the Operations and Checkout Building during final launch preparations for the second time, STS-93 Pilot Jeffrey S. Ashby waves after donning his launch and entry suit while a suit tech adjusts his boot. After Space Shuttle Columbia's July 20 launch attempt was scrubbed at the T-7 second mark in the countdown, the launch was rescheduled for Thursday, July 22, at 12:28 a.m. EDT. The target landing date is July 26, 1999, at 11:24 p.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The new telescope is 20 to 50 times more sensitive than any previous X-ray telescope and is expected to unlock the secrets of supernovae, quasars and black holes. The STS-93 crew numbers five: Commander Eileen M. Collins, Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  5. STS-93 M.S. Hawley suits up for launch

    Science.gov (United States)

    1999-01-01

    During final launch preparations in the Operations and Checkout Building, STS-93 Mission Specialist Steven A. Hawley (Ph.D.) gets help donning his launch and entry suit from a suit tech. After Space Shuttle Columbia's July 20 launch attempt was scrubbed at the T-7 second mark in the countdown, the launch was rescheduled for Thursday, July 22, at 12:28 a.m. EDT. The target landing date is July 26, 1999, at 11:24 p.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The new telescope is 20 to 50 times more sensitive than any previous X-ray telescope and is expected to unlock the secrets of supernovae, quasars and black holes. The STS-93 crew numbers five: Commander Eileen M. Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Hawley, Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  6. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  7. Tier-3 Monitoring Software Suite (T3MON) proposal

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Klimentov, A; Korenkov, V; Oleynik, D; Panitkin, S; Petrosyan, A

    2011-01-01

    The ATLAS Distributed Computing activities concentrated so far in the “central” part of the computing system of the experiment, namely the first 3 tiers (CERN Tier0, the 10 Tier1s centres and the 60+ Tier2s). This is a coherent system to perform data processing and management on a global scale and host (re)processing, simulation activities down to group and user analysis. Many ATLAS Institutes and National Communities built (or have plans to build) Tier-3 facilities. The definition of Tier-3 concept has been outlined (REFERENCE). Tier-3 centres consist of non-pledged resources mostly dedicated for the data analysis by the geographically close or local scientific groups. Tier-3 sites comprise a range of architectures and many do not possess Grid middleware, which would render application of Tier-2 monitoring systems useless. This document describes a strategy to develop a software suite for monitoring of the Tier3 sites. This software suite will enable local monitoring of the Tier3 sites and the global vie...

  8. Wireless hydrotherapy smart suit for monitoring handicapped people

    Science.gov (United States)

    Correia, Jose H.; Mendes, Paulo M.

    2005-02-01

    This paper presents a smart suit, water impermeable, containing sensors and electronics for monitoring handicapped people at hydrotherapy sessions in swimming-pools. For integration into textiles, electronic components should be designed in a functional, robust and inexpensive way. Therefore, small-size electronics microsystems are a promising approach. The smart suit allows the monitoring of individual biometric data, such as heart rate, temperature and movement of the body. Two solutions for transmitting the data wirelessly are presented: through a low-voltage (3.0 V), low-power, CMOS RF IC (1.6 mm x 1.5 mm size dimensions) operating at 433 MHz, with ASK modulation and a patch antenna built on lossy substrates compatible with integrated circuits fabrication. Two different substrates were used for antenna implementation: high-resistivity silicon (HRS) and Corning Pyrex #7740 glass. The antenna prototypes were built to operate close to the 5 GHz ISM band. They operate at a center frequency of 5.705 GHz (HRS) and 5.995 GHz (Pyrex). The studied parameters were: substrate thickness, substrate losses, oxide thickness, metal conductivity and thickness. The antenna on HRS uses an area of 8 mm2, providing a 90 MHz bandwidth and ~0.3 dBi of gain. On a glass substrate, the antenna uses 12 mm2, provides 100 MHz bandwidth and ~3 dBi of gain.

  9. A comparison of two Shuttle launch and entry suits - Reach envelope, isokinetic strength, and treadmill tests

    Science.gov (United States)

    Schafer, Lauren E.; Rajulu, Sudhakar L.; Klute, Glenn K.

    1992-01-01

    A quantification has been conducted of any existing differences between the performance, in operational conditions, of the Space Shuttle crew Launch Entry Suit (LES) and the new Advanced Crew Escape Suit (ACES). While LES is a partial-pressure suit, the ACES system which is being considered as a replacement for LES is a full-pressure suit. Three tests have been conducted with six subjects to ascertain the suits' reach envelope, strength, and treadmill performance. No significant operational differences were found between the two suit designs.

  10. Modeling the Impact of Space Suit Components and Anthropometry on the Center of Mass of a Seated Crewmember

    Science.gov (United States)

    Rajulu, Sudhakar; Blackledge, Christopher; Ferrer, Mike; Margerum, Sarah

    2009-01-01

    The designers of the Orion Crew Exploration Vehicle (CEV) utilize an intensive simulation program in order to predict the launch and landing characteristics of the Crew Impact Attenuation System (CIAS). The CIAS is the energy absorbing strut concept that dampens loads to levels sustainable by the crew during landing and consists of the crew module seat pallet that accommodates four to six seated astronauts. An important parameter required for proper dynamic modeling of the CIAS is knowledge of the suited center of mass (COM) variations within the crew population. Significant center of mass variations across suited crew configurations would amplify the inertial effects of the pallet and potentially create unacceptable crew loading during launch and landing. Established suited, whole-body, and posture-based mass properties were not available due to the uncertainty of the final CEV seat posture and suit hardware configurations. While unsuited segmental center of mass values can be obtained via regression equations from previous studies, building them into a model that was posture dependent with custom anthropometry and integrated suit components proved cumbersome and time consuming. Therefore, the objective of this study was to quantify the effects of posture, suit components, and the expected range of anthropometry on the center of mass of a seated individual. Several elements are required for the COM calculation of a suited human in a seated position: anthropometry; body segment mass; suit component mass; suit component location relative to the body; and joint angles defining the seated posture. Anthropometry and body segment masses used in this study were taken from a selection of three-dimensional human body models, called boundary manikins, which were developed in a previous project. These boundary manikins represent the critical anthropometric dimension extremes for the anticipated astronaut population. Six male manikins and 6 female manikins, representing a

  11. A suite of standards for radiation monitors and their revisions

    International Nuclear Information System (INIS)

    Noda, Kimio

    1991-01-01

    A suite of standards for radiation monitors applied in nuclear facilities in Japan was compiled mainly by Health Physicists in Power Reactor and Nuclear Fuel Development (PNC) and Japan Atomic Energy Research Institute (JAERI), and issued in 1971 as 'The Standard for Radiation Monitors'. PNC facilities such as Reprocessing Plant and Plutonium Fuel Fabrication Facility, as well as other nuclear industries have applied the standard, and contributed improvement of practical maintenability and availability of the radiation monitors. Meanwhile, the radiation monitors have remarkably progressed in its application and size of the monitors is growing. Furthermore, manufacturing techniques have significantly progressed especially in the field of system concepts and electronics elements. These progresses require revision of the standards. 'The Standard for Radiation Monitors' has been revised considering the problems in practical application and data processing capability. Considerations are given to keep compatibility of old and new modules. (author)

  12. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.

  13. Court rules against failed viatical firm in investor suit.

    Science.gov (United States)

    1999-10-01

    A Federal appeals court has revived a claim against Dignity Partners Inc., a viatical business, and offshoot of a financial-services firm. Dignity Partners operated by buying the life insurance policies of terminally ill people. The company was charged with making false and misleading statements in its prospectus for an initial public stock offering. Five months later, the company announced that it would not accept new customers with AIDS, a group which represented 95 percent of its accounts at that time. The company had information from researchers and clinicians that the introduction of protease inhibitors would greatly increase life expectancy for its customers and would reduce company profits. This information was not generally available to potential investors. The suit against the company alleges violations of the Securities Act of 1933 and the Exchange Act of 1934, both which govern stock trading.

  14. STS-93 Commander Collins suits up for launch

    Science.gov (United States)

    1999-01-01

    During the third launch preparations in the Operations and Checkout Building, STS-93 Commander Eileen M. Collins waves while having her launch and entry suit checked. After Space Shuttle Columbia's July 20 and 22 launch attempts were scrubbed, the launch was again rescheduled for Friday, July 23, at 12:24 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The STS-93 crew numbers five: Commander Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  15. STS-93 Mission Specialist Cady Coleman suits up for launch

    Science.gov (United States)

    1999-01-01

    For the third time, during final launch preparations in the Operations and Checkout Building, STS-93 Mission Specialist Catherine G. Coleman (Ph.D.) dons her launch and entry suit. After Space Shuttle Columbia's July 20 and 22 launch attempts were scrubbed, the launch was again rescheduled for Friday, July 23, at 12:24 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The STS-93 crew numbers five: Commander Eileen M. Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Coleman, and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  16. STS-93 Commander Eileen Collins suits up for launch

    Science.gov (United States)

    1999-01-01

    For the third time, in the Operations and Checkout Building, STS- 93 Commander Eileen M. Collins tries on her helmet with her launch and entry suit. After Space Shuttle Columbia's July 20 and 22 launch attempts were scrubbed, the launch was again rescheduled for Friday, July 23, at 12:24 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The STS-93 crew numbers five: Commander Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  17. STS-93 Mission Specialist Hawley suits up for launch

    Science.gov (United States)

    1999-01-01

    For the third time, during final launch preparations in the Operations and Checkout Building, STS-93 Mission Specialist Steven A. Hawley (Ph.D.) waves after donning his launch and entry suit. After Space Shuttle Columbia's July 20 and 22 launch attempts were scrubbed, the launch was again rescheduled for Friday, July 23, at 12:24 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The STS-93 crew numbers five: Commander Eileen M. Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Hawley, Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  18. STS-93 Pilot Ashby suits up for launch

    Science.gov (United States)

    1999-01-01

    In the Operations and Checkout Building during final launch preparations for the third time, STS-93 Pilot Jeffrey S. Ashby pulls on his glove, part of his launch and entry suit. After Space Shuttle Columbia's July 20 and 22 launch attempts were scrubbed, the launch was again rescheduled for Friday, July 23, at 12:24 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The STS-93 crew numbers five: Commander Eileen Collins, Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  19. STS-92 Pilot Pam Melroy suits up for launch

    Science.gov (United States)

    2000-01-01

    In the Operations and Checkout Building, STS-92 Pilot Pamela Ann Melroy smiles during suit check before heading out to the Astrovan for the ride to Launch Pad 39A. During the 11-day mission to the International Space Station, four extravehicular activities (EVAs), or spacewalks, are planned for construction. The payload includes the Integrated Truss Structure Z-1 and the third Pressurized Mating Adapter. The Z-1 truss is the first of 10 that will become the backbone of the Space Station, eventually stretching the length of a football field. PMA-3 will provide a Shuttle docking port for solar array installation on the sixth Station flight and Lab installation on the seventh Station flight. Launch is scheduled for 7:17 p.m. EDT. Landing is expected Oct. 22 at 2:10 p.m. EDT.

  20. The Characterization of Biosignatures in Caves Using an Instrument Suite

    Science.gov (United States)

    Uckert, Kyle; Chanover, Nancy J.; Getty, Stephanie; Voelz, David G.; Brinckerhoff, William B.; McMillan, Nancy; Xiao, Xifeng; Boston, Penelope J.; Li, Xiang; McAdam, Amy; Glenar, David A.; Chavez, Arriana

    2017-12-01

    The search for life and habitable environments on other Solar System bodies is a major motivator for planetary exploration. Due to the difficulty and significance of detecting extant or extinct extraterrestrial life in situ, several independent measurements from multiple instrument techniques will bolster the community's confidence in making any such claim. We demonstrate the detection of subsurface biosignatures using a suite of instrument techniques including IR reflectance spectroscopy, laser-induced breakdown spectroscopy, and scanning electron microscopy/energy dispersive X-ray spectroscopy. We focus our measurements on subterranean calcium carbonate field samples, whose biosignatures are analogous to those that might be expected on some high-interest astrobiology targets. In this work, we discuss the feasibility and advantages of using each of the aforementioned instrument techniques for the in situ search for biosignatures and present results on the autonomous characterization of biosignatures using multivariate statistical analysis techniques.

  1. Comparing apples and oranges: the Community Intercomparison Suite

    Science.gov (United States)

    Schutgens, Nick; Stier, Philip; Kershaw, Philip; Pascoe, Stephen

    2015-04-01

    Visual representation and comparison of geoscientific datasets presents a huge challenge due to the large variety of file formats and spatio-temporal sampling of data (be they observations or simulations). The Community Intercomparison Suite attempts to greatly simplify these tasks for users by offering an intelligent but simple command line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller more manageable datasets. Our philosophy is to remove as much as possible the need for specialist knowledge by the user of the structure of a dataset. The colocation of observations with model data is as simple as: "cis col ::" which will resample the simulation data to the spatio-temporal sampling of the observations, contingent on a few user-defined options that specify a resampling kernel. As an example, we apply CIS to a case study of biomass burning aerosol from the Congo. Remote sensing observations, in-situe observations and model data are shown in various plots, with the purpose of either comparing different datasets or integrating them into a single comprehensive picture. CIS can deal with both gridded and ungridded datasets of 2, 3 or 4 spatio-temporal dimensions. It can handle different spatial coordinates (e.g. longitude or distance, altitude or pressure level). CIS supports both HDF, netCDF and ASCII file formats. The suite is written in Python with entirely publicly available open source dependencies. Plug-ins allow a high degree of user-moddability. A web-based developer hub includes a manual and simple examples. CIS is developed as open source code by a specialist IT company under supervision of scientists from the University of Oxford and the Centre of Environmental Data Archival as part of investment in the JASMIN superdatacluster facility.

  2. Development of a Fan for Future Space Suit Applications

    Science.gov (United States)

    Paul. Heather L.; Converse, David; Dionne, Steven; Moser, Jeff

    2010-01-01

    NASA's next generation space suit system will place new demands on the fan used to circulate breathing gas through the ventilation loop of the portable life support system. Long duration missions with frequent extravehicular activities (EVAs), the requirement for significant increases in reliability and durability, and a mission profile that imposes strict limits on weight, volume and power create the basis for a set of requirements that demand more performance than is available from existing fan designs. This paper describes the development of a new fan to meet these needs. A centrifugal fan was designed with a normal operating speed of approximately 39,400 rpm to meet the ventilation flow requirements while also meeting the aggressive minimal packaging, weight and power requirements. The prototype fan also operates at 56,000 rpm to satisfy a second operating condition associated with a single fan providing ventilation flow to two spacesuits connected in series. This fan incorporates a novel nonmetallic "can" to keep the oxygen flow separate from the motor electronics, thus eliminating ignition potential. The nonmetallic can enables a small package size and low power consumption. To keep cost and schedule within project bounds a commercial motor controller was used. The fan design has been detailed and implemented using materials and approaches selected to address anticipated mission needs. Test data is presented to show how this fan performs relative to anticipated ventilation requirements for the EVA portable life support system. Additionally, data is presented to show tolerance to anticipated environmental factors such as acoustics, shock, and vibration. Recommendations for forward work to progress the technology readiness level and prepare the fan for the next EVA space suit system are also discussed.

  3. Automated integration of lidar into the LANDFIRE product suite

    Science.gov (United States)

    Peterson, Birgit; Nelson, Kurtis; Seielstad, Carl; Stoker, Jason M.; Jolly, W. Matt; Parsons, Russell

    2015-01-01

    Accurate information about three-dimensional canopy structure and wildland fuel across the landscape is necessary for fire behaviour modelling system predictions. Remotely sensed data are invaluable for assessing these canopy characteristics over large areas; lidar data, in particular, are uniquely suited for quantifying three-dimensional canopy structure. Although lidar data are increasingly available, they have rarely been applied to wildland fuels mapping efforts, mostly due to two issues. First, the Landscape Fire and Resource Planning Tools (LANDFIRE) program, which has become the default source of large-scale fire behaviour modelling inputs for the US, does not currently incorporate lidar data into the vegetation and fuel mapping process because spatially continuous lidar data are not available at the national scale. Second, while lidar data are available for many land management units across the US, these data are underutilized for fire behaviour applications. This is partly due to a lack of local personnel trained to process and analyse lidar data. This investigation addresses these issues by developing the Creating Hybrid Structure from LANDFIRE/lidar Combinations (CHISLIC) tool. CHISLIC allows individuals to automatically generate a suite of vegetation structure and wildland fuel parameters from lidar data and infuse them into existing LANDFIRE data sets. CHISLIC will become available for wider distribution to the public through a partnership with the U.S. Forest Service’s Wildland Fire Assessment System (WFAS) and may be incorporated into the Wildland Fire Decision Support System (WFDSS) with additional design and testing. WFAS and WFDSS are the primary systems used to support tactical and strategic wildland fire management decisions.

  4. Space Suit Simulator (S3) for Partial Gravity EVA Experimentation and Training, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Pressurized space suits impose high joint torques on the wearer, reducing mobility for upper and lower body motions. Using actual space suits in training or...

  5. Astronaut Neil Armstrong in Launch Complex 16 trailer during suiting up

    Science.gov (United States)

    1966-01-01

    Astronaut Neil A. Armstrong, command pilot of the Gemini 8 space flight, sits in the Launch Complex 16 trailer during suiting up operations for the Gemini 8 mission. Suit technician Jim Garrepy assists.

  6. Simulating Visible/Infrared Imager Radiometer Suite Normalized Difference Vegetation Index Data Using Hyperion and MODIS

    Science.gov (United States)

    Ross, Kenton W.; Russell, Jeffrey; Ryan, Robert E.

    2006-01-01

    The success of MODIS (the Moderate Resolution Imaging Spectrometer) in creating unprecedented, timely, high-quality data for vegetation and other studies has created great anticipation for data from VIIRS (the Visible/Infrared Imager Radiometer Suite). VIIRS will be carried onboard the joint NASA/Department of Defense/National Oceanic and Atmospheric Administration NPP (NPOESS (National Polar-orbiting Operational Environmental Satellite System) Preparatory Project). Because the VIIRS instruments will have lower spatial resolution than the current MODIS instruments 400 m versus 250 m at nadir for the channels used to generate Normalized Difference Vegetation Index data, scientists need the answer to this question: how will the change in resolution affect vegetation studies? By using simulated VIIRS measurements, this question may be answered before the VIIRS instruments are deployed in space. Using simulated VIIRS products, the U.S. Department of Agriculture and other operational agencies can then modify their decision support systems appropriately in preparation for receipt of actual VIIRS data. VIIRS simulations and validations will be based on the ART (Application Research Toolbox), an integrated set of algorithms and models developed in MATLAB(Registerd TradeMark) that enables users to perform a suite of simulations and statistical trade studies on remote sensing systems. Specifically, the ART provides the capability to generate simulated multispectral image products, at various scales, from high spatial hyperspectral and/or multispectral image products. The ART uses acquired ( real ) or synthetic datasets, along with sensor specifications, to create simulated datasets. For existing multispectral sensor systems, the simulated data products are used for comparison, verification, and validation of the simulated system s actual products. VIIRS simulations will be performed using Hyperion and MODIS datasets. The hyperspectral and hyperspatial properties of Hyperion

  7. Modeling and dynamic simulation of astronaut's upper limb motions considering counter torques generated by the space suit.

    Science.gov (United States)

    Li, Jingwen; Ye, Qing; Ding, Li; Liao, Qianfang

    2017-07-01

    Extravehicular activity (EVA) is an inevitable task for astronauts to maintain proper functions of both the spacecraft and the space station. Both experimental research in a microgravity simulator (e.g. neutral buoyancy tank, zero-g aircraft or a drop tower/tube) and mathematical modeling were used to study EVA to provide guidance for the training on Earth and task design in space. Modeling has become more and more promising because of its efficiency. Based on the task analysis, almost 90% of EVA activity is accomplished through upper limb motions. Therefore, focusing on upper limb models of the body and space suit is valuable to this effort. In previous modeling studies, some multi-rigid-body systems were developed to simplify the human musculoskeletal system, and the space suit was mostly considered as a part of the astronaut body. With the aim to improve the reality of the models, we developed an astronauts' upper limb model, including a torque model and a muscle-force model, with the counter torques from the space suit being considered as a boundary condition. Inverse kinematics and the Maggi-Kane's method was applied to calculate the joint angles, joint torques and muscle force given that the terminal trajectory of upper limb motion was known. Also, we validated the muscle-force model using electromyogram (EMG) data collected in a validation experiment. Muscle force calculated from our model presented a similar trend with the EMG data, supporting the effectiveness and feasibility of the muscle-force model we established, and also, partially validating the joint model in kinematics aspect.

  8. A Habermasian Approach to Critical Reading

    Science.gov (United States)

    Lee, Cheu-jey

    2016-01-01

    This article explores the connection between critical reading and Jurgen Habermas's theory of communicative action. It proposes that Habermas's criteria used for evaluating validity claims in communicative action can be applied in reading texts critically. Analyses of different types of texts are presented to show how critical reading is done in a…

  9. Nuclear criticality information system

    International Nuclear Information System (INIS)

    Koponen, B.L.; Hampel, V.E.

    1981-01-01

    The nuclear criticality safety program at LLNL began in the 1950's with a critical measurements program which produced benchmark data until the late 1960's. This same time period saw the rapid development of computer technology useful for both computer modeling of fissile systems and for computer-aided management and display of the computational benchmark data. Database management grew in importance as the amount of information increased and as experimental programs were terminated. Within the criticality safety program at LLNL we began at that time to develop a computer library of benchmark data for validation of computer codes and cross sections. As part of this effort, we prepared a computer-based bibliography of criticality measurements on relatively simple systems. However, it is only now that some of these computer-based resources can be made available to the nuclear criticality safety community at large. This technology transfer is being accomplished by the DOE Technology Information System (TIS), a dedicated, advanced information system. The NCIS database is described

  10. Critical Vidders

    DEFF Research Database (Denmark)

    Svegaard, Robin Sebastian Kaszmarczyk

    2015-01-01

    This article will introduce and take a look at a specific subset of the fan created remix videos known as vids, namely those that deal with feminist based critique of media. Through examples, it will show how fans construct and present their critique, and finally broach the topic of the critical ...

  11. FACTAR validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Wadsworth, S.L.; Rock, R.C.; Sills, H.E.; Langman, V.J.

    1995-01-01

    A detailed strategy to validate fuel channel thermal mechanical behaviour codes for use of current power reactor safety analysis is presented. The strategy is derived from a validation process that has been recently adopted industry wide. Focus of the discussion is on the validation plan for a code, FACTAR, for application in assessing fuel channel integrity safety concerns during a large break loss of coolant accident (LOCA). (author)

  12. Development of an online men’s suits customizing system using heuristic procedure for wheelchair users

    NARCIS (Netherlands)

    Jeong, Minseok; Yang, Chuneun; You, Heecheon; Park, Kwangae; Lee, W.

    2016-01-01

    An online suit-customizing system for the special accessibility needs of wheelchair users should be developed because the demand for business suits by wheelchair users involved in economic activities has increased. This study
    develops a user interface an online customizing system for men's suits

  13. Application of Sensitivity and Uncertainty Analysis Methods to a Validation Study for Weapons-Grade Mixed-Oxide Fuel

    International Nuclear Information System (INIS)

    Dunn, M.E.

    2001-01-01

    At the Oak Ridge National Laboratory (ORNL), sensitivity and uncertainty (S/U) analysis methods and a Generalized Linear Least-Squares Methodology (GLLSM) have been developed to quantitatively determine the similarity or lack thereof between critical benchmark experiments and an application of interest. The S/U and GLLSM methods provide a mathematical approach, which is less judgment based relative to traditional validation procedures, to assess system similarity and estimate the calculational bias and uncertainty for an application of interest. The objective of this paper is to gain experience with the S/U and GLLSM methods by revisiting a criticality safety evaluation and associated traditional validation for the shipment of weapons-grade (WG) MOX fuel in the MO-1 transportation package. In the original validation, critical experiments were selected based on a qualitative assessment of the MO-1 and MOX contents relative to the available experiments. Subsequently, traditional trending analyses were used to estimate the Δk bias and associated uncertainty. In this paper, the S/U and GLLSM procedures are used to re-evaluate the suite of critical experiments associated with the original MO-1 evaluation. Using the S/U procedures developed at ORNL, critical experiments that are similar to the undamaged and damaged MO-1 package are identified based on sensitivity and uncertainty analyses of the criticals and the MO-1 package configurations. Based on the trending analyses developed for the S/U and GLLSM procedures, the Δk bias and uncertainty for the most reactive MO-1 package configurations are estimated and used to calculate an upper subcritical limit (USL) for the MO-1 evaluation. The calculated bias and uncertainty from the S/U and GLLSM analyses lead to a calculational USL that supports the original validation study for the MO-1

  14. Critical reading and critical thinking Critical reading and critical thinking

    Directory of Open Access Journals (Sweden)

    Loni Kreis Taglieber

    2008-04-01

    Full Text Available The purpose of this paper is to provide, for L1 and L2 reading and writing teachers, a brief overview of the literature about critical reading and higher level thinking skills. The teaching of these skills is still neglected in some language classes in Brazil, be it in L1 or in L2 classes. Thus, this paper may also serve as a resource guide for L1 and/or L2 reading and writing teachers who want to incorporate critical reading and thinking into their classes. In modern society, even in everyday life people frequently need to deal with complicated public and political issues, make decisions, and solve problems. In order to do this efficiently and effectively, citizens must be able to evaluate critically what they see, hear, and read. Also, with the huge amount of printed material available in all areas in this age of “information explosion” it is easy to feel overwhelmed. But often the information piled up on people’s desks and in their minds is of no use due to the enormous amount of it. The purpose of this paper is to provide, for L1 and L2 reading and writing teachers, a brief overview of the literature about critical reading and higher level thinking skills. The teaching of these skills is still neglected in some language classes in Brazil, be it in L1 or in L2 classes. Thus, this paper may also serve as a resource guide for L1 and/or L2 reading and writing teachers who want to incorporate critical reading and thinking into their classes. In modern society, even in everyday life people frequently need to deal with complicated public and political issues, make decisions, and solve problems. In order to do this efficiently and effectively, citizens must be able to evaluate critically what they see, hear, and read. Also, with the huge amount of printed material available in all areas in this age of “information explosion” it is easy to feel overwhelmed. But often the information piled up on people’s desks and in their minds is of

  15. Criticality accident:

    International Nuclear Information System (INIS)

    Canavese, Susana I.

    2000-01-01

    A criticality accident occurred at 10:35 on September 30, 1999. It occurred in a precipitation tank in a Conversion Test Building at the JCO Tokai Works site in Tokaimura (Tokai Village) in the Ibaraki Prefecture of Japan. STA provisionally rated this accident a 4 on the seven-level, logarithmic International Nuclear Event Scale (INES). The September 30, 1999 criticality accident at the JCO Tokai Works Site in Tokaimura, Japan in described in preliminary, technical detail. Information is based on preliminary presentations to technical groups by Japanese scientists and spokespersons, translations by technical and non-technical persons of technical web postings by various nuclear authorities, and English-language non-technical reports from various news media and nuclear-interest groups. (author)

  16. Critical dynamics

    International Nuclear Information System (INIS)

    Dekker, H.

    1980-01-01

    It is shown how to solve the master equation for a Markov process including a critical point by means of successive approximations in terms of a small parameter. A critical point occurs if, by adjusting an externally controlled quantity, the system shows a transition from normal monostable to bistable behaviour. The fundamental idea of the theory is to separate the master equation into its proper irreducible part and a corrective remainder. The irreducible or zeroth order stochastic approximation will be a relatively simple Fokker-Planck equation that contains the essential features of the process. Once the solution of this irreducible equation is known, the higher order corrections in the original master equation can be incorporated in a systematic manner. (Auth.)

  17. Hybrid Microscopic-Endoscopic Surgery for Craniopharyngioma in Neurosurgical Suite: Technical Notes.

    Science.gov (United States)

    Ichikawa, Tomotsugu; Otani, Yoshihiro; Ishida, Joji; Fujii, Kentaro; Kurozumi, Kazuhiko; Ono, Shigeki; Date, Isao

    2016-01-01

    The best chance of curing craniopharyngioma is achieved by microsurgical total resection; however, its location adjacent to critical structures hinders complete resection without neurologic deterioration. Unrecognized residual tumor within microscopic blind spots might result in tumor recurrences. To improve outcomes, new techniques are necessary to visualize tissue within these blind spots. We examined the success of hybrid microscopic-endoscopic neurosurgery for craniopharyngioma in a neurosurgical suite. Four children with craniopharyngiomas underwent microscopic resection. When the neurosurgeon was confident that most of the visible tumor was removed but was suspicious of residual tumor within the blind spot, he or she used an integrated endoscope-holder system to inspect and remove any residual tumor. Two ceiling monitors were mounted side by side in front of the surgeon to display both microscopic and endoscopic views and to view both monitors simultaneously. Surgery was performed in all patients via the frontobasal interhemispheric approach. Residual tumors were observed in the sella (2 patients), on the ventral surface of the chiasm and optic nerve (1 patient), and in the third ventricle (1 patient) and were resected to achieve total resection. Postoperatively, visual function was improved in 2 patients and none exhibited deterioration related to the surgery. Simultaneous microscopic and endoscopic observation with the use of dual monitors in a neurosurgical suite was ergonomically optimal for the surgeon to perform microsurgical procedures and to avoid traumatizing surrounding vessels or neural tissues. Hybrid microscopic-endoscopic neurosurgery may contribute to safe, less-invasive, and maximal resection to achieve better prognosis in children with craniopharyngioma. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. A suite of benchmark and challenge problems for enhanced geothermal systems

    Energy Technology Data Exchange (ETDEWEB)

    White, Mark; Fu, Pengcheng; McClure, Mark; Danko, George; Elsworth, Derek; Sonnenthal, Eric; Kelkar, Sharad; Podgorney, Robert

    2017-11-06

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study, benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research, stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems were designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of

  19. Recent Enhancements in NOAA's JPSS Land Product Suite and Key Operational Applications

    Science.gov (United States)

    Csiszar, I. A.; Yu, Y.; Zhan, X.; Vargas, M.; Ek, M. B.; Zheng, W.; Wu, Y.; Smirnova, T. G.; Benjamin, S.; Ahmadov, R.; James, E.; Grell, G. A.

    2017-12-01

    A suite of operational land products has been produced as part of NOAA's Joint Polar Satellite System (JPSS) program to support a wide range of operational applications in environmental monitoring, prediction, disaster management and mitigation, and decision support. The Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (NPP) and the operational JPSS satellite series forms the basis of six fundamental and multiple additional added-value environmental data records (EDRs). A major recent improvement in the land-based VIIRS EDRs has been the development of global gridded products, providing a format and science content suitable for ingest into NOAA's operational land surface and coupled numerical weather prediction models. VIIRS near-real-time Green Vegetation Fraction is now in the process of testing for full operational use, while land surface temperature and albedo are under testing and evaluation. The operational 750m VIIRS active fire product, including fire radiative power, is used to support emission modeling and air quality applications. Testing the evaluation for operational NOAA implementation of the improved 375m VIIRS active fire product is also underway. Added-value and emerging VIIRS land products include vegetation health, phenology, near-real-time surface type and surface condition change, and other biogeophysical variables. As part of the JPSS program, a global soil moisture data product has also been generated from the Advanced Microwave Scanning Radiometer 2 (AMSR2) sensor on the GCOM-W1 (Global Change Observation Mission - Water 1) satellite since July 2012. This product is included in the blended NESDIS Soil Moisture Operational Products System, providing soil moisture data as a critical input for land surface modeling.

  20. A comprehensive software suite for protein family construction and functional site prediction.

    Directory of Open Access Journals (Sweden)

    David Renfrew Haft

    Full Text Available In functionally diverse protein families, conservation in short signature regions may outperform full-length sequence comparisons for identifying proteins that belong to a subgroup within which one specific aspect of their function is conserved. The SIMBAL workflow (Sites Inferred by Metabolic Background Assertion Labeling is a data-mining procedure for finding such signature regions. It begins by using clues from genomic context, such as co-occurrence or conserved gene neighborhoods, to build a useful training set from a large number of uncharacterized but mutually homologous proteins. When training set construction is successful, the YES partition is enriched in proteins that share function with the user's query sequence, while the NO partition is depleted. A selected query sequence is then mined for short signature regions whose closest matches overwhelmingly favor proteins from the YES partition. High-scoring signature regions typically contain key residues critical to functional specificity, so proteins with the highest sequence similarity across these regions tend to share the same function. The SIMBAL algorithm was described previously, but significant manual effort, expertise, and a supporting software infrastructure were required to prepare the requisite training sets. Here, we describe a new, distributable software suite that speeds up and simplifies the process for using SIMBAL, most notably by providing tools that automate training set construction. These tools have broad utility for comparative genomics, allowing for flexible collection of proteins or protein domains based on genomic context as well as homology, a capability that can greatly assist in protein family construction. Armed with this new software suite, SIMBAL can serve as a fast and powerful in silico alternative to direct experimentation for characterizing proteins and their functional interactions.

  1. Critical scattering

    International Nuclear Information System (INIS)

    Stirling, W.G.; Perry, S.C.

    1996-01-01

    We outline the theoretical and experimental background to neutron scattering studies of critical phenomena at magnetic and structural phase transitions. The displacive phase transition of SrTiO 3 is discussed, along with examples from recent work on magnetic materials from the rare-earth (Ho, Dy) and actinide (NpAs, NpSb, USb) classes. The impact of synchrotron X-ray scattering is discussed in conclusion. (author) 13 figs., 18 refs

  2. Application of the bomb radiocarbon chronometer to the validation of redfish Centroberyx affinis age

    International Nuclear Information System (INIS)

    Kalish, J.M.

    1995-01-01

    Validation of methods used to estimate fish age is a critical element of the fish stock assessment process. Despite the importance of validation, few procedures are available that provide unbiased estimates of true fish age and those methods that are available are seldom used. The majority of these methods are unlikely to provide an indication of the true age of individual fish, data that are best suited to the validation process. Accelerator mass spectrometry analyses of radiocarbon in selected regions of Centroberyx affinis otoliths were used to validate the age estimation method for this species. Radiocarbon data from the otoliths of C. affinis with presumed birth dates between 1955 and 1985 described the increase in ocean radiocarbon attributable to the atmospheric detonation of nuclear weapons in the 1950s and 1960s. The results confirm the longevity of C. affinis and demonstrate the effectiveness of the bomb radiocarbon chronometer for the validation of age-estimation methods. (author). 31 refs., 2 tabs., 1 fig

  3. Validation of software for calculating the likelihood ratio for parentage and kinship.

    Science.gov (United States)

    Drábek, J

    2009-03-01

    Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical as it directly weighs the forensic evidence allowing judges to decide on guilt or innocence or to identify person or kin (i.e.: in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios I assessed available vendors, chose two programs (Paternity Index and familias) for testing, and finally validated them using tests derived from elaboration of the available guidelines for the field of forensics, biomedicine, and software engineering. MS Excel calculation using known likelihood ratio formulas or peer-reviewed results of difficult paternity cases were used as a reference. Using seven testing cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of two software programs fulfills the criteria needed for our purpose in the whole spectrum of functions under validation with the exceptions of providing algebraic formulas in cases of mutation and/or silent allele.

  4. A suite of microsatellite markers optimized for amplification of DNA from Addax (Addax nasomaculatus) blood preserved on FTA cards.

    Science.gov (United States)

    Heim, Brett C; Ivy, Jamie A; Latch, Emily K

    2012-01-01

    The addax (Addax nasomaculatus) is a critically endangered antelope that is currently maintained in zoos through regional, conservation breeding programs. As for many captive species, incomplete pedigree data currently impedes the ability of addax breeding programs to confidently manage the genetics of captive populations and to select appropriate animals for reintroduction. Molecular markers are often used to improve pedigree resolution, thereby improving the long-term effectiveness of genetic management. When developing a suite of molecular markers, it is important to consider the source of DNA, as the utility of markers may vary across DNA sources. In this study, we optimized a suite of microsatellite markers for use in genotyping captive addax blood samples collected on FTA cards. We amplified 66 microsatellite loci previously described in other Artiodactyls. Sixteen markers amplified a single product in addax, but only 5 of these were found to be polymorphic in a sample of 37 addax sampled from a captive herd at Fossil Rim Wildlife Center in the US. The suite of microsatellite markers developed in this study provides a new tool for the genetic management of captive addax, and demonstrates that FTA cards can be a useful means of sample storage, provided appropriate loci are used in downstream analyses. © 2011 Wiley Periodicals, Inc.

  5. Philosophies Applied in the Selection of Space Suit Joint Range of Motion Requirements

    Science.gov (United States)

    Aitchison, Lindsway; Ross, Amy; Matty, Jennifer

    2009-01-01

    Space suits are the most important tool for astronauts working in harsh space and planetary environments; suits keep crewmembers alive and allow them to perform exploration, construction, and scientific tasks on a routine basis over a period of several months. The efficiency with which the tasks are performed is largely dictated by the mobility features of the space suit. For previous space suit development programs, the mobility requirements were written as pure functional mobility requirements that did not separate joint ranges of motion from the joint torques. The Constellation Space Suit Element has the goal to make more quantitative mobility requirements that focused on the individual components of mobility to enable future suit designers to build and test systems more effectively. This paper details the test planning and selection process for the Constellation space suit pressure garment range of motion requirements.

  6. Development of an advanced rocket propellant handler's suit

    Science.gov (United States)

    Doerr, D. F.

    2001-01-01

    Most launch vehicles and satellites in the US inventory rely upon the use of hypergolic rocket propellants, many of which are toxic to humans. These fuels and oxidizers, such as hydrazine and nitrogen tetroxide have threshold limit values as low as 0.01 PPM. It is essential to provide space workers handling these agents whole body protection as they are universally hazardous not only to the respiratory system, but the skin as well. This paper describes a new method for powering a whole body protective garment to assure the safety of ground servicing crews. A new technology has been developed through the small business innovative research program at the Kennedy Space Center. Currently, liquid air is used in the environmental control unit (ECU) that powers the propellant handlers suit (PHE). However, liquid air exhibits problems with attitude dependence, oxygen enrichment, and difficulty with reliable quantity measurement. The new technology employs the storage of the supply air as a supercritical gas. This method of air storage overcomes all of three problems above while maintaining high density storage at relatively low vessel pressures (rights reserved.

  7. Advanced Space Suit Portable Life Support Subsystem Packaging Design

    Science.gov (United States)

    Howe, Robert; Diep, Chuong; Barnett, Bob; Thomas, Gretchen; Rouen, Michael; Kobus, Jack

    2006-01-01

    This paper discusses the Portable Life Support Subsystem (PLSS) packaging design work done by the NASA and Hamilton Sundstrand in support of the 3 future space missions; Lunar, Mars and zero-g. The goal is to seek ways to reduce the weight of PLSS packaging, and at the same time, develop a packaging scheme that would make PLSS technology changes less costly than the current packaging methods. This study builds on the results of NASA s in-house 1998 study, which resulted in the "Flex PLSS" concept. For this study the present EMU schematic (low earth orbit) was used so that the work team could concentrate on the packaging. The Flex PLSS packaging is required to: protect, connect, and hold the PLSS and its components together internally and externally while providing access to PLSS components internally for maintenance and for technology change without extensive redesign impact. The goal of this study was two fold: 1. Bring the advanced space suit integrated Flex PLSS concept from its current state of development to a preliminary design level and build a proof of concept mockup of the proposed design, and; 2. "Design" a Design Process, which accommodates both the initial Flex PLSS design and the package modifications, required to accommodate new technology.

  8. STS-95 Mission Specialist Duque suits up during TCDT

    Science.gov (United States)

    1998-01-01

    STS-95 Mission Specialist Pedro Duque of Spain, representing the European Space Agency, suits up in the Operations and Checkout Building prior to his trip to Launch Pad 39-B. Duque and the rest of the STS-95 crew are at KSC to participate in the Terminal Countdown Demonstration Test (TCDT) which includes mission familiarization activities, emergency egress training, and a simulated main engine cutoff. The other crew members are Payload Specialist Chiaki Mukai (M.D., Ph.D.), representing the National Space Development Agency of Japan (NASDA), Pilot Steven W. Lindsey, Mission Specialist Scott E. Parazynski, Mission Specialist Stephen K. Robinson, Payload Specialist John H. Glenn Jr., senator from Ohio, and Mission Commander Curtis L. Brown. The STS-95 mission, targeted for liftoff on Oct. 29, includes research payloads such as the Spartan solar-observing deployable spacecraft, the Hubble Space Telescope Orbital Systems Test Platform, the International Extreme Ultraviolet Hitchhiker, as well as the SPACEHAB single module with experiments on space flight and the aging process. Following the TCDT, the crew will be returning to Houston for final flight preparations.

  9. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy to use, analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
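
    As a concrete illustration of one building block in such a workflow, the sketch below runs a per-feature Kruskal-Wallis test across two treatment groups on a simulated wide-format metabolomics matrix and applies a Benjamini-Hochberg adjustment. The data, group sizes, and thresholds are invented; this is not SECIMTools code.

        import numpy as np
        from scipy.stats import kruskal

        rng = np.random.default_rng(0)
        n_features, n_per_group = 50, 8
        control = rng.lognormal(0.0, 0.5, size=(n_features, n_per_group))
        treated = rng.lognormal(0.4, 0.5, size=(n_features, n_per_group))

        pvals = np.array([kruskal(control[i], treated[i]).pvalue
                          for i in range(n_features)])

        # Benjamini-Hochberg step-up adjustment to control the false discovery rate.
        order = np.argsort(pvals)
        scaled = pvals[order] * n_features / (np.arange(n_features) + 1)
        adjusted = np.minimum.accumulate(scaled[::-1])[::-1]
        print(f"{(adjusted < 0.05).sum()} of {n_features} features pass FDR 0.05")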

  10. Astronaut Anna Fisher Suits Up for NBS Training

    Science.gov (United States)

    1980-01-01

    The Hubble Space Telescope (HST) is a cooperative program of the European Space Agency (ESA) and the National Aeronautics and Space Administration (NASA) to operate a long-lived space-based observatory. It was the flagship mission of NASA's Great Observatories program. The HST program began as an astronomical dream in the 1940s. During the 1970s and 1980s, the HST was finally designed and built, becoming operational in the 1990s. The HST was deployed into a low-Earth orbit on April 25, 1990 from the cargo bay of the Space Shuttle Discovery (STS-31). The design of the HST took into consideration its length of service and the necessity of repairs and equipment replacement by making the body modular, so that subsequent shuttle missions could recover the HST, replace faulty or obsolete parts, and re-release it. Marshall Space Flight Center's (MSFC's) Neutral Buoyancy Simulator (NBS) served as the test center for shuttle astronauts training for Hubble related missions. Shown is astronaut Anna Fisher suiting up for training on a mockup of a modular section of the HST for an axial scientific instrument changeout.

  11. Astronaut Anna Fisher Suiting Up For NBS Training

    Science.gov (United States)

    1980-01-01

    The Hubble Space Telescope (HST) is a cooperative program of the European Space Agency (ESA) and the National Aeronautics and Space Administration (NASA) to operate a long-lived space-based observatory. It was the flagship mission of NASA's Great Observatories program. The HST program began as an astronomical dream in the 1940s. During the 1970s and 1980s, the HST was finally designed and built, becoming operational in the 1990s. The HST was deployed into a low-Earth orbit on April 25, 1990 from the cargo bay of the Space Shuttle Discovery (STS-31). The design of the HST took into consideration its length of service and the necessity of repairs and equipment replacement by making the body modular, so that subsequent shuttle missions could recover the HST, replace faulty or obsolete parts, and re-release it. Marshall Space Flight Center's (MSFC's) Neutral Buoyancy Simulator (NBS) served as the test center for shuttle astronauts training for Hubble related missions. Shown is astronaut Anna Fisher suiting up for training on a mockup of a modular section of the HST for an axial scientific instrument changeout.

  12. Astronaut Anna Fisher Suited Up For NBS Training

    Science.gov (United States)

    1980-01-01

    The Hubble Space Telescope (HST) is a cooperative program of the European Space Agency (ESA) and the National Aeronautics and Space Administration (NASA) to operate a long-lived space-based observatory. It was the flagship mission of NASA's Great Observatories program. The HST program began as an astronomical dream in the 1940s. During the 1970s and 1980s, the HST was finally designed and built, becoming operational in the 1990s. The HST was deployed into a low-Earth orbit on April 25, 1990 from the cargo bay of the Space Shuttle Discovery (STS-31). The design of the HST took into consideration its length of service and the necessity of repairs and equipment replacement by making the body modular, so that subsequent shuttle missions could recover the HST, replace faulty or obsolete parts, and re-release it. Marshall Space Flight Center's (MSFC's) Neutral Buoyancy Simulator (NBS) served as the test center for shuttle astronauts training for Hubble related missions. Shown is astronaut Anna Fisher suited up for training on a mockup of a modular section of the HST for an axial scientific instrument changeout.

  13. Development of an advanced rocket propellant handler's suit

    Science.gov (United States)

    Doerr, Donald F.

    2001-08-01

    Most launch vehicles and satellites in the US inventory rely upon the use of hypergolic rocket propellants, many of which are toxic to humans. These fuels and oxidizers, such as hydrazine and nitrogen tetroxide, have threshold limit values as low as 0.01 PPM. It is essential to provide space workers handling these agents with whole body protection, as these agents are hazardous not only to the respiratory system but to the skin as well. This paper describes a new method for powering a whole body protective garment to assure the safety of ground servicing crews. A new technology has been developed through the Small Business Innovative Research program at the Kennedy Space Center. Currently, liquid air is used in the environmental control unit (ECU) that powers the propellant handler's ensemble (PHE). However, liquid air exhibits problems with attitude dependence, oxygen enrichment, and difficulty with reliable quantity measurement. The new technology employs the storage of the supply air as a supercritical gas. This method of air storage overcomes all three of the problems above while maintaining high density storage at relatively low vessel pressures. The protective ensemble marked an advancement in the state-of-the-art in personal protective equipment. Not only was long duration environmental control provided, but it was done without a high pressure vessel. The unit met human performance needs for attitude independence, oxygen stability, and relief of heat stress. This supercritical air (and oxygen) technology is suggested for microgravity applications in life support such as the Extravehicular Mobility Unit.

  14. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
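
    For readers unfamiliar with the underlying model, below is a minimal sketch of the isothermal, one-dimensional radiative transfer solution of the kind myXCLASS evaluates, T_B = (J(T_ex) - J(T_bg)) (1 - exp(-tau)); the Gaussian opacity profile and all parameter values are illustrative assumptions, not XCLASS's actual implementation.

        import numpy as np

        H_OVER_K = 4.7992e-11          # Planck constant over Boltzmann constant, K*s

        def j_nu(temp_k, nu_hz):
            """Radiation temperature of a blackbody at temp_k and frequency nu_hz."""
            return H_OVER_K * nu_hz / np.expm1(H_OVER_K * nu_hz / temp_k)

        nu0 = 230.538e9                           # assumed line rest frequency, Hz
        nu = nu0 + np.linspace(-5e6, 5e6, 201)    # frequency axis around the line
        tau = 2.0 * np.exp(-0.5 * ((nu - nu0) / 1.0e6) ** 2)   # Gaussian opacity profile
        t_ex, t_bg = 50.0, 2.73                   # excitation and background temperatures, K

        t_b = (j_nu(t_ex, nu) - j_nu(t_bg, nu)) * -np.expm1(-tau)   # (1 - exp(-tau)) factor
        print(f"peak brightness temperature: {t_b.max():.2f} K")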

  15. California court says disability benefits do not preclude suit.

    Science.gov (United States)

    1998-05-01

    A California appeals court reversed a lower court decision barring a worker from pursuing an HIV discrimination claim against his employer. [Name removed] claims that [name removed] violated California's Fair Employment and Housing Act when it rescinded accommodations that the bank had made earlier for HIV-related medical needs. The accommodations included a compressed work week and one day of telecommuting per week, which [name removed] performed well enough to earn a promotion. With a change in management, the accommodations were canceled, ostensibly to control costs. The lower court ruled that [name removed] was barred from suing his former employer because of statements on his disability insurance application. However, the appeals court ruled that [name removed]'s statements on the form were honest and did not preclude him from future litigation. Myron Quon, an attorney with Lambda Legal Defense and Education Fund in Los Angeles, noted that [name removed]'s deft handling of the questions was vital to the success of the suit. [Name removed] had made comments and notations on the form, rather than just checking the appropriate yes or no boxes, and noted that he could return to work with a reasonable accommodation. Others applying for disability are cautioned to do the same to preserve their legal rights.

  16. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too do the size of the data produced and the difficulty of analyzing those data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
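
    To make this kind of computation concrete, here is a brute-force radial distribution function in plain NumPy for random points with open boundaries; Freud's own RDF uses fast, parallel C++ and handles periodic boxes, so treat this only as a sketch of what the method measures.

        import numpy as np

        rng = np.random.default_rng(1)
        box, n = 20.0, 500
        pts = rng.uniform(0, box, size=(n, 3))            # ideal-gas-like points

        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        d = d[np.triu_indices(n, k=1)]                    # unique pair distances

        r_max, nbins = 5.0, 50
        hist, edges = np.histogram(d, bins=nbins, range=(0, r_max))
        r = 0.5 * (edges[1:] + edges[:-1])
        shell = 4 * np.pi * r**2 * (edges[1] - edges[0])  # spherical shell volumes
        g = hist / (shell * (n / box**3) * n / 2)         # normalize per pair
        print(g[:5])                                      # ~1 for uncorrelated points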

  17. A Coupled Calculation Suite for Atucha II Operational Transients Analysis

    International Nuclear Information System (INIS)

    Mazzantini, O.; Schivo, M.; Cesare, J.D.; Garbero, R.; Rivero, M.; Theler, G.

    2011-01-01

    While more than a decade ago reactor and thermal hydraulic calculations were tedious and often needed a lot of approximations and simplifications that forced the designers to take a very conservative approach, computational resources available nowadays allow engineers to cope with increasingly complex problems in a reasonable time. The use of best-estimate calculations provides tools to justify convenient engineering margins, reduces costs, and maximises economic benefits. In this direction, a suite of coupled best-estimate specific calculation codes was developed to analyse the behaviour of the Atucha II nuclear power plant in Argentina. The developed tool includes three-dimensional spatial neutron kinetics, a channel-level model of the core thermal hydraulics with subcooled boiling correlations, a one-dimensional model of the primary and secondary circuits including pumps, steam generators, heat exchangers, and the turbine with all their associated control loops, and a complete simulation of the reactor control, limitation, and protection system working in closed-loop conditions as a faithful representation of the real power plant. In the present paper, a description of the coupling scheme between the codes involved is given, and some examples of their application to Atucha II are shown.
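
    The flavor of such a coupling can be conveyed with a toy model (not the Atucha II codes): point kinetics with one delayed-neutron group, explicitly coupled to a one-node fuel temperature model with Doppler feedback; every parameter value below is invented for illustration.

        beta, lam, LAM = 0.0065, 0.08, 1e-3   # delayed fraction, precursor decay, generation time
        alpha_d = -2e-5                       # Doppler feedback coefficient, 1/K
        h, c = 0.05, 1.0                      # heat removal and heat capacity (arbitrary units)
        dt, t_end = 1e-3, 20.0

        p = 1.0                               # normalized power
        prec = beta * p / (lam * LAM)         # equilibrium precursor concentration
        t_fuel = t0 = p / h                   # equilibrium fuel temperature
        rho_ext = 0.0005                      # small external reactivity insertion

        for _ in range(int(t_end / dt)):
            rho = rho_ext + alpha_d * (t_fuel - t0)     # feedback closes the loop
            p += ((rho - beta) / LAM * p + lam * prec) * dt
            prec += (beta / LAM * p - lam * prec) * dt
            t_fuel += (p - h * t_fuel) / c * dt         # crude one-node thermal model
        print(f"power {p:.3f}, fuel temperature rise {t_fuel - t0:.1f} K")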

  18. STS-93 Commander Collins suits up before launch

    Science.gov (United States)

    1999-01-01

    In the Operations and Checkout Building, STS-93 Commander Eileen M. Collins gets help donning her launch and entry suit. After Space Shuttle Columbia's July 20 launch attempt was scrubbed at the T-7 second mark in the countdown, the launch was rescheduled for Thursday, July 22, at 12:28 a.m. EDT. The target landing date is July 26, 1999, at 11:24 p.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The new telescope is 20 to 50 times more sensitive than any previous X-ray telescope and is expected to unlock the secrets of supernovae, quasars and black holes. The STS-93 crew numbers five: Commander Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  19. STS-93 Commander Collins waves after suiting up before launch

    Science.gov (United States)

    1999-01-01

    During final launch preparations in the Operations and Checkout Building, STS-93 Commander Eileen M. Collins waves after donning her launch and entry suit. After Space Shuttle Columbia's July 20 launch attempt was scrubbed at the T-7 second mark in the countdown, the launch was rescheduled for Thursday, July 22, at 12:28 a.m. EDT. The target landing date is July 26, 1999, at 11:24 p.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The new telescope is 20 to 50 times more sensitive than any previous X-ray telescope and is expected to unlock the secrets of supernovae, quasars and black holes. The STS-93 crew numbers five: Commander Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Stephen A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a shuttle mission.

  20. The possession law suit, caused by forbidden immissions

    Directory of Open Access Journals (Sweden)

    Popov Danica D.

    2015-01-01

    Full Text Available In the Serbian law, as in most jurisdictions, there are limits on the exercise of property rights. A real estate owner must refrain from activities that impede the use of other real estate through the emission of excessive gases, vapors, smoke, heat, noise, tremors, etc. The owner whose property is affected by immissions exceeding the set limits has the right to demand that the immissions be reduced to the allowed level. In this article the author describes the various kinds of immissions. The general feature of this type of law suit is that only questions of fact, and not questions of law, are in dispute. Subject matter jurisdiction for the resolution of such disputes belongs to the court of general jurisdiction, and the dispute itself is litigated. The special procedural rules in actions for disturbance of possession are: the provisional nature of possessory protection; urgency in proceedings; initiation of proceedings; limitation of objections; the prescribing of temporary measures; the rendering of rulings in the form of orders; appeals, which must be filed within a short deadline and which have no suspensive effect (they do not delay execution of the order); the inadmissibility of revision; etc.

  1. The JPEG XT suite of standards: status and future plans

    Science.gov (United States)

    Richter, Thomas; Bruylants, Tim; Schelkens, Peter; Ebrahimi, Touradj

    2015-09-01

    The JPEG standard has seen enormous market adoption. Daily, billions of pictures are created, stored, and exchanged in this format. The JPEG committee acknowledges this success and continues its efforts to maintain and expand the standard's specifications. JPEG XT is a standardization effort targeting the extension of JPEG features by enabling support for high dynamic range imaging, lossless and near-lossless coding, and alpha channel coding, while also guaranteeing backward and forward compatibility with the JPEG legacy format. This paper gives an overview of the current status of the JPEG XT standards suite. It discusses the JPEG legacy specification and details how higher dynamic range support is facilitated both for integer and floating-point color representations. The paper shows how JPEG XT's support for lossless and near-lossless coding of low and high dynamic range images is achieved in combination with backward compatibility to JPEG legacy. In addition, the extensible box-based JPEG XT file format, on which all following and future extensions of JPEG will be based, is introduced. This paper also details how lossy and lossless representations of alpha channels are supported to allow coding of transparency information and arbitrarily shaped images. Finally, we conclude by giving prospects on the upcoming JPEG standardization initiative JPEG Privacy & Security, and a number of other possible extensions in JPEG XT.

  2. Improvements to the APBS biomolecular solvation software suite.

    Science.gov (United States)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package, including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.
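
    As a minimal sketch of the problem class APBS addresses (not APBS itself), the code below solves the linearized Poisson-Boltzmann (Debye-Hueckel) equation phi'' = kappa^2 * phi by finite differences in one dimension next to a charged wall and compares it with the analytic screened potential; units and boundary values are arbitrary assumptions.

        import numpy as np

        kappa, phi0 = 1.0, 1.0        # inverse Debye length, wall potential (reduced units)
        n, L = 200, 10.0
        x = np.linspace(0, L, n)
        h = x[1] - x[0]

        A = np.zeros((n, n))
        b = np.zeros(n)
        A[0, 0] = A[-1, -1] = 1.0     # Dirichlet ends: phi(0) = phi0, phi(L) = 0
        b[0] = phi0
        for i in range(1, n - 1):
            A[i, i - 1] = A[i, i + 1] = 1.0 / h**2
            A[i, i] = -2.0 / h**2 - kappa**2

        phi = np.linalg.solve(A, b)
        print(f"max error vs analytic: {np.abs(phi - phi0 * np.exp(-kappa * x)).max():.2e}")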

  3. Critical Mass

    CERN Multimedia

    AUTHOR|(CDS)2070299

    2017-01-01

    Critical Mass is a cycling event typically held on the last Friday of every month; its purpose is not usually formalized beyond the direct action of meeting at a set location and time and traveling as a group through city or town streets on bikes. The event originated in 1992 in San Francisco; by the end of 2003, the event was being held in over 300 cities around the world. At CERN it is held once a year in conjunction with the national Swiss campaign "Bike to work".

  4. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
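
    A small worked example of code verification in the spirit the paper recommends, using an exact analytical solution: choose u(x) = sin(pi x), derive the forcing f = pi^2 sin(pi x) for -u'' = f, and confirm the solver's observed order of accuracy under grid refinement. The problem and solver here are assumed for illustration, not taken from the paper.

        import numpy as np

        def poisson_error(n):
            """Second-order finite differences for -u'' = f on (0,1), u(0) = u(1) = 0."""
            x = np.linspace(0, 1, n)
            h = x[1] - x[0]
            f = np.pi**2 * np.sin(np.pi * x)
            A = (np.diag(np.full(n - 2, 2.0)) - np.diag(np.ones(n - 3), 1)
                 - np.diag(np.ones(n - 3), -1)) / h**2
            u = np.zeros(n)
            u[1:-1] = np.linalg.solve(A, f[1:-1])
            return np.abs(u - np.sin(np.pi * x)).max()

        e_coarse, e_fine = poisson_error(41), poisson_error(81)
        print(f"observed order of accuracy ~ {np.log2(e_coarse / e_fine):.2f}")  # expect ~2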

  5. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  6. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  7. VOLCWORKS: A suite for optimization of hazards mapping

    Science.gov (United States)

    Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.

    2012-04-01

    Making hazards maps is a process linking basic science, applied science, and engineering for the benefit of society. The methodologies for constructing hazards maps have evolved enormously, together with the tools that allow forecasting of the behavior of the materials produced by different eruptive processes. However, in spite of the development of tools and the evolution of methodologies, the utility of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integrating different simulation tools for the different processes of a single volcano is a challenge, one to be solved with software that combines processing, simulation, and visualization techniques and data structures into a suite that supports the map construction process, from the integration of geological data and simulations to the simplification of the output needed to design a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. The workflow from data collection, quality control, and preparation for simulations to an appropriate visual presentation is usually disconnected, in most cases relying on different applications for each of the needed steps, because it requires many tools that were not built to solve one specific problem or that were developed by research groups to solve particular tasks in isolation. In volcanology, due to its complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, and ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially for communication of results to the end users: decision makers and planners. In order to solve this problem and connect different parts of a workflow we are developing the

  8. Developing defensive aids suite technology on a virtual battlefield

    Science.gov (United States)

    Rapanotti, John L.; DeMontigny-Leboeuf, Annie; Palmarini, Marc; Cantin, Andre

    2002-07-01

    Modern anti-tank missiles and the requirement of rapid deployment are limiting the use of passive armour in protecting land vehicles. Vehicle survivability is becoming more dependent on sensors, computers, and countermeasures to detect and avoid threats. The integration of various technologies into a Defensive Aids Suite (DAS) can be designed and analyzed by combining field trials and laboratory data with modeling and simulation. MATLAB is used as a quick prototyping tool to model DAS systems and facilitate transfer to other researchers. The DAS model can be transferred from MATLAB or programmed directly in ModSAF (Modular Semi-Automated Forces), which is used to construct the virtual battlefield. Through scripted input files, a fixed-battle approach ensures that implementation and analysis meet the requirements of three different communities: scientists and engineers, the military, and operations researchers. This approach ensures the modelling of processes known to be important regardless of the level of information available about the system; a system can be modelled phenomenologically until more information is available. Further processing of the simulation can be used to optimize the vehicle for a specific mission. ModSAF will be used to analyze and plan trials and to develop DAS technology for future vehicles. Survivability of a DAS-equipped vehicle can be assessed relative to a basic vehicle without a DAS. In later stages, more complete DAS systems will be analyzed to determine the optimum configuration of the DAS components and the effectiveness of a DAS-equipped vehicle for specific missions. These concepts and this approach are discussed in the paper.

  9. BioWord: A sequence manipulation suite for Microsoft Word

    Directory of Open Access Journals (Sweden)

    Anzaldi Laura J

    2012-06-01

    Full Text Available Background: The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results: BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions: BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.

  10. BioWord: A sequence manipulation suite for Microsoft Word

    Science.gov (United States)

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  11. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", reduces the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and on the pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle; an accurate ballistic coefficient is crucial to accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, the object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
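
    A condensed sketch of the first two steps of an "Automated RSM"-style pipeline, under stated assumptions: Latin hypercube sampling of a parameter space followed by a Gaussian-process fit of an expensive response. The cheap analytic test function stands in for a TPMC drag-coefficient run; nothing here is the tool suite's actual code.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_model(x):            # placeholder for a TPMC simulation
            return np.sin(3 * x[:, 0]) + 0.5 * np.cos(2 * x[:, 1])

        sampler = qmc.LatinHypercube(d=2, seed=0)
        X = sampler.random(n=64)           # 64 ensemble members in [0, 1]^2
        y = expensive_model(X)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X, y)
        mean, std = gp.predict(np.array([[0.25, 0.75]]), return_std=True)
        print(f"surrogate prediction {mean[0]:.3f} +/- {std[0]:.3f}")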

  12. LANDFIRE Remap: A New National Baseline Product Suite

    Science.gov (United States)

    Dockter, D.; Peterson, B.; Picotte, J. J.; Long, J.; Tolk, B.; Callahan, K.; Davidson, A.; Earnhardt, T.

    2017-12-01

    LANDFIRE, also known as the Landscape Fire and Resource Management Planning Tools Program, is a vegetation, fire, and fuel characteristic data creation program managed by both the U.S. Department of Agriculture Forest Service and the U.S. Department of the Interior with involvement from The Nature Conservancy. LANDFIRE represents the first and only complete, nationally consistent collection of over 20 geo-spatial layers (e.g., vegetation type and structure, fuels, fire regimes), databases, and ecological models that can be used across multiple disciplines to support cross-boundary planning, management, and operations across all lands of the United States and insular areas. Since 2004, LANDFIRE has produced comprehensive, consistent, and scientifically based suites of mapped products and associated databases for the United States and affiliated territories. These products depict the nation's major ecosystems and wildlife habitats. Over a decade has passed since the development of the first LANDFIRE base map, and an overhaul of the data products, i.e., a "Remap", is needed to maintain their functionality and relevance. To prepare for Remap production LANDFIRE has invested in a prototyping phase that focused on exploring various input data sources and new modeling and mapping techniques. While still grounded in a solid base consisting of Landsat imagery and high-quality field observations, the prototyping efforts explored different image compositing techniques, the integration of lidar data, modeling approaches as well as other factors that will inform Remap production. Several of these various research efforts are highlighted here and are currently being integrated into an end-to-end data processing flow that will drive the Remap production. The current Remap prototype effort has focused on several study areas throughout CONUS, with additional studies anticipated for Alaska, Hawaii and the territories. The LANDFIRE Remap effort is expected to take three to four years

  13. BioWord: a sequence manipulation suite for Microsoft Word.

    Science.gov (United States)

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
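
    BioWord itself is VBA living inside Word; as a language-neutral illustration of the kind of everyday manipulation it bundles (an analogy, not BioWord's API), here is a reverse complement and GC-content computation in a few lines of Python.

        COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

        def reverse_complement(seq: str) -> str:
            """Complement each base, then reverse the strand."""
            return seq.translate(COMPLEMENT)[::-1]

        def gc_content(seq: str) -> float:
            """Fraction of G and C bases in the sequence."""
            s = seq.upper()
            return (s.count("G") + s.count("C")) / len(s)

        seq = "ATGCGTACGTTAG"
        print(reverse_complement(seq))       # CTAACGTACGCAT
        print(f"GC = {gc_content(seq):.2%}")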

  14. Interaction of Space Suits with Windblown Soil: Preliminary Mars Wind Tunnel Results

    Science.gov (United States)

    Marshall, J.; Bratton, C.; Kosmo, J.; Trevino, R.

    1999-09-01

    Experiments in the Mars Wind Tunnel at NASA Ames Research Center show that under Mars conditions, spacesuit materials are highly susceptible to dust contamination when exposed to windblown soil. This effect was suspected from knowledge of the interaction of electrostatically adhesive dust with solid surfaces in general. However, it is important to evaluate the respective roles of materials, meteorological and radiation effects, and the character of the soil. The tunnel permits evaluation of dust contamination and sand abrasion of space suits by simulating both pressure and wind conditions on Mars. The long-term function of space suits on Mars will be primarily threatened by dust contamination. Lunar EVA activities caused heavy contamination of space suits, but the problem was never seriously manifest because of the brief utilization of the suits, and the suits were never reused. Electrostatically adhering dust grains have various detrimental effects: (1) penetration and subsequent wear of suit fabrics, (2) viewing obscuration through visors and scratching/pitting of visor surfaces, (3) penetration, wear, and subsequent seizing-up of mechanical suit joints, (4) changes in albedo and therefore of radiation properties of external heat-exchanger systems, (5) changes in electrical conductivity of suit surfaces which may affect tribocharging of suits and create spurious discharge effects detrimental to suit electronics/radio systems. Additional information is contained in the original.

  15. Performance of MPI parallel processing implemented by MCNP5/ MCNPX for criticality benchmark problems

    International Nuclear Information System (INIS)

    Mark Dennis Usang; Mohd Hairie Rabir; Mohd Amin Sharifuldin Salleh; Mohamad Puad Abu

    2012-01-01

    MPI parallelism is implemented on a SUN workstation for running MCNPX and on the High Performance Computing Facility (HPC) for running MCNP5. Twenty-three input files obtained from the MCNP Criticality Validation Suite are utilized for the purpose of evaluating the amount of speedup achievable by using the parallel capabilities of MPI. More importantly, we study the economics of using more processors and the types of problem for which the performance gains are obvious. This is important to enable better practices of resource sharing, especially of the HPC facility's processing time. Future endeavours in this direction might even reveal clues for best MCNP5/MCNPX coding practices for optimum performance of MPI parallelism. (author)
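
    The processor-count economics the authors investigate are conventionally framed with Amdahl's law; the sketch below uses an assumed 95% parallelizable fraction (not measured MCNP data) to show why efficiency falls as processors are added.

        def amdahl_speedup(p_parallel: float, n_procs: int) -> float:
            """Ideal speedup for a workload with the given parallelizable fraction."""
            return 1.0 / ((1.0 - p_parallel) + p_parallel / n_procs)

        for n in (1, 2, 4, 8, 16, 32, 64):
            s = amdahl_speedup(0.95, n)      # assume 95% of the work parallelizes
            print(f"{n:3d} procs: speedup {s:5.2f}, efficiency {s / n:5.1%}")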

  16. Dictionary criticism

    DEFF Research Database (Denmark)

    Nielsen, Sandro

    2018-01-01

    Dictionary criticism is part of the lexicographical universe, and the reviewing of electronic and printed dictionaries is not an exercise in linguistics or in subject fields but an exercise in lexicography. It does not follow from this that dictionary reviews should not be based on a linguistic approach, but the linguistic approach is only one of several approaches to dictionary reviewing. Similarly, the linguistic and factual competences of reviewers should not be relegated to an insignificant position in the review process. Moreover, reviewers should define the object of their reviews, the dictionary, as a complex information tool with several components and in terms of significant lexicographical features: lexicographical functions, data and structures. This emphasises the fact that dictionaries are much more than mere vessels of linguistic categories, namely lexicographical tools that have been developed to fulfil

  17. Critical theory and holocaust

    Directory of Open Access Journals (Sweden)

    Krstić Predrag

    2006-01-01

    Full Text Available In this paper the author attempts to establish the relationship - or the lack of one - between Critical Theory and the "Jewish question", and to ask whether it is justified to perceive signs of Jewish religious heritage in the thought of the representatives of this movement. The Holocaust, marked out by the name "Auschwitz", is examined as the point at which the nature of this relationship was decided. In this encounter with the cardinal challenge for contemporary social theory, the particularity of the Frankfurt School's reaction is revealed through Adorno's installation of Auschwitz as the unexpected but lawful emblem of the ending of the course that modern history had assumed. The critique of this "fascination" with Auschwitz, as well as a certain theoretical pacification and measured positioning of the Holocaust within the discontinued plane of an "unfinished" project to be continued and completed, is presented through the communication-theoretical reorientation of Jürgen Habermas's Critical Theory and that of his followers. Finally, through the work of Detlev Claussen, it is suggested that in the youngest generation of Adorno's students there are signs of a revision of the once already revised Critical Theory and a kind of refracted and differentiated return to the initial understanding of the decisiveness of the Holocaust experience. This shift in the attitude of Critical Theory thinkers toward the provocation of the Holocaust is, however, reflected not so much in the status of Jews and their tradition as in the age-old questioning and explanatory patterns for which they served as a "model". The validity of the Enlightenment project, the nature of occidental rationalism, the (non)existence of historical theology, and the understanding of identity and emancipation describe the circle of problems around which disagreement is concentrated in critical social theory.

  18. Using "The Burns Suite" as a Novel High Fidelity Simulation Tool for Interprofessional and Teamwork Training.

    Science.gov (United States)

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2016-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. The authors recently published the concept of "The Burns Suite" (TBS) as a novel tool to advance the delivery of burns education for residents/clinicians. Effectively, TBS represents a low-cost, high-fidelity, portable, immersive simulation environment. Recently, simulation-based team training (SBTT) has been advocated as a means to improve interprofessional practice. The authors aimed to explore the role of TBS in SBTT. A realistic pediatric burn resuscitation scenario was designed based on "advanced trauma and life support" and "emergency management of severe burns" principles, refined utilizing expert opinion through cognitive task analysis. The focus of this analysis was on the nontechnical and interpersonal skills of clinicians and nurses within the scenario, mirroring what happens in real life. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's alpha was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis, allowing for data triangulation. Twenty-two participants completed the TBS resuscitation scenario. Mean face and content validity ratings were high (4.4 and 4.7, respectively; range 4-5). The internal consistency of the questions was high. Qualitative data analysis revealed two new themes. Participants reported that the experience felt particularly authentic because the simulation had high psychological and social fidelity, and there was a demand for such a facility to be made available to improve nontechnical skills and interprofessional relations. TBS provides a realistic, novel tool for SBTT, addressing both nontechnical and interprofessional team skills. Recreating clinical challenge is crucial to optimize SBTT. With a better understanding of the theories underpinning simulation and interprofessional education, future simulation scenarios can be designed to provide
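
    For reference, the scale-reliability statistic mentioned above, Cronbach's alpha, is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score) for k items; the sketch below applies it to invented 5-point Likert responses, not to the study's data.

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: respondents x questions matrix of scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        responses = np.array([[4, 5, 4, 4], [5, 5, 5, 4], [3, 4, 4, 3],
                              [4, 4, 5, 4], [5, 5, 4, 5], [4, 4, 4, 4]])
        print(f"alpha = {cronbach_alpha(responses):.2f}")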

  19. Technical Note: DIRART- A software suite for deformable image registration and adaptive radiotherapy research

    International Nuclear Information System (INIS)

    Yang Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.

    2011-01-01

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse-consistency algorithms that provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a good number of options for DIR result visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.
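
    The inverse-consistency property mentioned above can be illustrated in one dimension (a sketch, not DIRART's MATLAB code): composing a forward displacement field with its numerically inverted backward field should return every point to its start, and the residual of that composition is the inverse-consistency error.

        import numpy as np

        x = np.linspace(0, 1, 101)
        forward = 0.05 * np.sin(2 * np.pi * x)       # forward displacement field u(x)

        # Invert u numerically to get the backward field v on the regular grid,
        # then measure the residual of x -> x + u(x) -> x + u(x) + v(x + u(x)).
        warped = x + forward                         # monotone for this small u
        backward = np.interp(x, warped, -forward)
        residual = forward + np.interp(x + forward, x, backward)
        print(f"max inverse-consistency error: {np.abs(residual).max():.2e}")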

  20. Coding practice of the Journal Article Tag Suite extensible markup language

    Directory of Open Access Journals (Sweden)

    Sun Huh

    2014-08-01

    Full Text Available In general, the Journal Article Tag Suite (JATS extensible markup language (XML coding is processed automatically by an XML filtering program. In this article, the basic tagging in JATS is explained in terms of coding practice. A text editor that supports UTF-8 encoding is necessary to input JATS XML data that works in every language. Any character representable in Unicode can be used in JATS XML, and commonly available web browsers can be used to view JATS XML files. JATS XML files can refer to document type definitions, extensible stylesheet language files, and cascading style sheets, but they must specify the locations of those files. Tools for validating JATS XML files are available via the web sites of PubMed Central and ScienceCentral. Once these files are uploaded to a web server, they can be accessed from all over the world by anyone with a browser. Encoding an example article in JATS XML may help editors in deciding on the adoption of JATS XML.
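
    To make the tagging concrete, here is a minimal hand-typed JATS fragment parsed with Python's standard library; the element names (article, front, article-meta, title-group, article-title) are standard JATS tags, while the sample values are invented.

        import xml.etree.ElementTree as ET

        jats = """<article>
          <front>
            <article-meta>
              <title-group>
                <article-title>Coding practice of JATS XML</article-title>
              </title-group>
            </article-meta>
          </front>
          <body><p>Any Unicode character can appear here.</p></body>
        </article>"""

        root = ET.fromstring(jats)
        print(root.find("./front/article-meta/title-group/article-title").text)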

  1. Assessment of Protective Gloves for Use with Airfed Suits.

    Science.gov (United States)

    Millard, Claire E; Vaughan, Nicholas P

    2015-10-01

    Gloves are often needed for hand protection at work, but they can impair manual dexterity, especially if they are multilayered or ill-fitting. This article describes two studies of gloves to be worn with airfed suits (AFS) for nuclear decommissioning or containment level 4 (CL4) microbiological work. Both sets of workers wear multiple layers of gloves for protection and to accommodate decontamination procedures. Nuclear workers are also often required to wear cut-resistant gloves as an extra layer of protection. A total of 15 subjects volunteered to take part in manual dexterity testing of the different gloving systems. The subjects' hands were measured to ensure that the appropriate sized gloves were used. The gloves were tested with the subjects wearing the complete clothing ensembles appropriate to the work, using a combination of standard dexterity tests: the nine-hole peg test; a pin test adapted from the European Standard for protective gloves, the Purdue Pegboard test, and the Minnesota turning test. Specialized tests such as a hand tool test were used to test nuclear gloves, and laboratory-type manipulation tasks were used to test CL4 gloves. Subjective assessments of temperature sensation and skin wettedness were made before and after the dexterity tests of the nuclear gloves only. During all assessments, we made observations and questioned the subjects about ergonomic issues related to the clothing ensembles. Overall, the results show that the greater the thickness of the gloves and the number of layers the more the levels of manual dexterity performance are degraded. The nuclear cut-resistant gloves with the worst level of dexterity were stiff and inflexible and the subjects experienced problems picking up small items and bending their hands. The work also highlighted other factors that affect manual dexterity performance, including proper sizing, interactions with the other garments worn at the time, and the work equipment in use. In conclusion, when

  2. Oracle E-Business Suite Financials R12 A Functionality Guide

    CERN Document Server

    Iyer, Mohan

    2012-01-01

    This is a step-by-step functional guide to get you started easily with Oracle EBS Financials. If you are an Oracle E-Business Suite Financials consultant or an administrator looking for a quick review of the capabilities of Oracle E-Business Suite and ways to improve your use of the system's functionality, then this is the best guide for you. This book assumes that you have a fundamental knowledge of the EBS Suite.

  3. Conservation biology for suites of species: Demographic modeling for Pacific island kingfishers

    Science.gov (United States)

    Kesler, D.C.; Haig, S.M.

    2007-01-01

    Conservation practitioners frequently extrapolate data from single-species investigations when managing critically endangered populations. However, few researchers initiate work with the intent of making findings useful to conservation efforts for other species. We presented and explored the concept of conducting conservation-oriented research for suites of geographically separated populations with similar natural histories, resource needs, and extinction threats. An example was provided in the form of an investigation into the population demography of endangered Micronesian kingfishers (Todiramphus cinnamominus). We provided the first demographic parameter estimates for any of the 12 endangered Pacific Todiramphus species, and used results to develop a population projection matrix model for management throughout the insular Pacific. Further, we used the model for elasticity and simulation analyses with demographic values that randomly varied across ranges that might characterize congener populations. Results from elasticity and simulation analyses indicated that changes in breeding adult survival exerted the greatest magnitude of influence on population dynamics. However, changes in nestling survival were more consistently correlated with population dynamics as demographic rates were randomly altered. We concluded that conservation practitioners working with endangered Pacific kingfishers should primarily focus efforts on factors affecting nestling and breeder survival, and secondarily address fledgling juveniles and helpers. Further, we described how the generalized base model might be changed to focus on individual populations and discussed the potential application of multi-species models to other conservation situations. © 2007 Elsevier Ltd. All rights reserved.
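
    The elasticity analysis described above follows the standard matrix recipe: with stable stage distribution w, reproductive values v, and dominant eigenvalue lambda, sensitivities are outer(v, w)/(v . w) and elasticities scale those by a_ij/lambda. The three-stage projection matrix below is invented for illustration, not the paper's estimated rates.

        import numpy as np

        A = np.array([[0.0, 0.2, 0.9],    # stage-specific fecundities
                      [0.4, 0.0, 0.0],    # nestling survival into the helper stage
                      [0.0, 0.5, 0.8]])   # helper recruitment and breeder survival

        vals, W = np.linalg.eig(A)
        i = np.argmax(vals.real)
        lam = vals.real[i]                # population growth rate lambda
        w = np.abs(W[:, i].real)          # stable stage distribution
        vals_t, V = np.linalg.eig(A.T)
        v = np.abs(V[:, np.argmax(vals_t.real)].real)   # reproductive values

        sens = np.outer(v, w) / (v @ w)   # sensitivities d(lambda)/d(a_ij)
        elas = sens * A / lam             # elasticities; nonzero entries sum to 1
        print(f"lambda = {lam:.3f}")
        print(np.round(elas, 3))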

  4. The long-term Global LAnd Surface Satellite (GLASS) product suite and applications

    Science.gov (United States)

    Liang, S.

    2015-12-01

    Our Earth's environment is experiencing rapid changes due to natural variability and human activities. To monitor, understand, and predict environmental changes to meet economic, social, and environmental needs, use of long-term high-quality satellite data products is critical. The Global LAnd Surface Satellite (GLASS) product suite, generated at Beijing Normal University, currently includes 12 products: leaf area index (LAI), broadband shortwave albedo, broadband longwave emissivity, downwelling shortwave radiation and photosynthetically active radiation, land surface skin temperature, longwave net radiation, daytime all-wave net radiation, fraction of photosynthetically active radiation absorbed by green vegetation (FAPAR), fraction of green vegetation coverage, gross primary productivity (GPP), and evapotranspiration (ET). Most products span 1981-2014. The algorithms for producing these products have been published in the top remote sensing journals and books, and a growing number of applications have been reported in the scientific literature. The GLASS products are freely available from the Center for Global Change Data Processing and Analysis of Beijing Normal University (http://www.bnu-datacenter.com/) and the University of Maryland Global Land Cover Facility (http://glcf.umd.edu). After briefly introducing the basic characteristics of the GLASS products, we will present some applications on the long-term environmental changes detected from GLASS products at both global and local scales. Detailed analysis of regional hotspots, such as Greenland, the Tibetan plateau, and northern China, will be emphasized, where environmental changes have been mainly associated with climate warming, drought, land-atmosphere interactions, and human activities.

  5. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    Science.gov (United States)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time-consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose in a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and the body surface at reduced time and cost.
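
    As a rough illustration of the statistical machinery such a tool rests on, the sketch below builds a principal-component shape model from a matrix of aligned, point-corresponded body scans and synthesizes new body shapes from mode coefficients. The array layout and function names are invented for illustration; this is not NASA's actual pipeline.

      import numpy as np

      def build_shape_model(scans: np.ndarray, n_modes: int = 10):
          """scans: (n_subjects, n_vertices*3) aligned, corresponded vertex coords."""
          mean = scans.mean(axis=0)
          # SVD of the centered data yields the principal modes of shape variation
          _, s, vt = np.linalg.svd(scans - mean, full_matrices=False)
          modes = vt[:n_modes]                           # (n_modes, n_vertices*3)
          stddev = s[:n_modes] / np.sqrt(len(scans) - 1) # per-mode standard deviation
          return mean, modes, stddev

      def synthesize(mean, modes, stddev, coeffs):
          """New body shape from per-mode coefficients given in standard deviations."""
          c = np.asarray(coeffs, dtype=float)
          return mean + (c * stddev[:len(c)]) @ modes[:len(c)]

    Sweeping the coefficients over a target population's range gives the family of body shapes against which suit clearance and contact volume can then be checked in CAD.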

  6. A Method for and Issues Associated with the Determination of Space Suit Joint Requirements

    Science.gov (United States)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    In the design of a new space suit, it is necessary to have requirements that define what mobility space suit joints should be capable of achieving at both the system and component levels. NASA elected to divide mobility into its constituent parts, range of motion (ROM) and torque, in an effort to develop clean design requirements that limit subject performance bias and are easily verified. Unfortunately, such mobility measurements can be difficult to obtain. Current technologies, such as the Vicon motion capture system, allow for relatively easy benchmarking of ROM for a wide array of space suit systems; these evaluations require suited subjects in order to accurately capture the ranges humans can achieve in the suit. When it comes to torque, however, there are significant challenges both for benchmarking current performance and for writing requirements for future suits. This is reflected in the fact that torque definitions have been applied to very few types of space suits, and with limited success in defining all the joints accurately. This paper discusses the advantages and disadvantages of historical joint torque evaluation methods, describes more recent efforts directed at benchmarking joint torques of prototype space suits, and provides an outline for how NASA intends to address joint torque in design requirements for the Constellation Space Suit System (CSSS).

  7. A comparison of global rating scale and checklist scores in the validation of an evaluation tool to assess performance in the resuscitation of critically ill patients during simulated emergencies (abbreviated as "CRM simulator study IB").

    Science.gov (United States)

    Kim, John; Neilipovitz, David; Cardinal, Pierre; Chiu, Michelle

    2009-01-01

    Crisis resource management (CRM) skills are a set of nonmedical skills required to manage medical emergencies. There is currently no gold standard for evaluation of CRM performance. A prior study examined the use of a global rating scale (GRS) to evaluate CRM performance. The current study compared the use of a GRS and a checklist as formal rating instruments to evaluate CRM performance during simulated emergencies. First-year and third-year residents each participated in two simulator scenarios. Three raters then evaluated resident CRM performance from edited video recordings using both a GRS and a checklist. The Ottawa GRS provides a seven-point anchored ordinal scale for performance in five categories of CRM, plus an overall performance score. The Ottawa CRM checklist provides 12 items in the five categories of CRM, with a maximum cumulative score of 30 points. Construct validity was assessed on the basis of content validity, response process, internal structure, and response to other variables. T-test analysis of Ottawa GRS scores was conducted to examine response to the variable of level of training. Intraclass correlation coefficient (ICC) scores were used to measure inter-rater reliability for both scenarios. Thirty-two first-year and 28 third-year residents participated in the study. Third-year residents produced higher mean scores than first-year residents for overall CRM performance on both the Ottawa GRS and the CRM checklist. Users indicated a strong preference for the Ottawa GRS given its ease of scoring, the presence of an overall score, and the potential for formative evaluation. Construct validity seems to be present when using both the Ottawa GRS and the CRM checklist to evaluate CRM performance during simulated emergencies. The data also indicate moderate inter-rater reliability when using both the Ottawa GRS and the CRM checklist.
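
    For context, inter-rater reliability of the kind reported here is typically computed as an intraclass correlation from a two-way ANOVA decomposition. The sketch below implements the single-measure, two-way random-effects form, ICC(2,1); the score matrix is a toy example, not the study's data.

      import numpy as np

      def icc_2_1(scores: np.ndarray) -> float:
          """scores: (n_subjects, n_raters) matrix of ratings."""
          n, k = scores.shape
          grand = scores.mean()
          subj_means = scores.mean(axis=1)
          rater_means = scores.mean(axis=0)
          # Two-way ANOVA mean squares: subjects, raters, residual
          msr = k * np.sum((subj_means - grand) ** 2) / (n - 1)
          msc = n * np.sum((rater_means - grand) ** 2) / (k - 1)
          sse = np.sum((scores - subj_means[:, None] - rater_means[None, :] + grand) ** 2)
          mse = sse / ((n - 1) * (k - 1))
          # Shrout & Fleiss ICC(2,1): two-way random effects, absolute agreement
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      scores = np.array([[5, 6, 5], [3, 4, 4], [6, 7, 6], [2, 2, 3]], dtype=float)
      print(f"ICC(2,1) = {icc_2_1(scores):.2f}")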

  8. Characterization of the Radiation Shielding Properties of US and Russian EVA Suits

    International Nuclear Information System (INIS)

    Benton, E.R.; Benton, E.V.; Frank, A.L.

    2001-01-01

    Reported herein are results from the Eril Research, Inc. (ERI) participation in the NASA Johnson Space Center sponsored study characterizing the radiation shielding properties of the two types of space suit that astronauts wear during EVA on-orbit assembly of the International Space Station (ISS). Measurements using passive detectors were carried out to assess the shielding properties of the US EMU suit and the Russian Orlan-M suit during irradiations of the suits and a tissue-equivalent phantom with monoenergetic proton and electron beams at the Loma Linda University Medical Center (LLUMC). During irradiations with 6 MeV electrons and 60 MeV protons, absorbed dose as a function of depth was measured using TLDs exposed behind swatches of the two suit materials and inside the two EVA helmets. Considerable reduction in electron dose was measured behind all suit materials in exposures to 6 MeV electrons. Slowing of the proton beam in the suit materials led to an increase in dose measured in exposures to 60 MeV protons. During 232 MeV proton irradiations, measurements were made with TLDs and CR-39 PNTDs at five organ locations inside a tissue-equivalent phantom, exposed both with and without the two EVA suits. The EVA helmets produce a 13 to 27 percent reduction in total dose and a 0 to 25 percent reduction in dose equivalent when compared to measurements made in the phantom head alone. Differences in dose and dose equivalent between the suit and non-suit irradiations for the lower portions of the two EVA suits tended to be smaller. Proton-induced target fragmentation was found to be a significant source of increased dose equivalent, especially within the two EVA helmets, and the average quality factor inside the EMU and Orlan-M helmets was 2 to 14 percent greater than that measured in the bare phantom head.

  9. Validation of the COBRA-TF Critical Heat Flux Model against Post-Dryout Experiments Performed by the Royal Institute of Technology (KTH)

    Energy Technology Data Exchange (ETDEWEB)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.

    2014-07-01

    This work presents a validation of the different existing correlations implemented in the CTF code for predicting the value and location of the critical heat flux (CHF), applying them to the post-dryout experiments conducted by the Royal Institute of Technology (KTH) in Stockholm, Sweden. (Author)
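
    The validation exercise itself reduces to comparing each correlation's predicted CHF against the measurements and summarizing the errors. The sketch below shows that comparison loop in outline; the correlation callables and data structures are placeholders for illustration, not CTF internals.

      import numpy as np

      def validate_chf(correlations, conditions, measured_chf):
          """correlations: {name: callable(**condition) -> predicted CHF, kW/m2}.
          conditions: list of dicts of local flow conditions per data point."""
          for name, corr in correlations.items():
              predicted = np.array([corr(**c) for c in conditions])
              ratio = predicted / measured_chf        # predicted-to-measured ratio
              print(f"{name}: mean P/M = {ratio.mean():.3f}, "
                    f"std = {ratio.std():.3f}, "
                    f"RMS error = {np.sqrt(((ratio - 1) ** 2).mean()):.3f}")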

  10. Gamma-ray spectrometry of granitic suites of the Paranaguá Terrane, Southern Brazil

    Science.gov (United States)

    Weihermann, Jessica Derkacz; Ferreira, Francisco José Fonseca; Cury, Leonardo Fadel; da Silveira, Claudinei Taborda

    2016-09-01

    The Paranaguá Terrane, located in the coastal portion of the states of Santa Catarina, Paraná, and São Paulo in Southern Brazil, is a crustal segment constituted mainly by an igneous complex, with a variety of granitic rocks inserted into the Serra do Mar ridge. The average altitude is approximately 1200 m above sea level, with peaks of up to 1800 m. Due to the difficulty of accessing the area, a shortage of outcrops, and the thick weathering mantle, this terrane is understudied. This research aims to evaluate the gamma-ray spectrometry data of the granitic suites of the Paranaguá Terrane in conjunction with the geological, petrographical, lithogeochemical, relief, and mass movement information available in the literature. Aerogeophysical data were acquired along north-south lines spaced at 500 m, with a mean terrain clearance of 100 m. These data cover potassium (K, %), equivalent thorium (eTh, ppm), and equivalent uranium (eU, ppm). After a critical analysis of the data, basic (K, eU, eTh) and ternary (R-K/G-eTh/B-eU) maps were generated and then superimposed on the digital elevation model (DEM). The investigation of radionuclide mobility across the relief and weathering mantle consisted of an analysis of schematic profiles of elevation related to each radionuclide, a comparison of the K, eU, and eTh maps with their 3D correspondents, and the study of mass movements registered in the region. A statistical comparison of lithogeochemical (K, U, Th) and geophysical (K, eU, eTh) data showed consistency across all the granitic suites studied (Morro Inglês, Rio do Poço, and Canavieiras-Estrela). Through gamma-ray spectrometry, it was possible to establish relationships between scars from mass movements and the gamma-ray responses, as well as between radionuclide mobility and the relief, and to map the granitic bodies.
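
    The ternary map mentioned above is, computationally, just a three-channel composite: each radioelement grid is contrast-stretched and assigned to one color channel. A minimal sketch follows, with the percentile stretch chosen arbitrarily; the grids themselves are assumed inputs.

      import numpy as np

      def ternary_composite(k, eth, eu, clip_pct=(2, 98)):
          """Stack K/eTh/eU grids into an RGB image after a percentile stretch."""
          def stretch(band):
              lo, hi = np.nanpercentile(band, clip_pct)
              return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
          return np.dstack([stretch(k), stretch(eth), stretch(eu)])  # (rows, cols, 3)

    Draping the resulting RGB array over the DEM then lets radiometric domains be read against relief, as done in the study.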

  11. Safety Precautions and Operating Procedures in an (A)BSL-4 Laboratory: 1. Biosafety Level 4 Suit Laboratory Suite Entry and Exit Procedures.

    Science.gov (United States)

    Janosko, Krisztina; Holbrook, Michael R; Adams, Ricky; Barr, Jason; Bollinger, Laura; Newton, Je T'aime; Ntiforo, Corrie; Coe, Linda; Wada, Jiro; Pusl, Daniela; Jahrling, Peter B; Kuhn, Jens H; Lackemeyer, Matthew G

    2016-10-03

    Biosafety level 4 (BSL-4) suit laboratories are specifically designed to study high-consequence pathogens for which neither infection prophylaxes nor treatment options exist. The hallmarks of these laboratories are: custom-designed airtight doors, dedicated supply and exhaust airflow systems, a negative-pressure environment, and mandatory use of positive-pressure ("space") suits. The risk for laboratory specialists working with highly pathogenic agents is minimized through rigorous training and adherence to stringent safety protocols and standard operating procedures. Researchers perform the majority of their work in BSL-2 laboratories and switch to BSL-4 suit laboratories when work with a high-consequence pathogen is required. Collaborators and scientists considering BSL-4 projects should be aware of the challenges associated with BSL-4 research, both in terms of technical limitations in BSL-4 laboratory space and the increased duration of such experiments. Tasks such as entering and exiting the BSL-4 suit laboratories are considerably more complex and time-consuming compared to BSL-2 and BSL-3 laboratories. The focus of this particular article is to address basic biosafety concerns and describe the entrance and exit procedures for the BSL-4 laboratory at the NIH/NIAID Integrated Research Facility at Fort Detrick. Such procedures include checking the external systems that support the BSL-4 laboratory, inspecting and donning positive-pressure suits, entering the laboratory, moving through air-pressure-resistant doors, and connecting to air-supply hoses. We will also discuss moving within and exiting the BSL-4 suit laboratories, including using the chemical shower and removing and storing positive-pressure suits.

  12. The Role of KREEP in the Production of Mg-Suite Magmas and Its Influence on the Extent of Mg-Suite Magmatism in the Lunar Crust

    Science.gov (United States)

    Elardo, S. M.; Shearer, C. K.; McCubbin, F. M.

    2017-01-01

    The lunar magnesian suite, or Mg-suite, is a series of ancient plutonic rocks from the lunar crust. They have received a considerable amount of attention from lunar scientists since their discovery for three primary reasons: 1) their ages and geochemistry indicate they represent pristine magmatic samples that crystallized very soon after the formation of the Moon; 2) their ages often overlap with ages of the ferroan anorthosite (FAN) crust; and 3) planetary-scale processes are needed in formation models to account for their unique geochemical features. Taken as a whole, the Mg-suite samples, as magmatic cumulate rocks, approximate a fractional crystallization sequence in the low-pressure forsterite-anorthite-silica system, and thus these samples are generally thought to be derived from layered mafic intrusions which crystallized very slowly from magmas that intruded the anorthositic crust. However, no direct linkages have been established between different Mg-suite samples based either on field relationships or geochemistry. The model for the origin of the Mg-suite that best fits the limited available data is one where Mg-suite magmas form from melting of a hybrid cumulate package consisting of deep mantle dunite, crustal anorthosite, and KREEP (potassium-rare earth elements-phosphorus) at the base of the crust under the Procellarum KREEP Terrane (PKT). In this model, these three Lunar Magma Ocean (LMO) cumulate components are brought into close proximity by the cumulate overturn process. Deep mantle dunitic cumulates with an Mg number of approximately 90 rise to the base of the anorthositic crust due to their buoyancy relative to colder, denser Fe- and Ti-rich cumulates. This hybridized source rock melts to form Mg-suite magmas, saturated in Mg-rich olivine and anorthitic plagioclase, that have a substantial KREEP component.

  13. Magnetospheric Multiscale Instrument Suite Operations and Data System

    Science.gov (United States)

    Baker, D. N.; Riesberg, L.; Pankratz, C. K.; Panneton, R. S.; Giles, B. L.; Wilder, F. D.; Ergun, R. E.

    2016-03-01

    The four Magnetospheric Multiscale (MMS) spacecraft will collect a combined volume of ~100 gigabits per day of particle and field data. On average, only 4 gigabits of that volume can be transmitted to the ground. To maximize the scientific value of each transmitted data segment, MMS has developed the Science Operations Center (SOC) to manage science operations, instrument operations, and selection, downlink, distribution, and archiving of MMS science data sets. The SOC is managed by the Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colorado and serves as the primary point of contact for community participation in the mission. MMS instrument teams conduct their operations through the SOC, and utilize the SOC's Science Data Center (SDC) for data management and distribution. The SOC provides a single mission data archive for the housekeeping and science data, calibration data, ephemerides, attitude and other ancillary data needed to support the scientific use and interpretation. All levels of data products will reside at and be publicly disseminated from the SDC. Documentation and metadata describing data products, algorithms, instrument calibrations, validation, and data quality will be provided. Arguably, the most important innovation developed by the SOC is the MMS burst data management and selection system. With nested automation and "Scientist-in-the-Loop" (SITL) processes, these systems are designed to maximize the value of the burst data by prioritizing the data segments selected for transmission to the ground. This paper describes the MMS science operations approach, processes and data systems, including the burst system and the SITL concept.
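
    The prioritization problem at the heart of the burst system is a budgeted selection: each candidate segment carries a figure of merit, and only the highest-value segments that fit the daily downlink allocation are kept. The sketch below shows one simple greedy policy for that selection; segment sizes, figures of merit, and the policy itself are illustrative assumptions, not the actual MMS SOC algorithm.

      from dataclasses import dataclass

      @dataclass
      class Segment:
          seg_id: int
          size_gbit: float
          fom: float       # figure of merit assigned by automation or the SITL

      def select_for_downlink(segments, budget_gbit=4.0):
          """Greedily pick the highest-value-density segments within the budget."""
          chosen, used = [], 0.0
          for seg in sorted(segments, key=lambda s: s.fom / s.size_gbit, reverse=True):
              if used + seg.size_gbit <= budget_gbit:
                  chosen.append(seg)
                  used += seg.size_gbit
          return chosen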

  14. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases.
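
    In outline, a regression suite of this kind runs each test case through the simulator and compares the output against a stored golden result within tolerances. A minimal sketch of that comparison, with the case layout and tolerance values assumed for illustration:

      import numpy as np

      def check_case(result: np.ndarray, golden: np.ndarray,
                     rtol: float = 1e-4, atol: float = 1e-9) -> bool:
          """Pass if the new run reproduces the golden waveform sample-for-sample."""
          return result.shape == golden.shape and bool(
              np.allclose(result, golden, rtol=rtol, atol=atol))

      def run_suite(cases):
          """cases: iterable of (name, result_array, golden_array) tuples."""
          failures = [name for name, res, gold in cases if not check_case(res, gold)]
          print(f"{len(failures)} failure(s): {failures}" if failures
                else "all cases pass")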

  15. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001, 31-01-2017. To provide a flexible platform that facilitates the development and testing of automatic target recognition (ATR) algorithms, NSWC PCD has created the Modular Algorithm Testbed Suite (MATS).
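
    A modular algorithm testbed of this sort usually centers on a plugin registry: detection algorithms register under a name, and the harness can run any registered algorithm against a dataset. The sketch below shows that general pattern; the interfaces are invented for illustration and are not MATS's actual API.

      from typing import Callable, Dict

      ATR_REGISTRY: Dict[str, Callable] = {}

      def register(name: str):
          """Decorator that adds an algorithm to the testbed registry."""
          def deco(fn: Callable):
              ATR_REGISTRY[name] = fn
              return fn
          return deco

      @register("threshold_detector")
      def threshold_detector(image, thresh=0.5):
          """Toy detector: flag pixels above a threshold as target candidates."""
          return [(i, j) for i, row in enumerate(image)
                  for j, v in enumerate(row) if v > thresh]

      def run(name: str, image):
          """Run any registered algorithm on an input image."""
          return ATR_REGISTRY[name](image)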

  16. Surgical Space Suits Increase Particle and Microbiological Emission Rates in a Simulated Surgical Environment.

    Science.gov (United States)

    Vijaysegaran, Praveen; Knibbs, Luke D; Morawska, Lidia; Crawford, Ross W

    2018-05-01

    The role of space suits in the prevention of orthopedic prosthetic joint infection remains unclear. Recent evidence suggests that space suits may in fact contribute to increased infection rates, with bioaerosol emissions from space suits identified as a potential cause. This study aimed to compare the particle and microbiological emission rates (PER and MER) of space suits and standard surgical clothing. A comparison of emission rates between space suits and standard surgical clothing was performed in a simulated surgical environment during 5 separate experiments. Particle counts were analyzed with 2 separate particle counters capable of detecting particles between 0.1 and 20 μm. An Andersen impactor was used to sample bacteria, with culture counts performed at 24 and 48 hours. Four experiments consistently showed statistically significant increases in both PER and MER when space suits are used compared with standard surgical clothing. One experiment showed inconsistent results, with a trend toward increases in both PER and MER when space suits are used compared with standard surgical clothing. Space suits cause increased PER and MER compared with standard surgical clothing. This finding provides mechanistic evidence to support the increased prosthetic joint infection rates observed in clinical studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    Within the framework of the EU project PRATIQUE (KBBE-2007-212459, Enhancements of Pest Risk Analysis Techniques), a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three
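
    As a flavor of what such spread models compute, the toy example below implements the simplest common formulation, constant radial range expansion, in which the invaded radius grows linearly in time. It stands in for none of the four PRATIQUE models, whose actual code is distributed in R; all parameter values are arbitrary.

      import math

      def invaded_area_km2(years: float, r0_km: float = 5.0,
                           v_km_per_yr: float = 2.0) -> float:
          """Area enclosed by a front expanding at constant radial speed v."""
          radius = r0_km + v_km_per_yr * years
          return math.pi * radius ** 2

      print(f"area after 10 years: {invaded_area_km2(10):.0f} km^2")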

  18. Telemetry Standards, IRIG Standard 106-17, Chapter 22, Network Based Protocol Suite

    Science.gov (United States)

    2017-07-01

    Excerpt from Telemetry Standards, IRIG Standard 106-17, Chapter 22 (Network-Based Protocol Suite), July 2017. Section 22.2, Network Access Layer; 22.2.1, Physical Layer: connectors and cable media should meet the electrical or optical properties required by the...

  19. Fluid replacement advice during work in fully encapsulated impermeable chemical protective suits

    NARCIS (Netherlands)

    Rubenstein, C.D.; Hartog, E.A. den; Deaton, A.S.; Bogerd, C.P.; Kant, S. de

    2017-01-01

    A major concern for responders to hazardous materials (HazMat) incidents is the heat strain that is caused by fully encapsulated impermeable chemical protective suits. In a research project, funded by the US Department of Defense, the thermal strain experienced when wearing these suits was studied.

  20. DEMON/ANGEL - A SUITE OF PROGRAMS TO CARRY OUT DENSITY MODIFICATION

    NARCIS (Netherlands)

    VELLIEUX, FMDAP; HUNT, JF; ROY, S; READ, RJ

    1995-01-01

    The DEMON/ANGEL suite of computer programs has been developed to carry out density modification by non-crystallographic symmetry-averaging, solvent-flattening and histogram-mapping techniques. This suite consists of programs that allow molecular envelopes to be defined and modified,
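
    To make the density-modification idea concrete, the sketch below shows the solvent-flattening step in isolation: density outside the molecular envelope is reset toward the mean solvent level before the map is recombined with phase information. The grid, the boolean mask, and the use of a plain solvent mean are toy assumptions, not DEMON/ANGEL's implementation.

      import numpy as np

      def solvent_flatten(rho: np.ndarray, protein_mask: np.ndarray) -> np.ndarray:
          """Reset density outside the molecular envelope to the mean solvent level.

          rho: electron density on a 3D grid.
          protein_mask: boolean grid, True inside the molecular envelope.
          """
          flat = rho.copy()
          solvent = ~protein_mask
          flat[solvent] = rho[solvent].mean()
          return flat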