WorldWideScience

Sample records for rate testing methodology

  1. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    A preloading technique was used as a means of accelerated testing in constant stress-rate (dynamic fatigue) testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing of glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates at which a series of mechanisms act simultaneously on material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, a SiC whisker-reinforced silicon nitride composite and 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can serve as an additional tool to pinpoint the dominant failure mechanism associated with such a considerable strength increase or decrease.

  2. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    The paper describes a methodology for developing new test methods and for forming solutions for their development. The basis of the methodology is formed by individual elements of the system and process approaches, which contribute to an effective research strategy for the object under test, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations and their mutual influence; this allows the tasks set to be solved and the goal to be achieved. The methodology is based on the use of fuzzy cognitive maps, and the choice of the method on which the model for forming solutions is based is considered. The methodology provides for recording a model of a new test method as a finite set of objects representing characteristics significant for the test method; a causal relationship is then established between the objects, after which values are set for the fitness and observability indicators of the method and for the metrological tolerance on each indicator. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.

  3. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

    Current methodologies for information systems penetration testing focus mainly on high-level and technical descriptions of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests, which often results in tests that are badly planned and managed, with the vulnerabilities found remediated unsystematically. The goal of this article is to present a new methodology, called PETA, focused mainly on the management of penetration tests. Development of this methodology was based on a comparative analysis of current methodologies. The new methodology incorporates current best practices of IT governance and project management, represented by COBIT and PRINCE2 principles. The presented methodology has been quantitatively evaluated.

  4. Progress on qualification testing methodology study of electric cables

    International Nuclear Information System (INIS)

    Yoshida, K.; Seguchi, T.; Okada, S.; Ito, M.; Kusama, Y.; Yagi, T.; Yoshikawa, M.

    1983-01-01

    Many instrumentation, control and power cables are installed in nuclear power plants, and these cables contain a large amount of organic polymer as insulating and jacketing material. When a LOCA occurs they are exposed simultaneously to radiation at high dose rate, steam at high temperature and chemical (or water) spray, and under such conditions the polymers tend to lose their original properties. For reactor safety, the cables should remain functional even if they are subjected to a loss-of-coolant accident (LOCA) at the end of their intended service life. In Japan, cable manufacturers qualify their cables according to the proposed test standard issued by the IEEJ in 1982, but the standard still has many unsolved problems and uncertainties, which have been dealt with tentatively through manufacturer-user agreements. The objectives of this research are to study methodologies for qualification testing of electric wires and cables, and to provide improved technical bases for modification of the standard. Research activities are divided into accident (LOCA) testing methodology and accelerated aging methodology.

  5. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  7. The Impact of the Rating Agencies' Through-the-cycle Methodology on Rating Dynamics

    NARCIS (Netherlands)

    Altman, E.I.; Rijken, H.A.

    2005-01-01

    Surveys on the use of agency credit ratings reveal that some investors believe that credit-rating agencies are relatively slow in adjusting their ratings. A well-accepted explanation for this perception on rating timeliness is the through-the-cycle methodology that agencies use. Through-the-cycle

  8. A proposed quantitative credit-rating methodology for South African provincial departments

    Directory of Open Access Journals (Sweden)

    Erika Fourie

    2016-05-01

    The development of subnational credit-rating methodologies affords benefits for subnationals, the sovereign and its citizens. Trusted credit ratings facilitate access to financial markets, and above-average ratings allow for the negotiation of better collateral and guarantee agreements, as well as for funding of, for example, infrastructure projects at superior (lower) interest rates. This paper develops the quantitative section of a credit-rating methodology for South African subnationals. The unique characteristics of South African data, their assembly, and the selection of dependent and independent variables for the linear-regression model chosen are discussed. The methodology is then applied to the provincial Department of Health using linear regression modelling.

  9. A proposed quantitative credit-rating methodology for South African provincial departments

    OpenAIRE

    Erika Fourie; Tanja Verster; Gary Wayne van Vuuren

    2016-01-01

    The development of subnational credit-rating methodologies affords benefits for subnationals, the sovereign and its citizens. Trusted credit ratings facilitate access to financial markets and above-average ratings allow for the negotiation of better collateral and guarantee agreements, as well as for funding of, for example, infrastructure projects at superior (lower) interest rates. This paper develops the quantitative section of a credit-rating methodology for South African subnationals. Th...

  10. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident-initiating events and the fault modeling of systems, including common-mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks due to a loss of primary coolant flow in the Engineering Test Reactor.

  11. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    Science.gov (United States)

    Manoogian, Sarah; Mcnally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research of this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used for calculating stress strain curves of the perpendicular loading axes. Results for this methodology show images of a tissue specimen loaded and a finite verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains combined with data reduction to resolve the stresses in two directions provides the information necessary to develop a three dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.

  12. Methodology for testing metal detectors using variables test data

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, D.D.; Murray, D.W.

    1993-08-01

    By extracting and analyzing measurement (variables) data from portal metal detectors whenever possible, instead of the more typical "alarm"/"no-alarm" (attributes or binomial) data, we can be more informed about metal detector health with fewer tests. The testing methodology discussed in this report is an alternative to the typical binomial testing and in many ways is far superior.
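
    As a side note on the variables-versus-attributes point above: the toy simulation below is an illustration only, not the report's procedure; the detector model, threshold and numbers are invented. It shows why analyzing measured signal levels reveals degradation with far fewer trials than counting alarms.

        import numpy as np

        # Hypothetical portal detector: internal signal ~ Normal(mean, sigma); it alarms
        # when the signal exceeds a threshold. Degradation lowers the mean signal.
        rng = np.random.default_rng(0)
        threshold = 1.0
        healthy_mean, degraded_mean, sigma = 1.3, 1.15, 0.15

        n_trials = 20
        signals = rng.normal(degraded_mean, sigma, n_trials)

        # Attributes (binomial) view: only alarm / no-alarm counts are kept.
        alarms = int(np.sum(signals > threshold))
        print(f"attributes view: {alarms}/{n_trials} alarms (a healthy unit alarms ~98% of the time)")

        # Variables view: the same 20 trials, but the measured signal levels are analyzed.
        mean = signals.mean()
        sem = signals.std(ddof=1) / np.sqrt(n_trials)
        t_stat = (mean - healthy_mean) / sem
        print(f"variables view: mean signal {mean:.2f} vs healthy {healthy_mean:.2f} (t = {t_stat:.1f})")
        # The one-sigma drop in mean signal is obvious from 20 measurements,
        # while the alarm count alone may still look acceptable.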

  13. 18 CFR 342.4 - Other rate changing methodologies.

    Science.gov (United States)

    2010-04-01

    18 CFR 342.4 (Conservation of Power and Water Resources, Federal Energy Regulatory Commission): Other rate changing methodologies. ... regard to the applicable ceiling level under § 342.3. (b) Market-based rates. A carrier may attempt to...

  14. 42 CFR 413.196 - Notification of changes in rate-setting methodologies and payment rates.

    Science.gov (United States)

    2010-10-01

    42 CFR 413.196 (Public Health; Payment for End-Stage Renal Disease (ESRD) Services and Organ Procurement Costs): Notification of changes in rate-setting methodologies and payment rates. Link to an amendment...

  15. Proposed Objective Odor Control Test Methodology for Waste Containment

    Science.gov (United States)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 PPM, and a 15 PPB limit of quantitation.

  16. Nondestructive Semistatic Testing Methodology for Assessing Fish Textural Characteristics via Closed-Form Mathematical Expressions

    Directory of Open Access Journals (Sweden)

    D. Dimogianopoulos

    2017-01-01

    This paper presents a novel methodology based on semistatic nondestructive testing of fish for the analytical computation of its textural characteristics via closed-form mathematical expressions. The novelty is that, unlike alternatives, explicit values for both stiffness and viscoelastic textural attributes may be computed, even if fish of different size/weight are tested. Furthermore, the testing procedure may be adapted to the specifications (sampling rate and accuracy) of the available equipment. The experimental testing involves a fish placed on the pan of a digital weigh scale, which is subsequently tested with a ramp-like load profile in a custom-made installation. The ramp slope is (to some extent) adjustable according to the specifications (sampling rate and accuracy) of the equipment. The scale's reaction to fish loading, namely the reactive force, is collected throughout time and is shown to depend on the fish textural attributes according to a closed-form mathematical formula. The latter is subsequently used along with the collected data in order to compute these attributes rapidly and effectively. Four whole raw sea bass (Dicentrarchus labrax) of various sizes and textures were tested. Changes in texture, related to different viscoelastic characteristics among the four fish, were correctly detected and quantified using the proposed methodology.

  17. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since a malfunction of the RPS in accident cases can cause fatal damage, it is classified as Safety Class 1E. We therefore propose an effective testing methodology to assure the reliability of the embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structural source program, and the second is a level-of-testing step composed of unit testing, integration testing and system testing. At each testing step we test the embedded software with selected test cases after a test item identification step. Using this testing methodology, the embedded software can be tested effectively while reducing cost and time.

  18. Effective Rating Scale Development for Speaking Tests: Performance Decision Trees

    Science.gov (United States)

    Fulcher, Glenn; Davidson, Fred; Kemp, Jenny

    2011-01-01

    Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…

  19. A methodology of SiP testing based on boundary scan

    Science.gov (United States)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

    System in Package (SiP) plays an important role in portable, aerospace and military electronics owing to its microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and malfunction location as the system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems with the boundary scan theory of PCB circuits and embedded core test, the specific testing methodology and process are proposed. The hardware requirements of the SiP system under test are provided, and the hardware platform for the testing has been constructed. The testing methodology offers high test efficiency and accurate malfunction location.

  20. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    Science.gov (United States)

    2016-05-01

    ARL-TN-0756, May 2016, US Army Research Laboratory: Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation, by Clayton M Weiss (Oak Ridge Institute for Science and Education (ORISE), Belcamp, MD) and Parimal J Patel (Weapons and Materials Research Directorate, ARL). Approved for public release; distribution is ...

  1. Testing Methodology in the Student Learning Process

    Science.gov (United States)

    Gorbunova, Tatiana N.

    2017-01-01

    The subject of the research is building methodologies to evaluate student knowledge by testing. The author points to the importance of feedback on the level of mastery in the learning process, with testing considered as a tool. The object of the study is to create test system models for defence practice problems. Special attention is paid…

  2. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  3. Determination of area reduction rate by continuous ball indentation test

    International Nuclear Information System (INIS)

    Zou, Bin; Guan, Kai Shu; Wu, Sheng Bao

    2016-01-01

    The rate of area reduction is an important mechanical property for appraising the plasticity of metals, and is always obtained from the uniaxial tensile test. A methodology is proposed to determine the area reduction rate by the continuous ball indentation test technique. Continuum damage accumulation theory is adopted in this work to identify the failure point in the indentation; the corresponding indentation depth at this point can be obtained and used to estimate the area reduction rate. The local strain limit criterion proposed in the ASME VIII-2 2007 alternative rules is also adopted to convert the multiaxial strain of the indentation test to the uniaxial strain of the tensile test. The pile-up and sink-in phenomena, which can affect the result significantly, are also discussed in this paper. Owing to the non-destructive nature of the ball indentation test, the method can be useful in engineering practice for evaluating material degradation under severe working conditions. To validate the method, continuous ball indentation tests were performed on ferritic steel 16MnR and ASTM A193 B16, and the results are compared with those obtained from the traditional uniaxial tensile test.

  4. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    Contributed to the 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, March 1-5, 2010). Recently, biometrics has been used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is not a specific methodology for testing this influence at the moment...

  5. Covariance methodology applied to uncertainties in I-126 disintegration rate measurements

    International Nuclear Information System (INIS)

    Fonseca, K.A.; Koskinas, M.F.; Dias, M.S.

    1996-01-01

    The covariance methodology applied to uncertainties in I-126 disintegration-rate measurements is described. Two different coincidence systems were used because of the complex decay scheme of this radionuclide. The parameters involved in the determination of the disintegration rate in each experimental system present correlated components. In this case, the conventional statistical methods for determining the uncertainties (law of propagation) result in wrong values for the final uncertainty, so use of the covariance-matrix methodology is necessary. The data from both systems were combined taking into account all possible correlations between the partial uncertainties. (orig.)
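
    As a generic illustration of the covariance approach this record describes (not the authors' code; the partial results, uncertainties, weights and correlation below are assumed values), the sketch combines two correlated partial results and propagates their uncertainty through a covariance matrix rather than by the simple law of propagation.

        import numpy as np

        # Two partial disintegration-rate results (Bq) whose uncertainties share
        # correlated components (e.g., common efficiency terms). Values are illustrative.
        x = np.array([5.432e5, 5.418e5])
        u = np.array([0.8e3, 1.1e3])
        rho = 0.6  # assumed correlation between the partial uncertainties

        cov = np.array([[u[0]**2, rho * u[0] * u[1]],
                        [rho * u[0] * u[1], u[1]**2]])

        w = np.array([0.5, 0.5])          # weights of the combination (sensitivity coefficients)
        rate = w @ x                      # combined disintegration rate
        u_c = np.sqrt(w @ cov @ w)        # u_c^2 = w^T . Sigma . w

        print(f"combined rate = {rate:.0f} Bq, combined standard uncertainty = {u_c:.0f} Bq")
        # Setting rho = 0 (ignoring the correlation) would understate u_c here.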

  6. Equipment qualification testing methodology research at Sandia Laboratories

    International Nuclear Information System (INIS)

    Jeppesen, D.

    1983-01-01

    The Equipment Qualification Research Testing (EQRT) program is an evolutionary outgrowth of the Qualification Testing Evaluation (QTE) program at Sandia. The primary emphasis of the program has been qualification methodology research. The EQRT program offers the industry a research-oriented perspective on qualification-related component performance, as well as refinements to component testing standards that are based upon actual component testing research.

  7. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission, and each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  8. The Menopause Rating Scale (MRS) scale: A methodological review

    Directory of Open Access Journals (Sweden)

    Strelow Frank

    2004-09-01

    Background: This paper compiles data from different sources to obtain a first comprehensive picture of psychometric and other methodological characteristics of the Menopause Rating Scale (MRS). The scale was designed and standardized as a self-administered scale (a) to assess symptoms/complaints of aging women under different conditions, (b) to evaluate the severity of symptoms over time, and (c) to measure changes pre- and post-menopause replacement therapy. The scale has become widely used (available in 10 languages). Method: A large multinational survey (9 countries on 4 continents) from 2001/2002 is the basis for in-depth analyses of the reliability and validity of the MRS. Additional small convenience samples were used to get first impressions of test-retest reliability. The data were centrally analyzed, and data from a postmarketing HRT study were used to estimate discriminative validity. Results: Reliability measures (consistency and test-retest stability) were found to be good across countries, although the sample size for test-retest reliability was small. Validity: the internal structure of the MRS was astonishingly similar across countries, supporting the conclusion that the scale really measures the same phenomenon in symptomatic women. The sub-score and total-score correlations were high (0.7–0.9) but lower among the sub-scales (0.5–0.7), which suggests that the subscales are not fully independent. Norm values from different populations were presented, showing that a direct comparison between Europe and North America is possible, but caution is recommended when comparing data from Latin America and Indonesia; this will not, however, affect intra-individual comparisons within clinical trials. The comparison with the Kupperman Index showed sufficiently good correlations, illustrating adept criterion-oriented validity. The same is true for the comparison with the generic quality-of-life scale SF-36, where also a sufficiently close association

  9. Probabilistic fatigue life prediction methodology for notched components based on simple smooth fatigue tests

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Z. R.; Li, Z. X. [Dept.of Engineering Mechanics, Jiangsu Key Laboratory of Engineering Mechanics, Southeast University, Nanjing (China); Hu, X. T.; Xin, P. P.; Song, Y. D. [State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing University of Aeronautics and Astronautics, Nanjing (China)

    2017-01-15

    A methodology for probabilistic fatigue life prediction of notched components based on smooth specimens is presented. Weakest-link theory incorporating the Walker strain model is utilized in this approach, and the effects of stress ratio and stress gradient are considered. The Weibull distribution and the median rank estimator are used to describe the fatigue statistics. Fatigue tests under different stress ratios were conducted on smooth and notched specimens of titanium alloy TC-1-1. The proposed procedures were checked against the test data of TC-1-1 notched specimens; prediction results at 50% survival rate all fall within a factor-of-two scatter band of the test results.
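
    To illustrate the Weibull and median-rank machinery mentioned above (a sketch only; the fatigue lives below are invented, not the TC-1-1 data), the snippet fits a two-parameter Weibull distribution to a small sample and evaluates the life at 50% survival probability.

        import numpy as np

        # Invented fatigue lives (cycles) from smooth-specimen tests at one stress level.
        lives = np.sort(np.array([5.2e4, 7.8e4, 9.1e4, 1.3e5, 1.9e5, 2.4e5]))
        n = len(lives)
        ranks = np.arange(1, n + 1)

        # Median rank (Benard's approximation) as the empirical failure probability.
        F = (ranks - 0.3) / (n + 0.4)

        # Two-parameter Weibull fit via linear regression on ln(-ln(1-F)) vs ln(life).
        y = np.log(-np.log(1.0 - F))
        x = np.log(lives)
        beta, intercept = np.polyfit(x, y, 1)   # slope = shape parameter
        eta = np.exp(-intercept / beta)         # scale parameter (characteristic life)

        # Life at 50% survival probability (the level quoted in the record).
        life_50 = eta * (-np.log(0.5)) ** (1.0 / beta)
        print(f"beta = {beta:.2f}, eta = {eta:.3g} cycles, life at 50% survival = {life_50:.3g} cycles")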

  10. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in which assertions were used to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
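
    A minimal sketch of the assertion idea described above follows; the function, parameter names and limits are hypothetical and are not taken from the flight software studied.

        # Assertion-based dynamic testing: executable checks embedded in the code under test.
        def update_pitch(pitch_deg: float, pitch_rate_dps: float, dt: float) -> float:
            # Range assertions on individual parameters.
            assert -90.0 <= pitch_deg <= 90.0, "pitch outside physical range"
            assert abs(pitch_rate_dps) <= 20.0, "pitch rate exceeds design limit"
            assert dt > 0.0, "non-positive integration step"

            new_pitch = pitch_deg + pitch_rate_dps * dt

            # A relational assertion ties two parameters together; checks like this give the
            # indirect (collateral) coverage of other parameters noted in the record.
            assert abs(new_pitch - pitch_deg) <= 20.0 * dt + 1e-9, "update inconsistent with rate limit"
            return new_pitch

        print(update_pitch(2.0, 1.5, 0.02))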

  11. Development of a methodology for life cycle building energy ratings

    International Nuclear Information System (INIS)

    Hernandez, Patxi; Kenny, Paul

    2011-01-01

    Traditionally, the majority of building energy use has been linked to operation (heating, cooling, lighting, etc.), and much attention has been directed at reducing this energy use through technical innovation and regulatory control, assessed through a wide range of rating methods. However, buildings generally employ an increasing amount of materials and systems to reduce energy use in operation, and the energy embodied in these can constitute an important part of the building's life cycle energy use; for buildings with 'zero-energy' use in operation, the embodied energy is indeed the only life cycle energy use. This is not addressed by current building energy assessment and rating methods. This paper proposes a methodology to extend building energy assessment and rating methods to account for the embodied energy of building components and systems. The methodology is applied to the EU Building Energy Rating method as implemented in Irish domestic buildings, and a case study dwelling is used to illustrate the importance of embodied energy for life cycle energy performance, which is particularly relevant when energy use in operation tends to zero. The use of the Net Energy Ratio as an indicator to select appropriate building improvement measures is also presented and discussed. Highlights: the definitions of 'zero energy buildings' and current building energy ratings are examined; there is a need to integrate a life cycle perspective within building energy ratings; a life cycle building energy rating method (LC-BER), including embodied energy, is presented; the Net Energy Ratio is proposed as an indicator to select building energy improvement options.

  12. Aerospace Payloads Leak Test Methodology

    Science.gov (United States)

    Lvovsky, Oleg; Grayson, Cynthia M.

    2010-01-01

    Pressurized and sealed aerospace payloads can leak on orbit. When dealing with toxic or hazardous materials, requirements for fluid and gas leakage rates have to be properly established and, most importantly, reliably verified using the best nondestructive test (NDT) method available. Such verification can be implemented through the application of various leak test methods, which are the subject of this paper; the purpose is to show the approach taken by the National Aeronautics and Space Administration (NASA) to verifying payload leakage rate requirements. The scope of this paper is mostly a detailed description of the 14 recommended leak test methods.

  13. The influence of data collection rate, containment size and data smoothing on containment Integrated Leak Rate Tests

    International Nuclear Information System (INIS)

    Wagner, W.T.; Langan, J.P.; Norris, W.E.; Lurie, D.

    1988-01-01

    Phase I of a U.S. Nuclear Regulatory Commission contract investigated nuclear power plant Integrated Leak Rate Tests (ILRTs) using data gathered at many domestic and foreign ILRTs. The study evaluated ILRTs with the ANS criteria (in ANSI/ANS-56.8-1987) and the proposed extended ANS criteria (in draft Regulatory Guide, Task MS 021-5, October 1986). The study considered (1) the effects of data collection rates on ILRT conclusions, (2) a possible relationship between containment size, data collection rate and ILRT duration, (3) the impact of the proposed extended ANS methodology on ILRTs, and (4) the influence of data smoothing on ILRT data. The study was performed using 20 sets of Type A and 17 sets of verification data

  14. An atomistic methodology of energy release rate for graphene at nanoscale

    International Nuclear Information System (INIS)

    Zhang, Zhen; Lee, James D.; Wang, Xianqiao

    2014-01-01

    Graphene is a single layer of carbon atoms packed into a honeycomb architecture, serving as a fundamental building block for electronic devices. Understanding the fracture mechanism of graphene under various conditions is crucial for tailoring the electrical and mechanical properties of graphene-based devices at the atomic scale. Although most fracture mechanics concepts, such as stress intensity factors, are not applicable in molecular dynamics simulation, the energy release rate remains a feasible and crucial physical quantity for characterizing the fracture mechanical properties of materials at the nanoscale. This work introduces an atomistic simulation methodology, based on the energy release rate, as a tool to unveil the fracture mechanism of graphene at the nanoscale; the methodology can be easily extended to any atomistic material system. We have investigated both opening mode and mixed mode at different temperatures. Simulation results show that the critical energy release rate of graphene is independent of the initial crack length at low temperature. Graphene with an inclined pre-crack possesses higher fracture strength and fracture deformation but a smaller critical energy release rate than graphene with a vertical pre-crack. Owing to its anisotropy, graphene with armchair chirality always has a greater critical energy release rate than graphene with zigzag chirality. Increasing temperature reduces the fracture strength, fracture deformation, and critical energy release rate of graphene, and brings higher randomness of the energy release rate under a variety of predefined crack lengths. The energy release rate is independent of the strain rate as long as the strain rate is small enough.
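
    The energy release rate used in such studies can be estimated from a finite difference of the total potential energy with respect to crack area, G = -dU/dA. The sketch below only illustrates that bookkeeping step; the energies, crack lengths and thickness convention are assumed numbers, and the molecular dynamics relaxation that would produce the energies is not shown.

        # Finite-difference estimate of an energy release rate from two relaxed
        # configurations that differ only in crack length (fixed-grip conditions
        # assumed, so the potential energy drops as the crack extends).
        eV = 1.602176634e-19                     # J per eV
        U_short = -5389.22 * eV                  # potential energy at crack length a1 (made up)
        U_long = -5401.72 * eV                   # potential energy at crack length a2 (made up)
        a1, a2 = 2.0e-9, 2.5e-9                  # crack lengths (m)
        t = 3.35e-10                             # effective graphene thickness (m), a common convention

        dA = (a2 - a1) * t                       # crack-area increment
        G = -(U_long - U_short) / dA             # energy release rate, J/m^2
        print(f"G ~ {G:.1f} J/m^2")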

  15. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    Science.gov (United States)

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical
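
    For readers unfamiliar with the reliability statistic quoted above, the sketch below computes an ICC(2,1) (two-way random effects, absolute agreement, single rater) from a made-up ratings matrix; it is a textbook illustration, not the PsycBITE data, and not necessarily the exact ICC variant used in the study.

        import numpy as np

        # Rows = rated reports, columns = raters; scores are invented.
        ratings = np.array([[7, 8], [4, 5], [9, 9], [2, 3], [6, 6],
                            [10, 9], [5, 4], [8, 8], [3, 4], [7, 7]], dtype=float)
        n, k = ratings.shape
        grand = ratings.mean()

        ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)   # between targets
        ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)   # between raters
        ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols

        msr = ss_rows / (n - 1)
        msc = ss_cols / (k - 1)
        mse = ss_err / ((n - 1) * (k - 1))

        icc_2_1 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
        print(f"ICC(2,1) = {icc_2_1:.2f}")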

  16. Evaluation and testing methodology for evolving entertainment systems

    NARCIS (Netherlands)

    Jurgelionis, A.; Bellotti, F.; IJsselsteijn, W.A.; Kort, de Y.A.W.; Bernhaupt, R.; Tscheligi, M.

    2007-01-01

    This paper presents a testing and evaluation methodology for evolving pervasive gaming and multimedia systems. We introduce the Games@Large system, a complex gaming and multimedia architecture comprised of a multitude of elements: heterogeneous end user devices, wireless and wired network

  17. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    Science.gov (United States)

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

    Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities in a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to MAC measured in a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220 − age and 207 − 0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest-work HR was compared against concurrent work V̇O2 measurements while taking into account the HR thermal component. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Clothing insulation had no impact on predicted MAC or on the age-predicted maximum HR equations. Practitioner summary: this study sheds light on four practical methodological issues faced by practitioners regarding the use of the HR methodology to assess WM in actual work environments. More specifically, the effects of wearing work clothes and of using two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC are examined, as well as the accuracy of using an individual's step-test HR-to-workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
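
    The individual HR-to-workload calibration described above can be illustrated with a short sketch: fit a linear HR–V̇O2 relationship from step-test data, extrapolate it to an age-predicted maximum HR (the two equations quoted in the record), and apply the same line to work HR. The data points and age below are invented.

        import numpy as np

        hr = np.array([95.0, 110.0, 126.0, 141.0])   # steady-state HR at each step (bpm), invented
        vo2 = np.array([1.1, 1.5, 1.9, 2.3])         # measured oxygen uptake (L/min), invented

        slope, intercept = np.polyfit(hr, vo2, 1)    # individual linear HR-VO2 relationship

        age = 40
        for label, hr_max in (("220 - age", 220 - age), ("207 - 0.7*age", 207 - 0.7 * age)):
            mac = slope * hr_max + intercept         # predicted maximum aerobic capacity
            print(f"MAC using {label}: {mac:.2f} L/min")

        # Work metabolism is then predicted by applying the same line to HR recorded at
        # work (after removing any thermal component, as the record notes).
        work_hr = 118.0
        print(f"predicted work metabolism: {slope * work_hr + intercept:.2f} L/min")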

  18. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  19. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level, this scaling determines the effects various processes have on a state variable, and it ranks the processes in order of importance by the magnitude of the fractional change they cause in that state variable. At the system level, the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. Because the scaling reveals the fractional change of state variables on all levels, it is called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of a preserved quantity in a component, so the effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
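
    A small sketch of the effect metric defined above: for each process, omega is the transport rate divided by the component's inventory of the conserved quantity, and Omega = omega * t ranks the processes. The process names, rates, inventory and characteristic time below are hypothetical.

        processes = {
            # name: (transport rate [kg/s], component inventory [kg]) -- hypothetical values
            "break flow":         (250.0, 2.0e5),
            "ECC injection":      (120.0, 2.0e5),
            "steam condensation": (30.0, 2.0e5),
        }
        t_char = 100.0  # characteristic time of the scenario phase (s)

        ranking = sorted(((name, (rate / content) * t_char)
                          for name, (rate, content) in processes.items()),
                         key=lambda item: item[1], reverse=True)

        for name, effect in ranking:
            print(f"{name:20s} Omega = {effect:.3f}")
        # Processes with the largest Omega dominate the fractional change of the state variable.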

  20. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.

  1. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks, and register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment, because this requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
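
    To make the spreadsheet-to-register-description step concrete, the sketch below reads a small register table and emits a simplified, IP-XACT-like XML fragment. The column names and XML tags are assumptions for illustration; real IP-XACT is a richer, schema-defined format, and the subsequent register-model and test-bench generation steps are not shown.

        import csv
        import io
        import xml.etree.ElementTree as ET

        # A two-register "spreadsheet" kept inline as CSV for the example.
        sheet = io.StringIO(
            "name,offset,width,access,reset\n"
            "CTRL,0x00,32,read-write,0x00000000\n"
            "STATUS,0x04,32,read-only,0x00000001\n")

        root = ET.Element("registerMap", name="example_block")
        for row in csv.DictReader(sheet):
            reg = ET.SubElement(root, "register")
            for tag in ("name", "offset", "width", "access", "reset"):
                ET.SubElement(reg, tag).text = row[tag]

        print(ET.tostring(root, encoding="unicode"))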

  2. HIV / AIDS prevalence testing - merits, methodology and outcomes ...

    African Journals Online (AJOL)

    HIV / AIDS prevalence testing - merits, methodology and outcomes of a survey conducted at a large mining organisation in South Africa. ... These baseline prevalence data also provide an opportunity for monitoring of proposed interventions using cross-sectional surveys at designated intervals in the future. South African ...

  3. Performance Testing Methodology for Safety-Critical Programmable Logic Controller

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Ji Hyeon; Kim, Sung Ho; Sohn, Se Do

    2009-01-01

    The Programmable Logic Controller (PLC) for use in nuclear power plant safety-related applications is being developed and tested for the first time in Korea. This safety-related PLC is being developed to the requirements of regulatory guidelines and industry standards for safety systems. To confirm that the quality of the developed PLC is sufficient for use in safety-critical systems, document reviews and various product tests were performed on the development documents for S/W, H/W, and V&V. This paper presents the performance testing methodology applied to the PLC platform by KOPEC and its effectiveness.

  4. Metodologia de rating em cooperativas agropecuárias: um estudo de caso Rating methodology in agricultural cooperatives: a case study

    Directory of Open Access Journals (Sweden)

    Davi Rogério de Moura Costa

    2009-12-01

    … information asymmetry between cooperatives and financial market agents. To this end, a rating methodology applied to agricultural cooperatives in Brazil is developed, to reduce the moral hazard and adverse selection that create business and contractual organizational inefficiency between cooperatives and financial market agents. The 'five C' method was used, which considers organizational financial capacity, corporate governance characteristics, relations with members, and commodity market characteristics, among others; a weight and an evaluation were associated with each variable. The result was applied to an agricultural cooperative case study, which concluded that the methodology is applicable and that the results, evaluations and weights could be discussed in cooperative rating committees. As a final consideration, an agenda is discussed for new applications of this methodology, for testing it in other agricultural cooperative organizations and for consolidating it as a market financial information signal.

  5. Methodology for testing subcomponents; background and motivation for subcomponent testing of wind turbine rotor blades

    DEFF Research Database (Denmark)

    Antoniou, Alexandros; Branner, Kim; Lekou, D.J.

    2016-01-01

    This report aims to provide an overview of the design methodology followed by wind turbine blade structural designers, along with the testing procedure on full scale blades which are followed by testing laboratories for blade manufacturers as required by the relevant standards and certification...... bodies’ recommendations for design and manufacturing verification. The objective of the report is not to criticize the design methodology or testing procedure and the standards thereof followed in the wind energy community, but to identify those items offered by state of the art structural design tools...... investigations performed are based on the INNWIND.EU reference 10MW horizontal axis wind turbine [1]. The structural properties and material and layout definition used within IRPWIND are defined in the INNWIND.EU report [2]. The layout of the report includes a review of the structural analysis models used...

  6. Applying Lean Six Sigma methodology to reduce cesarean section rate.

    Science.gov (United States)

    Chai, Ze-Ying; Hu, Hua-Min; Ren, Xiu-Ling; Zeng, Bao-Jin; Zheng, Ling-Zhi; Qi, Feng

    2017-06-01

    This study aims to reduce cesarean section rate and increase rate of vaginal delivery. By using Lean Six Sigma (LSS) methodology, the cesarean section rate was investigated and analyzed through a 5-phase roadmap consisting of Define, Measure, Analyze, Improve, and Control. The principal causes of cesarean section were identified, improvement measures were implemented, and the rate of cesarean section before and after intervention was compared. After patients with a valid medical reason for cesarean were excluded, the main causes of cesarean section were maternal request, labor pain, parturient women assessment, and labor observation. A series of measures was implemented, including an improved parturient women assessment system, strengthened pregnancy nutrition guidance, implementation of painless labor techniques, enhanced midwifery team building, and promotion of childbirth-assist skills. Ten months after introduction of the improvement measures, the cesarean section rate decreased from 41.83% to 32.00%, and the Six Sigma score (ie, Z value) increased from 1.706 to 1.967 (P < .001). LSS is an effective way to reduce the rate of cesarean section. © 2016 John Wiley & Sons, Ltd.
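
    The sigma scores quoted above are consistent with the common Lean Six Sigma conversion Z = Phi^-1(1 - p) + 1.5, where p is the defect proportion (here the cesarean rate) and 1.5 is the conventional long-term shift; the snippet below reproduces 1.706 and 1.967 from 41.83% and 32.00%. This is an assumption about how the score was computed, offered as a plausibility check rather than as the authors' calculation.

        from scipy.stats import norm

        for p in (0.4183, 0.3200):
            z = norm.ppf(1.0 - p) + 1.5   # short-term sigma score with the 1.5-sigma shift
            print(f"cesarean rate = {p:.2%}  ->  Z = {z:.3f}")
        # Prints roughly Z = 1.706 for 41.83% and Z = 1.967 for 32.00%, matching the record.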

  7. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    Science.gov (United States)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined, and NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  8. Methodological studies for long range environmental gamma rate survey in Brazil

    International Nuclear Information System (INIS)

    Souza, Elder M.; Wasserman, Maria Angelica V.; Rochedo, Elaine R. R.

    2009-01-01

    The objective of this work is to support the establishment of a methodology for gamma radiation surveys over large areas in order to estimate public exposure to natural background radiation in Brazil. In a first stage, two different sites close to large water bodies were chosen: Guanabara Bay, RJ, and the Amazon River close to Santarem, PA. Early results were similar for over-water surveys regardless of the type of water body. Dose rates over land are higher than those over water, due to the natural radioactivity of soil, pavements and other building materials. In this study the focus was on the variability of measurements performed in the same area and the variability among different types of area, including roads and urbanized environments. Measurements were performed over several areas, including roads and towns in Para, Bahia, Rio de Janeiro and Minas Gerais, by car and on boats, using an AT6101C Spectral Radiation Scanner. Differences were detected between areas, with roads generally presenting lower dose rates than highly urbanized areas; for roads close to granite rocks and mountains, dose rates are higher than those at both coastal areas and inland lowlands. Large towns present large variability, with individual measurements close to average dose rates from anomalous uranium sites. The results will be used to derive a methodology for assessing background radiation exposure for the Brazilian population. It can be concluded that surveys are to be based on population distribution grids rather than on a simple area-based grid, owing to both the uneven population distribution and the variability of external dose rates throughout the Brazilian territory. (author)

  9. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional

  10. Buffer Construction Methodology in Demonstration Test For Cavern Type Disposal Facility

    International Nuclear Information System (INIS)

    Yoshihiro, Akiyama; Takahiro, Nakajima; Katsuhide, Matsumura; Kenji, Terada; Takao, Tsuboya; Kazuhiro, Onuma; Tadafumi, Fujiwara

    2009-01-01

    A number of studies concerning a cavern-type disposal facility have been carried out for the disposal of low-level radioactive waste generated mainly by power plant decommissioning in Japan. The disposal facility is composed of an engineered barrier system with a concrete pit and bentonite buffer, and is planned to be constructed at a sub-surface depth of 50-100 meters. While previous studies have mainly used laboratory and mock-up tests, we conducted a demonstration test in a full-size cavern. The main objectives of the test were to study the construction methodology and to confirm the quality of the engineered barrier system. The demonstration test was planned as the construction of a full-scale mock-up; this paper focuses on the buffer construction test, carried out to evaluate the construction methodology and quality control. In this test, the bentonite material was compacted in situ to 1.6 Mg/m3 by a large vibrating roller. Through the construction of the buffer part, a density of 1.6 Mg/m3 was achieved, and data on workability and quality were collected. (authors)

  11. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    Science.gov (United States)

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the most well defined methodology to characterize various products. However, in practical terms, intensive training in descriptive analysis has been recognized as a serious defect. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d'A for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
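
    The abstract does not give the computation behind the applicability-magnitude index d'A, so the sketch below shows only the generic signal-detection form of a d'-type index (z-transformed hit rate minus z-transformed false-alarm rate) with invented counts; the published Thurstonian derivation of d'A may differ.

        # Generic signal-detection d' from hit and false-alarm proportions; counts are made up.
        from statistics import NormalDist

        def d_prime(hits, signal_trials, false_alarms, noise_trials):
            """d' = z(hit rate) - z(false-alarm rate), with a 0.5 correction to avoid 0/1 rates."""
            h = (hits + 0.5) / (signal_trials + 1.0)
            f = (false_alarms + 0.5) / (noise_trials + 1.0)
            z = NormalDist().inv_cdf
            return z(h) - z(f)

        # Example: an attribute judged "applicable" for 42 of 50 target-sample ratings
        # and for 12 of 50 comparison-sample ratings.
        print(f"d' = {d_prime(42, 50, 12, 50):.2f}")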

  12. Reference Performance Test Methodology for Degradation Assessment of Lithium-Sulfur Batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel-Ioan; Purkayastha, Rajlakshmi

    2018-01-01

    Lithium-Sulfur (Li-S) is an emerging battery technology receiving a growing amount of attention due to its potentially high gravimetric energy density, safety, and low production cost. However, there are still some obstacles preventing its swift commercialization. Li-S batteries are driven by different electrochemical processes than commonly used Lithium-ion batteries, which often results in very different behavior. Therefore, the testing and modeling of these systems have to be adjusted to reflect their unique behavior and to prevent possible bias. A methodology for a Reference Performance Test (RPT) for Li-S batteries is proposed in this study to point out Li-S battery features, provide guidance to users on how to deal with them, and possibly lead to standardization. The proposed test methodology is demonstrated for 3.4 Ah Li-S cells aged under different conditions.

  13. 49 CFR 1109.4 - Mandatory mediation in rate cases to be considered under the stand-alone cost methodology.

    Science.gov (United States)

    2010-10-01

    § 1109.4 Mandatory mediation in rate cases to be considered under the stand-alone cost methodology. (a) A ... methodology must engage in non-binding mediation of its dispute with the railroad upon filing a formal ...

  14. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method.

  15. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul Nationl Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method.

  16. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected return models are widely used in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies using Brazilian stock market data cover only the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps – time-series and cross-sectional regressions – with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model over the 3-factor model, and of the 3-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly because of the novelty of the methodology in the local market and because this subject is still incipient and controversial in the Brazilian academic environment.

  17. 26 CFR 1.1274-4 - Test rate.

    Science.gov (United States)

    2010-04-01

    ... applicable Federal rate for the debt instrument is a foreign currency rate of interest that is analogous to... test rate of interest—(1) In general—(i) Test rate is the 3-month rate. Except as provided in paragraph (a)(2) of this section, the test rate of interest for a debt instrument issued in consideration for...

  18. Test Methodologies for Hydrogen Sensor Performance Assessment: Chamber vs. Flow Through Test Apparatus: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hartmann, Kevin S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Schmidt, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cebolla, Rafeal O [Joint Research Centre, Petten, the Netherlands; Weidner, Eveline [Joint Research Centre, Petten, the Netherlands; Bonato, Christian [Joint Research Centre, Petten, the Netherlands

    2017-11-06

    Certification of hydrogen sensors to standards often prescribes using large-volume test chambers [1, 2]. However, feedback from stakeholders such as sensor manufacturers and end-users indicates that chamber test methods are often viewed as too slow and expensive for routine assessment. Flow through test methods are potentially an efficient, cost-effective alternative for sensor performance assessment. A large number of sensors can be simultaneously tested, in series or in parallel, with an appropriate flow through test fixture. The recent development of sensors with response times of less than 1 s mandates improvements in equipment and methodology to properly capture the performance of this new generation of fast sensors; flow methods are a viable approach for accurate response and recovery time determinations, but there are potential drawbacks. According to ISO 26142 [1], flow through test methods may not properly simulate ambient applications. In chamber test methods, gas transport to the sensor can be dominated by diffusion, which is viewed by some users as mimicking deployment in rooms and other confined spaces. Alternatively, in flow through methods, forced flow transports the gas to the sensing element. The advective flow dynamics may induce changes in the sensor behaviour relative to the quasi-quiescent condition that may prevail in chamber test methods. One goal of the current activity in the JRC and NREL sensor laboratories [3, 4] is to develop a validated flow through apparatus and methods for hydrogen sensor performance testing. In addition to minimizing the impact on sensor behaviour induced by differences in flow dynamics, challenges associated with flow through methods include the ability to control environmental parameters (humidity, pressure and temperature) during the test and changes in the test gas composition induced by chemical reactions with upstream sensors. Guidelines on flow through test apparatus design and protocols for the evaluation of

  19. Auditing HIV Testing Rates across Europe

    DEFF Research Database (Denmark)

    Raben, D; Mocroft, A; Rayment, M

    2015-01-01

    European guidelines recommend the routine offer of an HIV test in patients with a number of AIDS-defining and non-AIDS conditions believed to share an association with HIV; so called indicator conditions (IC). Adherence with this guidance across Europe is not known. We audited HIV testing behaviour ... audits from 23 centres, representing 7037 patients. The median test rate across audits was 72% (IQR 32-97), lowest in Northern Europe (median 44%, IQR 22-68%) and highest in Eastern Europe (median 99%, IQR 86-100). Uptake of testing was close to 100% in all regions. The median HIV+ rate was 0.9% (IQR 0.0-4.9), with 29 audits (60.4%) having an HIV+ rate >0.1%. After adjustment, there were no differences between regions of Europe in the proportion with >0.1% testing positive (global p = 0.14). A total of 113 patients tested HIV+. Applying the observed rates of testing HIV+ within individual ICs and regions to all...

  20. Postal auditing methodology used to find out the performance of high rate brachytherapy equipment

    International Nuclear Information System (INIS)

    Morales, J.A.; Campa, R.

    1998-01-01

    This work describes results from a methodology implemented at the Secondary Laboratory for Dosimetric Calibration at CPHR, used to check the performance of high dose rate brachytherapy equipment with cesium-137 or cobalt-60 sources

  1. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S [CISE SpA, Milan (Italy); Crudeli, R [ENEL SpA, Milan (Italy)

    1999-12-31

    CISE and ENEL have been involved for years in a creep-test miniaturization project aimed at obtaining a quasi non-destructive test with the same reliability as the standard creep test. The goal can be reached with the small punch creep test, which gathers all the required characteristics: quasi non-destructive disk specimens are extracted from either the external or the internal side of components, then accurately machined and tested on a small and inexpensive apparatus. CISE has developed a complete small punch creep procedure that involves a dedicated test facility and correlation laws comparable with the more widely used isostress methodology for residual life evaluation of ex-service high temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance management. More work is needed to validate the small punch methodology and to calibrate the relationship for the most widely used high temperature structural materials. First results of a comparative study on ex-service ASTM A355 P12 pipe material are presented, together with a description of the small punch apparatus built at CISE. (orig.) 6 refs.

  2. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S. [CISE SpA, Milan (Italy); Crudeli, R. [ENEL SpA, Milan (Italy)

    1998-12-31

    CISE and ENEL have been involved for years in a creep-test miniaturization project aimed at obtaining a quasi non-destructive test with the same reliability as the standard creep test. The goal can be reached with the small punch creep test, which gathers all the required characteristics: quasi non-destructive disk specimens are extracted from either the external or the internal side of components, then accurately machined and tested on a small and inexpensive apparatus. CISE has developed a complete small punch creep procedure that involves a dedicated test facility and correlation laws comparable with the more widely used isostress methodology for residual life evaluation of ex-service high temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance management. More work is needed to validate the small punch methodology and to calibrate the relationship for the most widely used high temperature structural materials. First results of a comparative study on ex-service ASTM A355 P12 pipe material are presented, together with a description of the small punch apparatus built at CISE. (orig.) 6 refs.

  3. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method. In the second year of this study, sensitivity analyses of the component failure factors were performed on the basis of the assessment methodologies of the first year, and the interaction modeling of the STI and AOT was quantified. In addition, the reliability assessment methodology for the diesel generator was reviewed and applied to the PSA code.

  4. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang

    1997-07-01

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method. In the second year of this study, sensitivity analyses of the component failure factors were performed on the basis of the assessment methodologies of the first year, and the interaction modeling of the STI and AOT was quantified. In addition, the reliability assessment methodology for the diesel generator was reviewed and applied to the PSA code.

  5. Containment leakage rate testing requirements

    International Nuclear Information System (INIS)

    Arndt, E.G.

    1992-01-01

    This report presents the status of several documents under revision or development that provide requirements and guidance for testing nuclear power plant containment systems for leakage rates. These documents include the general revision to 10 CFR Part 50, Appendix J; the regulatory guide affiliated with the revision to Appendix J; the national standard that the regulatory guide endorses, ANSI/ANS-56.8, 'Containment System Leakage Rate Testing Requirements'; and the draft industry Licensing Topical Report, 'Standardized Program for Primary Containment Integrity Testing'. The actual or potential relationships between these documents are also explored

  6. Selected hydraulic test analysis techniques for constant-rate discharge tests

    International Nuclear Information System (INIS)

    Spane, F.A. Jr.

    1993-03-01

    The constant-rate discharge test is the principal field method used in hydrogeologic investigations for characterizing the hydraulic properties of aquifers. To implement this test, the aquifer is stressed by withdrawing ground water from a well, by using a downhole pump. Discharge during the withdrawal period is regulated and maintained at a constant rate. Water-level response within the well is monitored during the active pumping phase (i.e., drawdown) and during the subsequent recovery phase following termination of pumping. The analysis of drawdown and recovery response within the stress well (and any monitored, nearby observation wells) provides a means for estimating the hydraulic properties of the tested aquifer, as well as discerning formational and nonformational flow conditions (e.g., wellbore storage, wellbore damage, presence of boundaries, etc.). Standard analytical methods that are used for constant-rate pumping tests include both log-log type-curve matching and semi-log straight-line methods. This report presents a current "state-of-the-art" review of selected transient analysis procedures for constant-rate discharge tests. Specific topics examined include: analytical methods for constant-rate discharge tests conducted within confined and unconfined aquifers; effects of various nonideal formation factors (e.g., anisotropy, hydrologic boundaries) and well construction conditions (e.g., partial penetration, wellbore storage) on constant-rate test response; and the use of pressure derivatives in diagnostic analysis for the identification of specific formation, well construction, and boundary conditions
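
    As one concrete example of the semi-log straight-line analysis mentioned above, the Cooper-Jacob approximation estimates transmissivity from the drawdown increase per log cycle of time and storativity from the zero-drawdown intercept time; the discharge, drawdown and radius values below are illustrative only, not from the report.

        # Cooper-Jacob semi-log straight-line analysis of a constant-rate discharge test.
        import math

        def cooper_jacob_transmissivity(Q, ds_per_log_cycle):
            """T = 2.303*Q / (4*pi*delta_s), with delta_s the drawdown per log10 cycle of time."""
            return 2.303 * Q / (4.0 * math.pi * ds_per_log_cycle)

        def cooper_jacob_storativity(T, t0, r):
            """S = 2.25*T*t0 / r^2, with t0 the zero-drawdown intercept time of the fitted line."""
            return 2.25 * T * t0 / r**2

        Q = 1.5e-2          # pumping rate, m^3/s (placeholder)
        ds = 0.85           # drawdown per log cycle, m (read from the straight-line fit)
        T = cooper_jacob_transmissivity(Q, ds)
        S = cooper_jacob_storativity(T, t0=30.0, r=25.0)   # t0 in s, observation radius in m
        print(f"T = {T:.2e} m^2/s, S = {S:.1e}")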

  7. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Energy Technology Data Exchange (ETDEWEB)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes [Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor (Malaysia)

    2015-05-15

    For a vessel operating in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft is a new methodology for self-propulsion tests to track the fuel saving in real time. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system to track the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, engine rotating speed, and thrust and torque of the propeller, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. The self-propulsion tests in calm waters were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
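
    A hedged sketch of the standard open-water propeller relations that such a simulator typically evaluates at each time step (thrust and torque from the non-dimensional KT and KQ coefficients); the coefficient and operating values are placeholders, not the paper's model.

        # Open-water propeller relations used in self-propulsion estimates.
        RHO_SEA = 1025.0                    # sea water density, kg/m^3

        def propeller_thrust(KT, n, D, rho=RHO_SEA):
            """T = KT * rho * n^2 * D^4, with n in rev/s and D in m."""
            return KT * rho * n**2 * D**4

        def propeller_torque(KQ, n, D, rho=RHO_SEA):
            """Q = KQ * rho * n^2 * D^5."""
            return KQ * rho * n**2 * D**5

        n, D = 10.0, 0.25                   # model-scale shaft speed and diameter (hypothetical)
        print(f"thrust = {propeller_thrust(0.20, n, D):.1f} N, "
              f"torque = {propeller_torque(0.030, n, D):.2f} N*m")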

  8. Covariance methodology applied to 35S disintegration rate measurements by the CIEMAT/NIST method

    International Nuclear Information System (INIS)

    Koskinas, M.F.; Nascimento, T.S.; Yamazaki, I.M.; Dias, M.S.

    2014-01-01

    The Nuclear Metrology Laboratory (LMN) at IPEN is carrying out measurements in an LSC (Liquid Scintillation Counting) system, applying the CIEMAT/NIST method. In this context 35S is an important radionuclide for medical applications, and it is difficult to standardize by other primary methods due to its low beta-ray energy. CIEMAT/NIST is a standard technique used by most metrology laboratories in order to improve accuracy and speed up beta emitter standardization. The focus of the present work was to apply the covariance methodology for determining the overall uncertainty in the 35S disintegration rate. All partial uncertainties involved in the measurements were considered, taking into account all possible correlations between each pair of them. - Highlights: ► 35S disintegration rate measured in a Liquid Scintillation system using the CIEMAT/NIST method. ► Covariance methodology applied to the overall uncertainty in the 35S disintegration rate. ► Monte Carlo simulation was applied to determine 35S activity in the 4πβ(PC)-γ coincidence system
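
    A generic illustration of the covariance methodology: the combined variance of a derived quantity is J C Jᵀ, where J is the vector of sensitivity coefficients and C the covariance matrix of the inputs. The sensitivity coefficients, uncertainties and correlation below are placeholders, not the values used for the 35S standardization.

        # Generic covariance propagation u_c^2 = J C J^T for a derived quantity.
        import numpy as np

        def combined_variance(jacobian, covariance):
            J = np.asarray(jacobian, dtype=float)
            C = np.asarray(covariance, dtype=float)
            return float(J @ C @ J.T)

        # Example: A = N / (eps * m), so the relative sensitivity coefficients are (1, -1, -1).
        J = np.array([1.0, -1.0, -1.0])
        u = np.array([0.001, 0.003, 0.0005])     # relative standard uncertainties (placeholders)
        r = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.3],
                      [0.0, 0.3, 1.0]])          # assumed correlation coefficients
        C = np.outer(u, u) * r                   # covariance matrix
        u_c = combined_variance(J, C) ** 0.5
        print(f"relative combined standard uncertainty: {u_c:.4%}")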

  9. High rate loading tests and impact tests of concrete and reinforcement

    International Nuclear Information System (INIS)

    Takeda, J.I.; Tachikawa, H.; Fujimoto, K.

    1982-01-01

    The responses of reinforced concrete structural members and structures subjected to impact or impulsive loadings are affected by the behavior of the constituent concrete and reinforcement, which is a synthesis of the rate effects and the contribution of propagating stress waves within them. The rate effects and the contribution of stress waves do not vary in magnitude in the same way with the speed of impact or impulsive loading. Therefore, the rate effects mentioned above should be obtained by tests that minimize the effect of stress waves (high rate loading tests). This paper deals with the testing techniques for high rate loadings and impact, and also reports the main results of these tests. (orig.) [de]

  10. A methodology to investigate size scale effects in crystalline plasticity using uniaxial compression testing

    International Nuclear Information System (INIS)

    Uchic, Michael D.; Dimiduk, Dennis M.

    2005-01-01

    A methodology for performing uniaxial compression tests on samples having micron-size dimensions is presented. Sample fabrication is accomplished using focused ion beam milling to create cylindrical samples of uniform cross-section that remain attached to the bulk substrate at one end. Once fabricated, samples are tested in uniaxial compression using a nanoindentation device outfitted with a flat tip, and a stress-strain curve is obtained. The methodology can be used to examine the plastic response of samples of different sizes that are from the same bulk material. In this manner, dimensional size effects at the micron scale can be explored for single crystals, using a readily interpretable test that minimizes imposed stretch and bending gradients. The methodology was applied to a single-crystal Ni superalloy and a transition from bulk-like to size-affected behavior was observed for samples 5 μm in diameter and smaller

  11. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    Science.gov (United States)

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
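
    The "worst score counts" rule described above is simple to express in code; the box name and item ratings below are invented for illustration.

        # "Worst score counts": the quality score of a COSMIN box is the lowest rating of its items.
        RANK = {"excellent": 3, "good": 2, "fair": 1, "poor": 0}

        def box_quality(item_ratings):
            """Return the lowest rating in a box (worst score counts)."""
            return min(item_ratings, key=lambda rating: RANK[rating])

        reliability_box = ["excellent", "good", "excellent", "fair", "good"]
        print(box_quality(reliability_box))   # -> 'fair'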

  12. Client Perceptions of Helpfulness in Therapy: a Novel Video-Rating Methodology for Examining Process Variables at Brief Intervals During a Single Session.

    Science.gov (United States)

    Cocklin, Alexandra A; Mansell, Warren; Emsley, Richard; McEvoy, Phil; Preston, Chloe; Comiskey, Jody; Tai, Sara

    2017-11-01

    The value of clients' reports of their experiences in therapy is widely recognized, yet quantitative methodology has rarely been used to measure clients' self-reported perceptions of what is helpful over a single session. A video-rating method was developed to gather data at brief intervals using process measures of client perceived experience and standardized measures of working alliance (Session Rating Scale; SRS). Data were collected over the course of a single video-recorded session of cognitive therapy (Method of Levels Therapy; Carey, 2006; Mansell et al., 2012). We examined the acceptability and feasibility of the methodology and tested the concurrent validity of the measure by utilizing theory-led constructs. Eighteen therapy sessions were video-recorded and clients each rated a 20-minute session of therapy at two-minute intervals using repeated measures. A multi-level analysis was used to test for correlations between perceived levels of helpfulness and client process variables. The design proved to be feasible. Concurrent validity was borne out through high correlations between constructs. A multi-level regression examined the independent contributions of client process variables to client perceived helpfulness. Client perceived control (b = 0.39, 95% CI .05 to 0.73), the ability to talk freely (b = 0.30, SE = 0.11, 95% CI .09 to 0.51) and therapist approach (b = 0.31, SE = 0.14, 95% CI .04 to 0.57) predicted client-rated helpfulness. We identify a feasible and acceptable method for studying continuous measures of helpfulness and their psychological correlates during a single therapy session.

  13. Methodology and application of 13C breath test in gastroenterology practice

    International Nuclear Information System (INIS)

    Yan Weili; Jiang Yibin

    2002-01-01

    The 13C breath test has been widely used in nutrition, pharmacology and gastroenterology research for properties such as safety and non-invasiveness. The author describes the principle and methodology of the 13C breath test and its application in the detection of Helicobacter pylori infection in the stomach and of small bowel bacterial overgrowth, and in the measurement of gastric emptying, pancreatic exocrine function and liver function with various substrates

  14. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    Science.gov (United States)

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

    Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  15. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Cho, Jae Seon; Huh, Chang Wook; Kim, Do Hyoung; Kim, Ju Youl; Kim, Yoon Ik; Yang, Hui Chang; Park, Kang Min [Seoul National Univ., Seoul (Korea, Republic of)

    1998-03-15

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method. Sensitivity analyses of the component failure factors were performed on the basis of the assessment methodologies, and the interaction modeling of the STI and AOT was quantified. The reliability assessment methodology for the diesel generator was reviewed and applied to the PSA code. A qualitative assessment of the STI/AOT of the RPS/ESFAS, the most important safety systems in the nuclear power plant, was performed.

  16. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    Science.gov (United States)

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  17. Comparison of two bond strength testing methodologies for bilayered all-ceramics

    NARCIS (Netherlands)

    Dundar, Mine; Ozcan, Mutlu; Gokce, Bulent; Comlekoglu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    Objectives. This study compared the shear bond strength (SBS) and microtensile (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Methods. Four different ceramic veneer/core combinations, three of which were feldspathic and the other a fluor-apatite to

  18. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and obtaining certification of the Type B containers, allowing usage in support of the United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  19. A methodology for the parametric modelling of the flow coefficients and flow rate in hydraulic valves

    International Nuclear Information System (INIS)

    Valdés, José R.; Rodríguez, José M.; Saumell, Javier; Pütz, Thomas

    2014-01-01

    Highlights: • We develop a methodology for the parametric modelling of flow in hydraulic valves. • We characterize the flow coefficients with a generic function with two parameters. • The parameters are derived from CFD simulations of the generic geometry. • We apply the methodology to two cases from the automotive brake industry. • We validate by comparing with CFD results varying the original dimensions. - Abstract: The main objective of this work is to develop a methodology for the parametric modelling of the flow rate in hydraulic valve systems. This methodology is based on the derivation, from CFD simulations, of the flow coefficient of the critical restrictions as a function of the Reynolds number, using a generalized square root function with two parameters. The methodology is then demonstrated by applying it to two completely different hydraulic systems: a brake master cylinder and an ABS valve. This type of parametric valve models facilitates their implementation in dynamic simulation models of complex hydraulic systems
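
    The highlights do not state the exact two-parameter square-root law, so the sketch below uses one plausible form (an assumed saturation function with parameters Cd_inf and Re_c) together with the standard turbulent orifice equation for the flow rate; all numeric values are placeholders, not results from the paper.

        # Sketch of a two-parameter flow-coefficient law and the resulting flow rate (assumed form).
        import math

        def discharge_coefficient(Re, Cd_inf, Re_c):
            """Cd tends to Cd_inf at high Re and scales as sqrt(Re) at low Re (assumed shape)."""
            return Cd_inf * math.sqrt(Re / (Re + Re_c))

        def flow_rate(area, dp, rho, Re, Cd_inf=0.7, Re_c=50.0):
            """Orifice equation Q = Cd * A * sqrt(2*dp/rho)."""
            return discharge_coefficient(Re, Cd_inf, Re_c) * area * math.sqrt(2.0 * dp / rho)

        # Placeholder restriction: 3 mm^2 orifice, 5 bar pressure drop, brake-fluid-like density.
        print(f"Q = {flow_rate(area=3e-6, dp=5e5, rho=850.0, Re=200.0):.2e} m^3/s")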

  20. Nutrients interaction investigation to improve Monascus purpureus FTC5391 growth rate using Response Surface Methodology and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Mohamad, R.

    2013-01-01

    Full Text Available Aims: Two vital factors, certain environmental conditions and nutrients as a source of energy, are required for successful growth and reproduction of microorganisms. Manipulation of nutritional requirements is the simplest and most effective strategy to stimulate and enhance the activity of microorganisms. Methodology and Results: In this study, response surface methodology (RSM) and artificial neural network (ANN) were employed to optimize the carbon and nitrogen sources in order to improve the growth rate of Monascus purpureus FTC5391, a new local isolate. The best models for optimization of growth rate were a multilayer full feed-forward incremental back propagation network, and a modified response surface model using backward elimination. The optimum condition for cell mass production was: sucrose 2.5%, yeast extract 0.045%, casamino acid 0.275%, sodium nitrate 0.48%, potato starch 0.045%, dextrose 1%, potassium nitrate 0.57%. The experimental cell mass production using this optimal condition was 21 mg/plate/12 days, which was 2.2-fold higher than the standard condition (sucrose 5%, yeast extract 0.15%, casamino acid 0.25%, sodium nitrate 0.3%, potato starch 0.2%, dextrose 1%, potassium nitrate 0.3%). Conclusion, significance and impact of study: The results of RSM and ANN showed that all carbon and nitrogen sources tested had a significant effect on growth rate (P-value < 0.05). In addition, the use of RSM and ANN alongside each other provided a proper growth prediction model.
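
    As a minimal illustration of the response-surface step only (not of the ANN), the sketch below fits a second-order polynomial surface to a small central-composite set of invented coded-factor observations by least squares; the factor coding, design points and responses are placeholders, not the Monascus data.

        # Minimal second-order response-surface fit for two coded factors.
        import numpy as np

        def fit_quadratic_surface(x1, x2, y):
            """Fit y = b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2 by least squares."""
            X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        # Invented central-composite design: factorial, axial and centre points.
        x1 = np.array([-1, -1, 1, 1, -1.414, 1.414, 0, 0, 0, 0, 0], dtype=float)   # coded carbon level
        x2 = np.array([-1, 1, -1, 1, 0, 0, -1.414, 1.414, 0, 0, 0], dtype=float)   # coded nitrogen level
        y = np.array([8.0, 11.0, 10.0, 15.0, 9.0, 14.0, 9.5, 13.5, 18.0, 17.5, 18.5])  # response
        print(fit_quadratic_surface(x1, x2, y))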

  1. Preloading To Accelerate Slow-Crack-Growth Testing

    Science.gov (United States)

    Gyekenyesi, John P.; Choi, Sung R.; Pawlik, Ralph J.

    2004-01-01

    An accelerated-testing methodology has been developed for measuring the slow-crack-growth (SCG) behavior of brittle materials. Like the prior methodology, the accelerated-testing methodology involves dynamic fatigue (constant stress-rate) testing, in which a load or a displacement is applied to a specimen at a constant rate. SCG parameters or life prediction parameters needed for designing components made of the same material as that of the specimen are calculated from the relationship between (1) the strength of the material as measured in the test and (2) the applied stress rate used in the test. Despite its simplicity and convenience, dynamic fatigue testing as practiced heretofore has one major drawback: it is extremely time-consuming, especially at low stress rates. The present accelerated methodology reduces the time needed to test a specimen at a given rate of applied load, stress, or displacement. Instead of starting the test from zero applied load or displacement as in the prior methodology, one preloads the specimen and increases the applied load at the specified rate (see Figure 1). One might expect the preload to alter the results of the test and indeed it does, but fortunately, it is possible to account for the effect of the preload in interpreting the results. The accounting is done by calculating the normalized strength (defined as the strength in the presence of preload divided by the strength in the absence of preload) as a function of (1) the preloading factor (defined as the preload stress divided by the strength in the absence of preload) and (2) an SCG parameter, denoted n, that is used in a power-law crack-speed formulation. Figure 2 presents numerical results from this theoretical calculation.
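
    A short sketch of the two calculations described above, under the usual power-law crack-velocity assumption: the SCG parameter n follows from the slope of log strength versus log stress rate, and the normalized strength for a preloading factor alpha takes the form (1 + alpha^(n+1))^(1/(n+1)). The strength data below are invented, and the formula should be read as a sketch of the published theory rather than a restatement of it.

        # Dynamic-fatigue analysis: SCG exponent from strength vs. stress rate, plus preload effect.
        import numpy as np

        def scg_exponent(stress_rates, strengths):
            """The slope of log10(strength) vs log10(stress rate) equals 1/(n+1)."""
            slope, _ = np.polyfit(np.log10(stress_rates), np.log10(strengths), 1)
            return 1.0 / slope - 1.0

        def normalized_strength(alpha, n):
            """Strength with preload / strength without, alpha = preload / no-preload strength."""
            return (1.0 + alpha ** (n + 1.0)) ** (1.0 / (n + 1.0))

        rates = np.array([0.05, 0.5, 5.0, 50.0])        # applied stress rates, MPa/s (invented)
        sigma = np.array([410.0, 450.0, 495.0, 545.0])  # measured strengths, MPa (invented)
        n = scg_exponent(rates, sigma)
        print(f"n ~ {n:.1f}, normalized strength at 90% preload: {normalized_strength(0.9, n):.3f}")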

  2. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  3. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.

  4. 42 CFR 413.220 - Methodology for calculating the per-treatment base rate under the ESRD prospective payment system...

    Science.gov (United States)

    2010-10-01

    § 413.220 Methodology for calculating the per-treatment base rate under the ESRD prospective payment system effective January 1, 2011. (a) Data sources. The methodology for determining the per treatment base rate under the ESRD prospective payment system...

  5. Latent Trait Theory Applications to Test Item Bias Methodology. Research Memorandum No. 1.

    Science.gov (United States)

    Osterlind, Steven J.; Martois, John S.

    This study discusses latent trait theory applications to test item bias methodology. A real data set is used in describing the rationale and application of the Rasch probabilistic model item calibrations across various ethnic group populations. A high school graduation proficiency test covering reading comprehension, writing mechanics, and…

  6. Development of Testing Methodologies to Evaluate Postflight Locomotor Performance

    Science.gov (United States)

    Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Warren, L. E.; Bloomberg, J. J.

    2006-01-01

    Crewmembers experience locomotor and postural instabilities during ambulation on Earth following their return from space flight. Gait training programs designed to facilitate recovery of locomotor function following a transition to a gravitational environment need to be accompanied by relevant assessment methodologies to evaluate their efficacy. The goal of this paper is to demonstrate the operational validity of two tests of locomotor function that were used to evaluate performance after long duration space flight missions on the International Space Station (ISS).

  7. Methodology for predicting the life of waste-package materials, and components using multifactor accelerated life tests

    International Nuclear Information System (INIS)

    Thomas, R.E.; Cote, R.W.

    1983-09-01

    Accelerated life tests are essential for estimating the service life of waste-package materials and components. A recommended methodology for generating accelerated life tests is described in this report. The objective of the methodology is to define an accelerated life test program that is scientifically and statistically defensible. The methodology is carried out using a select team of scientists and usually requires 4 to 12 man-months of effort. Specific agendas for the successive meetings of the team are included in the report for use by the team manager. The agendas include assignments for the team scientists and a different set of assignments for the team statistician. The report also includes descriptions of factorial tables, hierarchical trees, and associated mathematical models that are proposed as technical tools to guide the efforts of the design team

  8. Methodology to identify risk-significant components for inservice inspection and testing

    International Nuclear Information System (INIS)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.; Kido, C.; Phillips, J.H.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues

  9. Accelerated lifetime testing methodology for lifetime estimation of Lithium-ion batteries used in augmented wind power plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2013-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process can be used for the parametrization of a performance-degradation lifetime model. In the proposed methodology both calendar and cycling lifetime tests are considered since both components are influencing the lifetime of Lithium-ion batteries. The methodology also proposes a lifetime model verification stage, where Lithium-ion battery cells are tested at normal operating conditions using an application...

  10. Testing Strategies and Methodologies for the Max Launch Abort System

    Science.gov (United States)

    Schaible, Dawn M.; Yuchnovicz, Daniel E.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) was tasked to develop an alternate, tower-less launch abort system (LAS) as risk mitigation for the Orion Project. The successful pad abort flight demonstration test in July 2009 of the "Max" launch abort system (MLAS) provided data critical to the design of future LASs, while demonstrating the Agency's ability to rapidly design, build and fly full-scale hardware at minimal cost in a "virtual" work environment. Limited funding and an aggressive schedule presented a challenge for testing of the complex MLAS system. The successful pad abort flight demonstration test was attributed to the project's systems engineering and integration process, which included: a concise definition of, and an adherence to, flight test objectives; a solid operational concept; well defined performance requirements, and a test program tailored to reducing the highest flight test risks. The testing ranged from wind tunnel validation of computational fluid dynamic simulations to component ground tests of the highest risk subsystems. This paper provides an overview of the testing/risk management approach and methodologies used to understand and reduce the areas of highest risk - resulting in a successful flight demonstration test.

  11. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal-to-noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  12. Testing Methodology of Breaking into Secured Storages of Mobile Operational System Google Android

    Directory of Open Access Journals (Sweden)

    Elena Vyacheslavovna Elistratova

    2013-02-01

    Full Text Available A methodology is developed for carrying out tests of breaking into the internal storage of the Google Android mobile operating system in order to detect security threats to personal data.

  13. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    Science.gov (United States)

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC, an acronym for Define, Measure, Analyze, Improve, and Control, was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) a prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  14. Evaluation methodologies for security testing biometric systems beyond technological evaluation

    OpenAIRE

    Fernández Saavedra, María Belén

    2013-01-01

    The main objective of this PhD Thesis is the specification of formal evaluation methodologies for testing the security level achieved by biometric systems when these are working under specific contour conditions. This analysis is conducted through the calculation of the basic technical biometric system performance and its possible variations. To that end, the next two relevant contributions have been developed. The first contribution is the definition of two independent biometric performance ...

  15. Comparison of two bond strength testing methodologies for bilayered all-ceramics.

    Science.gov (United States)

    Dündar, Mine; Ozcan, Mutlu; Gökçe, Bülent; Cömlekoğlu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    2007-05-01

    This study compared the shear bond strength (SBS) and microtensile (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Four different ceramic veneer/core combinations, three of which were feldspathic and the other a fluor-apatite, applied to their respective corresponding cores, namely leucite-reinforced ceramic ((IPS)Empress, Ivoclar), low leucite-reinforced ceramic (Finesse, Ceramco), glass-infiltrated alumina (In-Ceram Alumina, Vita) and lithium disilicate ((IPS)Empress 2, Ivoclar), were used for SBS and MTBS tests. Ceramic cores (N=40, n=10/group for SBS test method, N=5 blocks/group for MTBS test method) were fabricated according to the manufacturers' instructions (for SBS: thickness, 3mm; diameter, 5mm and for MTBS: 10 mm x 10 mm x 2 mm) and ultrasonically cleaned. The veneering ceramics (thickness: 2mm) were vibrated and condensed in stainless steel moulds and fired onto the core ceramic materials. After trying the specimens in the mould for minor adjustments, they were again ultrasonically cleaned and embedded in PMMA. The specimens were stored in distilled water at 37 degrees C for 1 week and bond strength tests were performed in universal testing machines (cross-head speed: 1mm/min). The bond strengths (MPa+/-S.D.) and modes of failure were recorded. Significant differences between the two test methods and all-ceramic types were observed (P<0.05) (2-way ANOVA, Tukey's test and Bonferroni). The mean SBS value for veneering ceramic bonded to lithium disilicate was significantly higher (41+/-8 MPa) than those to low leucite (28+/-4 MPa), glass-infiltrated (26+/-4 MPa) and leucite-reinforced (23+/-3 MPa) ceramics, while the mean MTBS for low leucite ceramic was significantly higher (15+/-2 MPa) than those of leucite (12+/-2 MPa), glass-infiltrated (9+/-1 MPa) and lithium disilicate ceramic (9+/-1 MPa) (ANOVA, P<0.05). Both the testing methodology and the differences in chemical compositions of the core and veneering ceramics

  16. FLECHT low flooding rate cosine test series data report

    International Nuclear Information System (INIS)

    Rosal, E.R.; Hochreiter, L.E.; McGuire, M.F.; Krepinevich, M.C.

    1975-12-01

    The FLECHT Low Flooding Rate Tests were conducted in an improved original FLECHT Test Facility to provide heat transfer coefficient and entrainment data at forced flooding rates of 1 in./sec and below. In addition these tests were performed to supplement parametric effects studied in the original FLECHT program, provide data for reflood model development, repeat original FLECHT tests with new instrumentation and data processing techniques, and to provide data to establish test repeatability. These tests examined the effects of low initial clad temperature, variable stepped and continuously variable flooding rates, housing heat release, run peak power, constant low flooding rates, coolant subcooling, hot and cold channel entrainment, and bundle stored and generated power. Data obtained in sixty four runs which met the test specifications are reported, and include rod clad temperatures, turn around and quench times, heat transfer coefficients, inlet flooding rates, overall mass balances, differential pressures and calculated void fractions in the test section, thimble wall and steam temperatures, exhaust steam and liquid carryover rates, and housing total and rate of heat release

  17. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal-to-noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  18. Accelerated Lifetime Testing Methodology for Lifetime Estimation of Lithium-ion Batteries used in Augmented Wind Power Plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2014-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process were used for the parametrization of a performance-degradation lifetime model, which is able to predict both the capacity fade and the power capability decrease of the selected Lithium-ion battery cells. In the proposed methodology both calendar and cycling lifetime tests were considered since both components are influencing the lifetime of Lithium-ion batteries. Furthermore, the proposed methodology...
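
    A minimal sketch of the kind of performance-degradation model that such accelerated calendar and cycling tests are used to parametrize is given below. The functional forms and every coefficient are illustrative assumptions, not the model or values from the paper.

```python
# Hedged sketch of a capacity-fade model combining calendar and cycling ageing.
# Coefficients a, b, activation energy and exponents are illustrative only.
import math

R = 8.314  # J/(mol*K)

def calendar_fade(t_days, temp_k, a=1e8, ea=50e3, z=0.5):
    """Capacity fade (%) from storage: Arrhenius in temperature, power law in time."""
    return a * math.exp(-ea / (R * temp_k)) * t_days ** z

def cycling_fade(full_cycles, b=0.1, z=0.5):
    """Capacity fade (%) from cycling: power law in equivalent full cycles."""
    return b * full_cycles ** z

def total_fade(t_days, temp_k, full_cycles):
    # Calendar and cycling contributions are assumed additive in this sketch.
    return calendar_fade(t_days, temp_k) + cycling_fade(full_cycles)

print(round(total_fade(365, 298.15, 1000), 2))  # fade (%) after one year and 1000 cycles
```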

  19. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  20. FLECHT low flooding rate skewed test series data report

    International Nuclear Information System (INIS)

    Rosal, E.R.; Conway, C.E.; Krepinevich, M.C.

    1977-05-01

    The FLECHT Low Flooding Rate Tests were conducted in an improved original FLECHT Test Facility to provide heat transfer coefficient and entrainment data at forced flooding rates of 1 in./sec. and with electrically heated rod bundles which had cosine and top skewed axial power profiles. The top-skewed axial power profile test series has now been successfully completed and is here reported. For these tests the rod bundle was enclosed in a low mass cylindrical housing which would minimize the wall housing effects encountered in the cosine test series. These tests examined the effects of initial clad temperature, variable stepped and continuously variable flooding rates, housing heat release, rod peak power, constant low flooding rates, coolant subcooling, hot and cold channel entrainment, and bundle stored and generated power. Data obtained in runs which met the test specifications are reported here, and include rod clad temperatures, turn around and quench times, heat transfer coefficients, inlet flooding rates, overall mass balances, differential pressures and calculated void fractions in the test section, thimble wall and steam temperatures, and exhaust steam and liquid carryover rates

  1. Hydrologic testing methodology and results from deep basalt boreholes

    International Nuclear Information System (INIS)

    Strait, S.R.; Spane, F.A.; Jackson, R.L.; Pidcoe, W.W.

    1982-05-01

    The objective of the hydrologic field-testing program is to provide data for characterization of the groundwater systems within the Pasco Basin that are significant to understanding waste isolation. The effort is directed toward characterizing the areal and vertical distributions of hydraulic head, hydraulic properties, and hydrochemistry. Data obtained from these studies provide input for numerical modeling of groundwater flow and solute transport. These models are then used for evaluating potential waste migration as a function of space and time. The groundwater system beneath the Hanford Site and surrounding area consists of a thick, accordantly layered sequence of basalt flows and associated sedimentary interbeds that primarily occur in the upper part of the Columbia River basalt. Permeable horizons of the sequence are associated with the interbeds and the interflow zones within the basalt. The columnar interiors of a flow act as low-permeability aquitards, separating the more-permeable interflows or interbeds. This paper discusses the hydrologic field data-gathering activities, specifically the field-testing methodology and test results from deep basalt boreholes

  2. FY17 Status Report on Testing Supporting the Inclusion of Grade 91 Steel as an Acceptable Material for Application of the EPP Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Messner, Mark C. [Argonne National Lab. (ANL), Argonne, IL (United States); Sham, Sam [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Yanli [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    This report summarizes the experiments performed in FY17 on Gr. 91 steels. The testing of Gr. 91 has technical significance because, currently, it is the only approved material for Class A construction that exhibits strong cyclic softening. Specific FY17 testing includes the following activities for Gr. 91 steel. First, two types of key feature testing have been initiated, including two-bar thermal ratcheting and Simplified Model Testing (SMT). The goal is to qualify the Elastic – Perfectly Plastic (EPP) design methodologies and to support incorporation of these rules for Gr. 91 into the ASME Division 5 Code. The preliminary SMT test results show that Gr. 91 is most damaged when tested with a compression hold under the SMT creep-fatigue testing condition. Two-bar thermal ratcheting test results over a temperature range of 350 to 650 °C were compared with the EPP strain limits code case evaluation, and the results show that the EPP strain limits code case is conservative. The material information obtained from these key feature tests can also be used to verify the material model. Second, to provide experimental data in support of the viscoplastic material model development at Argonne National Laboratory, selective tests were performed to evaluate the effect of cyclic softening on strain rate sensitivity and creep rates. The results show that prior cyclic loading history decreases the strain rate sensitivity and increases creep rates. In addition, isothermal cyclic stress-strain curves were generated at six different temperatures, and nonisothermal thermomechanical testing was also performed to provide data to calibrate the viscoplastic material model.

  3. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.
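
    For background, the sketch below shows the sum-of-fractions screening calculation that mixture methodologies such as the CMM build on: each chemical's predicted concentration is divided by its exposure limit and the fractions are summed. The chemical names, concentrations and limits are hypothetical.

```python
# Hedged sketch of a sum-of-fractions mixture screen (hazard-index style).
# All substance names, concentrations and limits below are illustrative.
def hazard_index(concentrations_mg_m3: dict, limits_mg_m3: dict) -> float:
    """Return the mixture hazard index; a value > 1.0 flags a potential exceedance."""
    return sum(c / limits_mg_m3[name] for name, c in concentrations_mg_m3.items())

mixture = {"ammonia": 12.0, "chlorine": 0.4, "toluene": 35.0}   # mg/m^3 (hypothetical)
limits = {"ammonia": 140.0, "chlorine": 2.9, "toluene": 560.0}  # mg/m^3 (hypothetical)
hi = hazard_index(mixture, limits)
print(f"hazard index = {hi:.2f} -> {'exceeds' if hi > 1 else 'below'} screening level")
```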

  4. The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.

    Science.gov (United States)

    Greer, John T.

    Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…

  5. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    Science.gov (United States)

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. This approach enables users to evaluate psychometric tests based on their methodological characteristics, in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  6. The significance of some methodological effects on filtration and ingestion rates of the rotifer Brachionus plicatilis

    Science.gov (United States)

    Schlosser, H. J.; Anger, K.

    1982-06-01

    Filtration rate (F) and ingestion rate (I) were measured in the rotifer Brachionus plicatilis feeding on the flagellate Dunaliella spec. and on yeast cells (Saccharomyces cerevisiae). 60-min experiments in rotating bottles served as a standard for testing methodological effects on levels of F and I. A lack of rotation reduced F values by 40 %, and a rise in temperature from 18° to 23.5 °C increased them by 42 %. Ingestion rates increased significantly up to a particle (yeast) concentration of ca. 600-800 cells · μl-1; then they remained constant, whereas filtration rates decreased beyond this threshold. Rotifer density (up to 1000 ind · ml-1) and previous starvation (up to 40 h) did not significantly influence food uptake rates. The duration of the experiment proved to have the most significant effect on F and I values: in 240-min experiments, these values were on the average more than 90 % lower than in 15-min experiments. From this finding it is concluded that ingestion rates obtained from short-term experiments (60 min or less) cannot be used in energy budgets, because they severely overestimate the actual long-term feeding capacity of the rotifers. At the lower end of the particle size spectrum (2 to 3 µm) there are not only food cells, but apparently also contaminating faecal particles. Their number increased with increasing duration of the experiments and led to an underestimation of F and I. Elemental analyses of rotifers and their food suggest that B. plicatilis can ingest up to 0.6 mJ or ca. 14 % of its own body carbon within 15 min. The long-term average was estimated as 3.4 mJ · ind-1 · d-1 or ca. 75 % of body carbon · d-1.
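
    The standard cell-depletion formulas behind filtration (F) and ingestion (I) rates in grazing experiments of this kind can be written out as below. This is a minimal sketch with illustrative numbers, not the authors' exact computation.

```python
# Hedged sketch of the usual depletion calculation: F = V*ln(C0/Ct)/(N*t) and
# I = F * C_mean, with C_mean the time-averaged concentration. Numbers are illustrative.
import math

def filtration_rate(c0, ct, volume_ul, n_animals, t_hours):
    """Filtration rate in µl per animal per hour from start/end cell counts (cells/µl)."""
    return volume_ul * math.log(c0 / ct) / (n_animals * t_hours)

def ingestion_rate(c0, ct, volume_ul, n_animals, t_hours):
    """Ingestion rate in cells per animal per hour, using the time-averaged concentration."""
    f = filtration_rate(c0, ct, volume_ul, n_animals, t_hours)
    c_mean = (c0 - ct) / math.log(c0 / ct)   # mean concentration under exponential decline
    return f * c_mean

# 1 h bottle experiment: 10 ml (10,000 µl), 100 rotifers, cells fall from 700 to 650 per µl
print(round(filtration_rate(700, 650, 10_000, 100, 1), 2))  # µl per animal per hour
print(round(ingestion_rate(700, 650, 10_000, 100, 1), 1))   # cells per animal per hour
```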

  7. Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology

    International Nuclear Information System (INIS)

    Fuller, R.; Harrell, J.

    1996-01-01

    The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves

  8. Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, R.; Harrell, J.

    1996-12-01

    The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves.

  9. Constant displacement rate testing at elevated temperatures

    International Nuclear Information System (INIS)

    Pepe, J.J.; Gonyea, D.C.

    1989-01-01

    A short-time test has been developed which is capable of determining the long-time notch sensitivity tendencies of CrMoV rotor forging materials. This test is based on Constant Displacement Rate (CDR) testing of a specific notch bar specimen at 1200 °F at a displacement rate of 2 mils/in/hour. These data were correlated to conventional smooth and notch bar rupture behavior for a series of CrMoV materials with varying long-time ductility tendencies. The purpose of this paper is to describe the details of this new test procedure and some of the relevant mechanics of material information generated during its development

  10. Testing methodologies and systems for semiconductor optical amplifiers

    Science.gov (United States)

    Wieckowski, Michael

    Semiconductor optical amplifiers (SOA's) are gaining increased prominence in both optical communication systems and high-speed optical processing systems, due primarily to their unique nonlinear characteristics. This in turn, has raised questions regarding their lifetime performance reliability and has generated a demand for effective testing techniques. This is especially critical for industries utilizing SOA's as components for system-in-package products. It is important to note that very little research to date has been conducted in this area, even though production volume and market demand has continued to increase. In this thesis, the reliability of dilute-mode InP semiconductor optical amplifiers is studied experimentally and theoretically. The aging characteristics of the production level devices are demonstrated and the necessary techniques to accurately characterize them are presented. In addition, this work proposes a new methodology for characterizing the optical performance of these devices using measurements in the electrical domain. It is shown that optical performance degradation, specifically with respect to gain, can be directly qualified through measurements of electrical subthreshold differential resistance. This metric exhibits a linear proportionality to the defect concentration in the active region, and as such, can be used for prescreening devices before employing traditional optical testing methods. A complete theoretical analysis is developed in this work to explain this relationship based upon the device's current-voltage curve and its associated leakage and recombination currents. These results are then extended to realize new techniques for testing semiconductor optical amplifiers and other similarly structured devices. These techniques can be employed after fabrication and during packaged operation through the use of a proposed stand-alone testing system, or using a proposed integrated CMOS self-testing circuit. Both methods are capable

  11. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    Science.gov (United States)

    2018-02-01


  12. 46 CFR 107.260 - Rated load test for cranes.

    Science.gov (United States)

    2010-10-01

    46 CFR Part 107 (Shipping), Inspection and Certification, § 107.260 Rated load test for cranes: (a) To meet the requirements in § 107.231(l), each crane must meet the following rated load test at both the...

  13. Testing Optimum Seeding Rates for five Bread Wheat Cultivars

    International Nuclear Information System (INIS)

    Wekesa, S.J.; Kiriswa, F.; Owuoche, J.

    1999-01-01

    A cultivar by seed rate trial was conducted in the 1994-1995 crop seasons at Njoro, Kenya. Yield results were significant (P<0.01) for year, variety, seed rate and the year by seed rate interaction. Test weight was also highly significant (P<0.01). Seed rates of 245, 205, 165 and 125 kg ha-1 were grouped together for significantly higher yields (A), whereas seed rates of 85 and 50 kg ha-1 had significantly lower yields (B and C, respectively). The same grouping was repeated for test weight. There was no significant cultivar by seed rate interaction and no cultivar-specific seed rate. However, since seed rates 245, 205, 165 and 125 kg ha-1 were grouped together, the lowest seed rate, 125 kg ha-1, can be recommended as the optimum seed rate for the above cultivars, as higher seed rates do not give significantly higher yields or higher test weights

  14. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper a method for testing rate-control algorithms by means of a video source model is suggested. The proposed method makes it possible to significantly improve algorithm testing compared with using a big test set.
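
    A minimal sketch of the idea is given below: a synthetic source model generates per-frame complexity, and a simple feedback rate controller is exercised against it. The source model, the R-Q law and all constants are illustrative assumptions, not the model proposed in the paper.

```python
# Hedged sketch of testing a rate-control loop against a synthetic video source
# model instead of a large set of real sequences. Everything here is illustrative.
import random

def source_model(n_frames, seed=0):
    """Yield per-frame complexity values from a simple autoregressive model."""
    rng = random.Random(seed)
    c = 1.0
    for _ in range(n_frames):
        c = max(0.2, 0.9 * c + 0.1 + rng.gauss(0.0, 0.05))
        yield c

def encode_bits(complexity, qp, k=400_000):
    """Illustrative R-Q law: bits fall hyperbolically with QP and scale with complexity."""
    return k * complexity / qp

def run_rate_control(target_bits=20_000, n_frames=300):
    qp, produced = 25.0, []
    for c in source_model(n_frames):
        bits = encode_bits(c, qp)
        produced.append(bits)
        # integral-style feedback: raise QP when frames overshoot the target, lower it otherwise
        qp = min(51.0, max(1.0, qp + 0.0005 * (bits - target_bits)))
    return sum(produced) / n_frames

print(round(run_rate_control()))  # average bits/frame, should settle near the target
```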

  15. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
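
    As a hedged sketch of a projection-dependent GSD estimate of the kind the abstract refers to (the paper's exact formulation may differ): for an equidistant fisheye, r = f*theta, so each pixel subtends about p/f radians and the footprint at range D is roughly D*p/f, whereas for a rectilinear lens the per-pixel angle shrinks with cos^2(theta). The pixel pitch, focal lengths and distance below are illustrative.

```python
# Hedged sketch of GSD estimation for two optical projections (illustrative values).
import math

def gsd_rectilinear(pixel_pitch_mm, focal_mm, distance_m, theta_deg):
    """Footprint (m) of one pixel at off-axis angle theta for r = f*tan(theta)."""
    t = math.radians(theta_deg)
    return distance_m * pixel_pitch_mm * math.cos(t) ** 2 / focal_mm

def gsd_equidistant_fisheye(pixel_pitch_mm, focal_mm, distance_m):
    """Footprint (m) of one pixel for r = f*theta; independent of the off-axis angle."""
    return distance_m * pixel_pitch_mm / focal_mm

# 0.004 mm pixels, 8 mm fisheye vs 16 mm rectilinear lens, wall 0.7 m away
print(round(1000 * gsd_equidistant_fisheye(0.004, 8.0, 0.7), 3), "mm/px")
print(round(1000 * gsd_rectilinear(0.004, 16.0, 0.7, 60.0), 3), "mm/px")
```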

  16. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    Science.gov (United States)

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  17. K West basin isolation barrier leak rate test

    International Nuclear Information System (INIS)

    Whitehurst, R.; McCracken, K.; Papenfuss, J.N.

    1994-01-01

    This document establishes the procedure for performing the acceptance test on the two isolation barriers being installed in the K West basin. This acceptance test procedure shall be used to: first, establish a basin water loss rate prior to installation of the two isolation barriers between the main basin and the discharge chute in K-Basin West; and second, perform an acceptance test to verify an acceptable leakage rate through the barrier seals

  18. Measuring cognitive task demands using dual task methodology, subjective self-ratings, and expert judgments : A Validation Study

    NARCIS (Netherlands)

    Révész, Andrea; Michel, Marije; Gilabert, Roger

    2016-01-01

    This study explored the usefulness of dual-task methodology, self-ratings, and expert judgements in assessing task-generated cognitive demands as a way to provide validity evidence for manipulations of task complexity. The participants were 96 students and 61 ESL teachers. The students, 48 English

  19. Integrated leak rate test of the FFTF [Fast Flux Test Facility] containment vessel

    International Nuclear Information System (INIS)

    Grygiel, M.L.; Davis, R.H.; Polzin, D.L.; Yule, W.D.

    1987-04-01

    The third integrated leak rate test (ILRT) performed at the Fast Flux Test Facility (FFTF) demonstrated that effective leak rate measurements could be obtained at a pressure of 2 psig. In addition, innovative data reduction methods demonstrated the ability to accurately account for diurnal variations in containment pressure and temperature. Further development of the methods used in this test indicates significant savings in the time and effort required to perform an ILRT on liquid metal reactor systems, with a consequent reduction in test costs
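
    For context, the sketch below shows a common "mass point" reduction used for integrated leak rate tests: the contained air mass is computed from averaged pressure and temperature via the ideal gas law, and the leak rate in weight-percent per day comes from a least-squares fit of mass against time. The pressure, temperature and free-volume values are illustrative, not FFTF data.

```python
# Hedged sketch of a mass-point ILRT data reduction with illustrative readings.
import numpy as np

R_AIR = 287.05  # J/(kg*K)

def air_mass(p_pa, t_k, free_volume_m3):
    return p_pa * free_volume_m3 / (R_AIR * t_k)

hours = np.array([0.0, 4.0, 8.0, 12.0, 16.0, 20.0, 24.0])
p_kpa = np.array([115.00, 114.99, 114.97, 114.96, 114.95, 114.93, 114.92])
t_k = np.array([300.10, 300.12, 300.11, 300.13, 300.12, 300.14, 300.13])

mass = air_mass(p_kpa * 1e3, t_k, free_volume_m3=60_000.0)
slope, intercept = np.polyfit(hours, mass, 1)          # kg per hour, kg at t = 0
leak_rate_wt_pct_per_day = -slope * 24.0 / intercept * 100.0
print(round(leak_rate_wt_pct_per_day, 3), "wt%/day")
```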

  20. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    Science.gov (United States)

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms.

  1. Safe transport of radioactive materials - Leakage testing on packages. 1. ed.

    International Nuclear Information System (INIS)

    1996-01-01

    This International Standard describes a method for relating permissible activity release rates of the radioactive contents carried within a containment system to equivalent gas leakage rates under specified test conditions. This approach is called gas leakage test methodology. However, in this International Standard it is recognized that other methodologies might be acceptable. When other methodologies are to be used, it shall be shown that the methodology demonstrates that any release of the radioactive contents will not exceed the regulatory requirements. The use of any alternative methodology shall be by agreement with the competent authority. This International Standard provides both overall and detailed guidance on the complex relationships between an equivalent gas leakage test and a permissible activity release rate. Whereas the overall guidance is universally agreed upon, the use of the detailed guidance shall be agreed upon with the competent authority during the Type B package certification process. It should be noted that, for a given package, demonstration of compliance is not limited to a single methodology. While this International Standard does not require particular gas leakage test procedures, it does present minimum requirements for any test that is to be used. It is the responsibility of the package designer or consignor to estimate or determine the maximum permissible release rate of radioactivity to the environment and to select appropriate leakage test procedures that have adequate sensitivity. This International Standard pertains specifically to Type B packages for which the regulatory containment requirements are specified explicitly

  2. Increasing the competitiveness of maintenance contract rates by using an alternative methodology for the calculation of average vehicle maintenance costs

    Directory of Open Access Journals (Sweden)

    Stephen Carstens

    2008-11-01

    Companies tend to outsource transport to fleet management companies to increase efficiencies if transport is a non-core activity. The provision of fleet management services on contract introduces a certain amount of financial risk to the fleet management company, specifically for fixed-rate maintenance contracts. The quoted rate needs to be sufficient and also competitive in the market. Currently the quoted maintenance rates are based on the maintenance specifications of the manufacturer and the risk management approach of the fleet management company. This is usually reflected in a contingency that is included in the quoted maintenance rate. An alternative methodology for calculating the average maintenance cost for a vehicle fleet is proposed, based on the actual maintenance expenditures of the vehicles and accepted statistical techniques. The proposed methodology results in accurate estimates (and associated confidence limits) of the true average maintenance cost and can be used as a basis for the maintenance quote.
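
    The statistical core of such an approach can be sketched as follows: estimate the fleet's mean maintenance cost per kilometre from actual expenditures and attach a t-based confidence interval, rather than adding a flat contingency. The fleet data and the critical value below are hypothetical.

```python
# Hedged sketch: mean maintenance cost per km with a ~95% confidence interval.
import math
from statistics import mean, stdev

def cost_rate_confidence_interval(costs_per_km, t_crit=2.064):
    """Mean cost/km with a ~95% CI (t_crit assumes n = 25, i.e. 24 degrees of freedom)."""
    m, s, n = mean(costs_per_km), stdev(costs_per_km), len(costs_per_km)
    half_width = t_crit * s / math.sqrt(n)
    return m, m - half_width, m + half_width

fleet = [0.42, 0.39, 0.47, 0.51, 0.36, 0.44, 0.40, 0.48, 0.45, 0.38,
         0.43, 0.41, 0.50, 0.37, 0.46, 0.44, 0.39, 0.42, 0.49, 0.40,
         0.43, 0.45, 0.41, 0.47, 0.38]  # cost per km for 25 vehicles (hypothetical)
m, lo, hi = cost_rate_confidence_interval(fleet)
print(f"mean {m:.3f} per km, 95% CI [{lo:.3f}, {hi:.3f}]")
```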

  3. Test methodology and technology of fracture toughness for small size specimens

    Energy Technology Data Exchange (ETDEWEB)

    Wakai, E.; Takada, F.; Ishii, T.; Ando, M. [Japan Atomic Energy Agency, Naga-gun, Ibaraki-ken (Japan); Matsukawa, S. [JNE Techno-Research Co., Kanagawa-ken (Japan)

    2007-07-01

    Small specimen test technology (SSTT) is required to investigate mechanical properties given the limited availability of effective irradiation volumes in test reactors and accelerator-based neutron and charged particle sources. The test methodology guidelines and the manufacturing processes for very small specimens have not been established and need to be formulated. Technology to control the load and displacement exactly is also required under the high-dose radiation environment produced by the specimens. The objective of this study is to examine the test technology and methodology of fracture toughness for very small specimens. A new bend test machine installed in a hot cell has been manufactured to obtain the fracture toughness and DBTT (ductile-brittle transition temperature) of reduced-activation ferritic/martensitic steels for small bend specimens of t/2-1/3PCCVN (pre-cracked 1/3 size Charpy V-notch) with 20 mm length and DFMB (deformation and fracture mini bend specimen) with 9 mm length. The new machine can perform tests at temperatures from -196 deg. C to 400 deg. C using the unloading compliance method. Neutron irradiation was also performed at about 250 deg. C to about 2 dpa in JMTR. After the irradiation, fracture toughness and DBTT were examined using the machine. The displacement measurement was checked carefully by comparing the linear gauge reading of the crosshead displacement with the DVRT reading of the specimen displacement. The conditions for fatigue pre-cracking during specimen preparation were also examined and were found to depend on the shape and size of the specimens. Fracture toughness and DBTT of F82H steel for t/2-1/3PCCVN, DFMB and 0.18DCT specimens before irradiation were examined as a function of temperature. The DBTT of the smaller DFMB specimens was lower than that of the larger t/2-1/3PCCVN and 0.18DCT specimens. The changes of fracture toughness and DBTT due to irradiation were also

  4. Development and testing of the circumvaginal muscles rating scale.

    Science.gov (United States)

    Worth, A M; Dougherty, M C; McKey, P L

    1986-01-01

    The purpose of this research was to develop an instrument for clinical assessment of the circumvaginal muscles (CVM), to test the reliability of the instrument, and to correlate sample characteristics with this instrument. The 9-point CVM Rating Scale is based on four components: pressure, duration, muscle ribbing, and position of the examiner's finger during examination. Reliability of the CVM Rating Scale was ascertained by use of interrater and test-retest reliability. Interrater reliability was tested on two separate occasions, N = 10, rho = 0.6, p less than .04; N = 10, rho = 0.7, p less than .05. A test-retest sequence was conducted 10 days apart, N = 10, rho = 0.9, p less than .003. Results from these tests indicated that the CVM Rating Scale is a reliable instrument for assessing CVM. A convenience sample of 30 women, aged 18-37, in good general health was tested, using the CVM Rating Scale. Women with a history of pelvic floor reconstructive surgery were excluded. A significant positive correlation between self-reported orgasm and the CVM Rating Scale total scores was found, chi-square = 7.5, p less than .02. No significant correlations were found between age, race, parity, episiotomy, or self-reported Kegel exercises and the CVM Rating Scale total scores. The scale is a cost-effective, time-efficient, systematic assessment, accessible in clinical settings.

  5. Containment Leakage Rate Testing Program in NPP Krsko

    International Nuclear Information System (INIS)

    Dudas, M.; Heruc, Z.

    2002-01-01

    NPP Krsko adopted new regulations for testing of the reactor building containment as stipulated by 10CFR50 (Code of Federal Regulations) Appendix J, Option B, instead of the previous requirement, 10CFR50 Appendix J, now renamed 10CFR50 Appendix J, Option A. In the USA, a thorough analysis of nuclear power plant reactor building containment testing was conducted. As part of these analyses, the test results obtained from testing of various reactor building containments over the previous ten years were reviewed. It was concluded that it would be meaningful, based on historical test-result data, to reconsider the possibility of redefining testing intervals. The official proposal of such an approach was reviewed and approved by the NRC and published in September of 1995 in the FR Vol. 60 No. 186. Based on directions from 10CFR50 Appendix J, Option B, new criteria for the definition of test intervals were created. The criteria were based upon past performance during testing (performance-based requirements) and safety impact. At NPP Krsko, an analysis of the Reactor Building Containment Integrity Test results was performed. This included test results of the Containment Integrated Leak Rate Testing (CILRT or Type A tests), Containment Isolation Valves Local Leak Rate Tests (Type C tests) and Mechanical and Electrical Penetrations Local Leak Rate Tests (Type B tests). In accordance with instructions from NEI 94-01 and based on analyses of the test results, NPP Krsko created a Containment Leakage Rate Testing Program with the purpose of establishing the performance-based definition of test intervals, inspection scope, trending and reporting. Equally, the program gives instructions on how to evaluate test results and how to deal with the contingency of containment penetration or isolation valve repair. All changes caused by the transition from Option A to Option B have a marginal effect on public safety. (author)

  6. Accelerated Testing Methodology for the Determination of Slow Crack Growth of Advanced Ceramics

    Science.gov (United States)

    Choi, Sung R.; Salem, Jonathan A.; Gyekenyesi, John P.

    1997-01-01

    Constant stress-rate (dynamic fatigue) testing has been used for several decades to characterize slow crack growth behavior of glass and ceramics at both ambient and elevated temperatures. The advantage of constant stress-rate testing over other methods lies in its simplicity: strengths are measured in a routine manner at four or more stress rates by applying a constant crosshead speed or constant loading rate. The slow crack growth parameters (n and A) required for design can be estimated from a relationship between strength and stress rate. With the proper use of preloading in constant stress-rate testing, an appreciable saving of test time can be achieved. If a preload corresponding to 50 % of the strength is applied to the specimen prior to testing, 50 % of the test time can be saved as long as the strength remains unchanged regardless of the applied preload. In fact, it has been a common, empirical practice in strength testing of ceramics or optical fibers to apply some preloading (less than 40 %). The purpose of this work is to study the effect of preloading on the strength in order to lay a theoretical foundation for this empirical practice. For this purpose, analytical and numerical solutions of strength as a function of preloading were developed. To verify the solutions, constant stress-rate testing using glass and alumina at room temperature and alumina, silicon nitride, and silicon carbide at elevated temperatures was conducted over a range of preloads from 0 to 90 %.
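
    A minimal sketch of the strength versus stress-rate fit that constant stress-rate testing supports, together with the preload time saving, is given below. The strength data, and therefore the fitted exponent, are illustrative, not results from the study.

```python
# Hedged sketch: estimate the slow crack growth exponent n from the slope of
# log(strength) vs log(stress rate), using slope = 1/(n+1). Data are illustrative.
import numpy as np

stress_rates = np.array([0.1, 1.0, 10.0, 100.0])    # MPa/s
strengths = np.array([310.0, 335.0, 362.0, 391.0])  # MPa (hypothetical)

slope, intercept = np.polyfit(np.log10(stress_rates), np.log10(strengths), 1)
n = 1.0 / slope - 1.0
print(f"slope = {slope:.4f}, estimated SCG exponent n = {n:.1f}")

# A preload equal to a fraction alpha of the expected strength shortens the ramp
# by about that fraction, provided the measured strength is unchanged.
alpha = 0.5
print(f"approximate test-time saving with {alpha:.0%} preload: {alpha:.0%}")
```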

  7. Strength of wood versus rate of testing - A theoretical approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    2007-01-01

    Strength of wood is normally measured in ramp load experiments. Experience shows that strength increases with increasing rate of testing. This feature is considered theoretically in this paper. It is shown that the influence of testing rate is a phenomenon which depends on the quality of the considered wood. Low quality wood shows a lesser influence of testing rate. This observation agrees with the well-known statement made by Borg Madsen that weak wood subjected to a constant load has a longer lifetime than strong wood. In general, the influence of testing rate on strength increases...

  8. Improvement of test methodology for evaluating diesel fuel stability

    Energy Technology Data Exchange (ETDEWEB)

    Gutman, M.; Tartakovsky, L.; Kirzhner, Y.; Zvirin, Y. [Internal Combustion Engines Lab., Haifa (Israel); Luria, D. [Fuel Authority, Tel Aviv (Israel); Weiss, A.; Shuftan, M. [Israel Defence Forces, Tel Aviv (Israel)

    1995-05-01

    The storage stability of diesel fuel has been extensively investigated for many years under laboratory conditions. Although continuous efforts have been made to improve testing techniques, there does not yet exist a generally accepted correlation between laboratory methods (such as chemical analysis of the fuel) and actual diesel engine tests. A testing method was developed by the Technion Internal Combustion Engines Laboratory (TICEL) in order to address this problem. The test procedure was designed to simulate diesel engine operation under field conditions. It is based on running a laboratory-modified single-cylinder diesel engine for 50 h under cycling operating conditions. The overall rating of each test is based on individual evaluation of the deposits and residue formation in the fuel filter, nozzle body and needle, piston head, piston rings, exhaust valve, and combustion chamber (six parameters). Two methods for analyzing the test results were used: objective, based on measured data, and subjective, based on visual evaluation of these deposits by a group of experts. Only the residue level in the fuel filter was evaluated quantitatively from measured results. In order to achieve higher accuracy of the method, the test procedure was improved by introducing the measured results of nozzle fouling as an additional objective (seventh) evaluation parameter. This factor is evaluated on the basis of the change in the air flow rate through the nozzle before and after the complete engine test. Other improvements in the method include the use of the nozzle assembly photograph in the test evaluation, and representation of all seven parameters on a continuous scale instead of the discrete scale used previously, in order to achieve higher accuracy. This paper also contains the results obtained by application of this improved fuel stability test to a diesel fuel stored for a five-year period.

  9. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. The leakage localisation methodology is based on a pressure sensitivity matrix. The sensitivity is normalised and binarised using a common threshold for all nodes, so a signature matrix is obtained. An optimal pressure sensor distribution methodology is developed as well, but it is not used in the real test. To validate this...
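
    A hedged sketch of the signature-matching step follows: binarise the sensitivity matrix with a common threshold, binarise the measured pressure residuals in the same way, and pick the candidate leak node whose signature agrees best. The matrix and residual values are illustrative, not from the real network test.

```python
# Hedged sketch of leak localisation by binarised sensitivity signatures.
import numpy as np

# rows = pressure sensors, columns = candidate leak nodes (illustrative values)
sensitivity = np.array([[0.9, 0.1, 0.4],
                        [0.2, 0.8, 0.5],
                        [0.7, 0.3, 0.9]])
threshold = 0.45
signatures = (sensitivity >= threshold).astype(int)

residuals = np.array([0.6, 0.1, 0.5])            # measured minus simulated pressures
observed = (np.abs(residuals) >= threshold).astype(int)

matches = (signatures == observed[:, None]).sum(axis=0)   # agreement count per candidate node
print("best leak candidate: node", int(np.argmax(matches)))
```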

  10. Accelerated testing for cosmic soft-error rate

    International Nuclear Information System (INIS)

    Ziegler, J.F.; Muhlfeld, H.P.; Montrose, C.J.; Curtis, H.W.; O'Gorman, T.J.; Ross, J.M.

    1996-01-01

    This paper describes the experimental techniques which have been developed at IBM to determine the sensitivity of electronic circuits to cosmic rays at sea level. It relates IBM circuit design and modeling, chip manufacture with process variations, and chip testing for SER sensitivity. This vertical integration from design to final test and with feedback to design allows a complete picture of LSI sensitivity to cosmic rays. Since advanced computers are designed with LSI chips long before the chips have been fabricated, and the system architecture is fully formed before the first chips are functional, it is essential to establish the chip reliability as early as possible. This paper establishes techniques to test chips that are only partly functional (e.g., only 1Mb of a 16Mb memory may be working) and can establish chip soft-error upset rates before final chip manufacturing begins. Simple relationships derived from measurement of more than 80 different chips manufactured over 20 years allow total cosmic soft-error rate (SER) to be estimated after only limited testing. Comparisons between these accelerated test results and similar tests determined by "field testing" (which may require a year or more of testing after manufacturing begins) show that the experimental techniques are accurate to a factor of 2
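
    For orientation, the sketch below shows the basic accelerated-to-field scaling that such testing relies on: a per-device upset cross-section measured under a high beam fluence is multiplied by the natural sea-level neutron flux. The counts, fluence and flux value are illustrative assumptions; the paper's own scaling relationships are more detailed.

```python
# Hedged sketch of scaling an accelerated-beam measurement to a field soft-error rate.
def ser_fit(upsets, beam_fluence_n_cm2, sea_level_flux_n_cm2_h=13.0):
    """Return soft-error rate in FIT (failures per 1e9 device-hours)."""
    cross_section_cm2 = upsets / beam_fluence_n_cm2        # per-device upset cross-section
    return cross_section_cm2 * sea_level_flux_n_cm2_h * 1e9

# 120 upsets observed after an accelerated exposure of 1e10 n/cm^2 (illustrative)
print(round(ser_fit(120, 1e10)), "FIT/device")
```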

  11. Leak rate test of containment personnel lock

    International Nuclear Information System (INIS)

    Julien, J.T.; Peters, S.W.

    1988-01-01

    As part of the US NRC Containment Integrity Program, a leak rate test was performed on a full-size personnel airlock for a nuclear containment building. The airlock was subjected to conditions simulating severe accident conditions. The objective of the test was to characterize the performance of airlock door seals when subjected to conditions that exceeded design. The seals tested were of a double dog-ear configuration and made from EPDM E603. The data obtained from this test will be used by SNL as a benchmark for development of analytical methods. In addition to leak rate information, strain, temperature, displacements, and pressure data were measured and recorded from over 330 transducers. The test lasted approximately 60 hours. Data were recorded at regular intervals and during heating, pressurization and depressurization. The inner airlock door and bulkhead were exposed to a maximum air temperature of 850 °F and a maximum air pressure of 300 psig. The airlock was originally designed for 340 °F and 60 psig. Two heating and pressurization cycles were planned: one to heat to 400 °F and pressurize to 300 psig, and the second to heat to 800 °F and pressurize to 300 psig. No significant leakage was recorded during these two cycles. A third cycle was added to the test program. The air temperature was increased to 850 °F and held at this temperature for approximately 10 hours. The inner door seal failed quickly at a pressure of 150.5 psig. The maximum leak rate was 706 SCFM

  12. Measuring Cognitive Task Demands Using Dual-Task Methodology, Subjective Self-Ratings, and Expert Judgments: A Validation Study

    Science.gov (United States)

    Revesz, Andrea; Michel, Marije; Gilabert, Roger

    2016-01-01

    This study explored the usefulness of dual-task methodology, self-ratings, and expert judgments in assessing task-generated cognitive demands as a way to provide validity evidence for manipulations of task complexity. The participants were 96 students and 61 English as a second language (ESL) teachers. The students, 48 English native speakers and…

  13. Methodology for determination of radon-222 production rate of residential building and experimental verification

    International Nuclear Information System (INIS)

    Tung, Thomas C.W.; Niu, J.L.; Burnett, J.; Lau, Judy O.W.

    2005-01-01

    Indoor radon concentration is mainly associated with the radon production rate of the building material, the ventilation rate, and the outdoor radon concentration. The radon production rate of a room is defined as the sum of the products of the radon emanation rates and the exposed areas of the materials. Since the selection of building materials and the exposed areas differ from room to room, the radon production rates of homes fall within a wide range. Here, the radon production rate of a room is quantified by a sealing method, in which a systematic radon growth curve is obtained. The radon production rate of the room can be determined from the initial slope of the growth curve. Three rooms at different homes in Hong Kong were selected for the study to verify the methodology. The uncertainty characterized by data scatter arising from the coupling effect of the leakage rate and outdoor radon is also included in the discussion. During the measurements, no occupant was allowed into the home. No mechanical ventilation was involved in the measurement. The indoor and outdoor radon concentrations of the sampled homes were monitored simultaneously for more than three days. The radon production rates of the three rooms at Homes 1, 2, and 3 were found to be 232.8, 46.0, and 414.6 Bq h-1, with uncertainties of 20.3, 9.4, and 59.2 Bq h-1, respectively. The approach is valid when the air leakage rate of the room is controlled below 0.1 h-1
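
    The initial-slope extraction can be sketched as follows: while the sealed room's radon concentration is still in its near-linear growth phase, dC/dt is approximately P/V, so the production rate is roughly the room volume times the fitted initial slope. The room volume and concentration readings below are illustrative, not measurements from the study.

```python
# Hedged sketch: radon production rate from the initial slope of a sealed-room growth curve.
import numpy as np

volume_m3 = 30.0
hours = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
radon_bq_m3 = np.array([18.0, 33.0, 47.0, 62.0, 76.0])   # early, near-linear part only

slope_bq_m3_per_h = np.polyfit(hours, radon_bq_m3, 1)[0]
production_rate_bq_per_h = volume_m3 * slope_bq_m3_per_h
print(round(production_rate_bq_per_h, 1), "Bq/h")
```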

  14. Accelerated Testing Methodology Developed for Determining the Slow Crack Growth of Advanced Ceramics

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.

    1998-01-01

    Constant stress-rate ("dynamic fatigue") testing has been used for several decades to characterize the slow crack growth behavior of glass and structural ceramics at both ambient and elevated temperatures. The advantage of such testing over other methods lies in its simplicity: strengths are measured in a routine manner at four or more stress rates by applying a constant displacement or loading rate. The slow crack growth parameters required for component design can be estimated from a relationship between strength and stress rate. With the proper use of preloading in constant stress-rate testing, test time can be reduced appreciably. If a preload corresponding to 50 percent of the strength is applied to the specimen prior to testing, 50 percent of the test time can be saved as long as the applied preload does not change the strength. In fact, it has been a common, empirical practice in the strength testing of ceramics or optical fibers to apply some preloading (<40 percent). The purpose of this work at the NASA Lewis Research Center is to study the effect of preloading on measured strength in order to add a theoretical foundation to the empirical practice.

  15. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect

  16. Integrated leak rate test results of JOYO reactor containment vessel

    International Nuclear Information System (INIS)

    Tamura, M.; Endo, J.

    1982-02-01

    Integrated leak rate tests of JOYO after the reactor coolant system had been filled with sodium have been performed twice since 1978 (February 1978 and December 1979). The tests were conducted with the in-containment sodium systems, the primary argon cover gas system and the air conditioning systems operating. Both the absolute pressure method and the reference chamber method were employed during the tests. The results of both tests confirmed the functioning of the containment vessel, and leak rate limits were satisfied. In addition, the adequacy of the test instrumentation system and the test method was demonstrated. Finally, the plant conditions required to maintain reasonable accuracy for the leak rate testing of an LMFBR were established. In this paper, the test conditions and the test results are described. (author)

  17. 77 FR 75896 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2013

    Science.gov (United States)

    2012-12-26

    ...-11213, Notice No. 16] Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2013...., Washington, DC 20590, (telephone 202-493- 1342); or Kathy Schnakenberg, FRA Alcohol/Drug Program Specialist... from FRA's Management Information System, the rail industry's random drug testing positive rate has...

  18. 75 FR 79308 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011

    Science.gov (United States)

    2010-12-20

    ...-11213, Notice No. 14] Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011... random testing positive rates were .037 percent for drugs and .014 percent for alcohol. Because the... effective December 20, 2010. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  19. Test Methodology Development for Experimental Structural Assessment of ASC Planar Spring Material for Long-Term Durability

    Science.gov (United States)

    Yun, Gunjin; Abdullah, A. B. M.; Binienda, Wieslaw; Krause, David L.; Kalluri, Sreeramesh

    2014-01-01

    A vibration-based testing methodology has been developed that will assess the fatigue behavior of the metallic material of construction for the Advanced Stirling Convertor displacer (planar) spring component. To minimize the testing duration, the test setup is designed for base-excitation of a multiple-specimen arrangement, driven in a high-frequency resonant mode; this allows completion of fatigue testing in an accelerated period. A high-performance electro-dynamic exciter (shaker) is used to generate harmonic oscillation of cantilever beam specimens, which are clasped on the shaker armature with specially designed clamp fixtures. The shaker operates in closed-loop control with dynamic specimen response feedback provided by a scanning laser vibrometer. A test coordinator function synchronizes the shaker controller and the laser vibrometer to complete the closed-loop scheme. The test coordinator also monitors the structural health of the test specimens throughout the test period, recognizing any change in specimen dynamic behavior. As this may be due to fatigue crack initiation, the test coordinator terminates test progression and then acquires test data in an orderly manner. Design of the specimen and fixture geometry was completed by finite element analysis such that peak stress does not occur at the clamping fixture attachment points. Experimental stress evaluation was conducted to verify the specimen stress predictions. A successful application of the experimental methodology was demonstrated by validation tests with carbon steel specimens subjected to fully-reversed bending stress; high-cycle fatigue failures were induced in such specimens using higher-than-prototypical stresses.

  20. The effect of instructional methodology on high school students natural sciences standardized tests scores

    Science.gov (United States)

    Powell, P. E.

    Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study was comprised of two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated that all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater understanding of science content, meeting the school's mission and goals.

  1. Efficient field testing for load rating railroad bridges

    Science.gov (United States)

    Schulz, Jeffrey L.; Commander, Brett C.

    1995-06-01

    As the condition of our infrastructure continues to deteriorate, and the loads carried by our bridges continue to increase, an ever-growing number of railroad and highway bridges require load limits. With safety and transportation costs at both ends of the spectrum, the need for accurate load rating is paramount. This paper describes a method that has been developed for efficient load testing and evaluation of short- and medium-span bridges. Through the use of a specially designed structural testing system and efficient load test procedures, a typical bridge can be instrumented and tested at 64 points in less than one working day and with minimum impact on rail traffic. Various techniques are available to evaluate structural properties and obtain a realistic model. With field data, a simple finite element model is 'calibrated' and its accuracy is verified. Appropriate design and rating loads are applied to the resulting model and stress predictions are made. This technique has been performed on numerous structures to address specific problems and to provide accurate load ratings. The merits and limitations of this approach are discussed in the context of actual examples of both rail and highway bridges that were tested and evaluated.

  2. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    Science.gov (United States)

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal to noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved compared with the sum of selected IMFs approach. The application of the minimisation algorithm to the EEMD processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal to noise ratio. This methodology was further employed for successful imaging of defects in a B-scan.
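
    A hedged sketch of the EEMD-plus-IMF-selection idea is given below, assuming the open-source PyEMD package (pip package EMD-signal). The synthetic A-scan, the chosen IMF range and the SNR definition are illustrative and do not reproduce the paper's minimisation algorithm.

```python
# Hedged sketch: decompose a noisy ultrasonic A-scan with EEMD, reconstruct it
# from a subset of IMFs, and compare signal-to-noise ratios. Illustrative only.
import numpy as np
from PyEMD import EEMD   # assumption: the EMD-signal (PyEMD) package is installed

fs = 25.0e6                                   # 25 MHz sampling rate (assumed)
t = np.arange(0, 40e-6, 1.0 / fs)
echo = np.exp(-((t - 20e-6) / 1.5e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
noisy = echo + 0.6 * np.random.default_rng(0).standard_normal(t.size)

imfs = EEMD(trials=50).eemd(noisy)            # rows are IMFs, fastest oscillations first
reconstructed = imfs[1:4].sum(axis=0)         # keep a mid-band subset (illustrative choice)

def snr_db(reference, estimate):
    noise = estimate - reference
    return 10 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

print(f"SNR before: {snr_db(echo, noisy):5.1f} dB, after: {snr_db(echo, reconstructed):5.1f} dB")
```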

  3. Integral leakage rate tests of containments

    International Nuclear Information System (INIS)

    Engel, M.; Siefart, E.; Walter, R.

    1978-01-01

    A method is presented for the integral leakage rate tests of containments. This method, used in conjunction with statistical methods, provides reliable information on the tightness of the containment. This method forms the basis of DIN 25436/KTA 3405. (orig.) [de

  4. Leach test methodology for the Waste/Rock Interactions Technology Program

    International Nuclear Information System (INIS)

    Bradley, D.J.; McVay, G.L.; Coles, D.G.

    1980-05-01

    Experimental leach studies in the WRIT Program have two primary functions. The first is to determine radionuclide release from waste forms in laboratory environments which attempt to simulate repository conditions. The second is to elucidate leach mechanisms which can ultimately be incorporated into near-field transport models. The tests have been utilized to generate rates of removal of elements from various waste forms and to provide specimens for surface analysis. Correlation between constituents released to the solution and corresponding solid state profiles is invaluable in the development of a leach mechanism. Several test methods are employed in our studies which simulate various proposed leach incident scenarios. Static tests include low temperature (below 100 °C) and high temperature (above 100 °C) hydrothermal tests. These tests reproduce non-flow or low-flow repository conditions and can be used to compare materials and leach solution effects. The dynamic tests include single-pass, continuous-flow (SPCF) and solution-change (IAEA)-type tests in which the leach solutions are changed at specific time intervals. These tests simulate repository conditions of higher flow rates and can also be used to compare materials and leach solution effects under dynamic conditions. The modified IAEA test is somewhat simpler to use than the one-pass flow test and gives adequate results for comparative purposes. The static leach test models the condition of near-zero flow in a repository and provides information on element readsorption and solubility limits. The SPCF test is used to study the effects of flowing solutions at velocities that may be anticipated for geologic groundwaters within breached repositories. These two testing methods, coupled with the use of autoclaves, constitute the current thrust of WRIT leach testing.

  5. Effects of health information exchange adoption on ambulatory testing rates.

    Science.gov (United States)

    Ross, Stephen E; Radcliff, Tiffany A; Leblanc, William G; Dickinson, L Miriam; Libby, Anne M; Nease, Donald E

    2013-01-01

    To determine the effects of the adoption of ambulatory electronic health information exchange (HIE) on rates of laboratory and radiology testing and allowable charges. Claims data from the dominant health plan in Mesa County, Colorado, from 1 April 2005 to 31 December 2010 were matched to HIE adoption data at the provider level. Using mixed effects regression models with the quarter as the unit of analysis, the effect of HIE adoption on testing rates and associated charges was assessed. Claims submitted by 306 providers in 69 practices for 34 818 patients were analyzed. The rate of testing per provider was expressed as tests per 1000 patients per quarter. For primary care providers, the rate of laboratory testing increased over the time span (baseline 1041 tests/1000 patients/quarter, increasing by 13.9 each quarter) and shifted downward with HIE adoption (downward shift of 83 tests/1000 patients/quarter). No significant changes were observed in other testing rates or imputed charges in either provider group. Ambulatory HIE adoption is unlikely to produce significant direct savings through reductions in rates of testing. The economic benefits of HIE may reside instead in other downstream outcomes of better informed, higher quality care.

  6. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    Directory of Open Access Journals (Sweden)

    Thangam Chinnadurai

    2016-12-01

    Full Text Available This study focuses on investigating the effects of process parameters, namely, Peak current (Ip), Pulse on time (Ton), Pulse off time (Toff), Water pressure (Wp), Wire feed rate (Wf), Wire tension (Wt), Servo voltage (Sv) and Servo feed setting (Sfs), on the Material Removal Rate (MRR) and Surface Roughness (SR) for Wire electrical discharge machining (Wire-EDM) of nickel using Taguchi method. Response Surface Methodology (RSM) is adopted to evolve mathematical relationships between the wire cutting process parameters and the output variables of the weld joint to determine the welding input parameters that lead to the desired optimal wire cutting quality. Besides, using response surface plots, the interaction effects of process parameters on the responses are analyzed and discussed. The statistical software Mini-tab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of cutting parameter being used.

  7. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    International Nuclear Information System (INIS)

    Chinnadurai, T.; Vendan, S.A.

    2016-01-01

    This study focuses on investigating the effects of process parameters, namely, Peak current (Ip), Pulse on time (Ton), Pulse off time (Toff), Water pressure (Wp), Wire feed rate (Wf), Wire tension (Wt), Servo voltage (Sv) and Servo feed setting (Sfs), on the Material Removal Rate (MRR) and Surface Roughness (SR) for Wire electrical discharge machining (Wire-EDM) of nickel using Taguchi method. Response Surface Methodology (RSM) is adopted to evolve mathematical relationships between the wire cutting process parameters and the output variables of the weld joint to determine the welding input parameters that lead to the desired optimal wire cutting quality. Besides, using response surface plots, the interaction effects of process parameters on the responses are analyzed and discussed. The statistical software Mini-tab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of cutting parameter being used. (Author)

  8. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Chinnadurai, T.; Vendan, S.A.

    2016-07-01

    This study focuses on investigating the effects of process parameters, namely, Peak current (Ip), Pulse on time (Ton), Pulse off time (Toff), Water pressure (Wp), Wire feed rate (Wf), Wire tension (Wt), Servo voltage (Sv) and Servo feed setting (Sfs), on the Material Removal Rate (MRR) and Surface Roughness (SR) for Wire electrical discharge machining (Wire-EDM) of nickel using Taguchi method. Response Surface Methodology (RSM) is adopted to evolve mathematical relationships between the wire cutting process parameters and the output variables of the weld joint to determine the welding input parameters that lead to the desired optimal wire cutting quality. Besides, using response surface plots, the interaction effects of process parameters on the responses are analyzed and discussed. The statistical software Mini-tab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of cutting parameter being used. (Author)
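
    The response-surface step in the three records above amounts to fitting a second-order polynomial in the coded process factors and checking it with ANOVA. Below is a minimal sketch of that regression for two of the factors named in the abstract (Ip and Ton); the design points and MRR values are placeholders, and scikit-learn stands in for the Minitab workflow described by the authors.

      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression

      # Coded levels (-1, 0, +1) for peak current (Ip) and pulse on time (Ton)
      # over a 3x3 grid; the MRR responses (mm^3/min) are hypothetical.
      X = np.array([[-1, -1], [0, -1], [1, -1],
                    [-1,  0], [0,  0], [1,  0],
                    [-1,  1], [0,  1], [1,  1]], dtype=float)
      mrr = np.array([1.6, 2.1, 2.5, 1.9, 2.4, 2.9, 2.0, 2.6, 3.2])

      # Full second-order (linear + interaction + quadratic) response surface.
      poly = PolynomialFeatures(degree=2, include_bias=True)
      model = LinearRegression(fit_intercept=False).fit(poly.fit_transform(X), mrr)
      terms = poly.get_feature_names_out(["Ip", "Ton"])
      print(dict(zip(terms, model.coef_.round(3))))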

  9. Evaluation of constraint methodologies applied to a shallow-flaw cruciform bend specimen tested under biaxial loading conditions

    International Nuclear Information System (INIS)

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1998-01-01

    A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on the fracture initiation toughness of two-dimensional (constant depth), shallow surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on assessment of stress-based methodologies, namely, the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading biaxiality on fracture toughness, while the conventional maximum principal stress criterion indicated no effect.
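
    For reference, the Weibull approach mentioned above is usually built on a Beremin-type Weibull stress, i.e. a weighted integral of the maximum principal stress over the plastically deformed volume. The generic form under standard notation (reference volume V_0, Weibull modulus m) is

      \sigma_w \;=\; \left[ \frac{1}{V_0} \int_{V_{\mathrm{pl}}} \sigma_1^{\,m} \, dV \right]^{1/m}

    and cleavage failure probability is then commonly modelled as P_f = 1 - exp[-(\sigma_w/\sigma_u)^m], so loading biaxiality enters through its effect on the sampled principal-stress field. The specific formulation and calibration used in the ORNL evaluation may differ from this generic sketch.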

  10. Find-rate methodology and resource base estimates of the Hydrocarbon Supply Model (1990 update). Topical report

    International Nuclear Information System (INIS)

    Woods, T.

    1991-02-01

    The Hydrocarbon Supply Model is used to develop long-term trends in Lower-48 gas production and costs. The model utilizes historical find-rate patterns to predict the discovery rate and size distribution of future oil and gas field discoveries. The report documents the methodologies used to quantify historical oil and gas field find-rates and to project those discovery patterns for future drilling. It also explains the theoretical foundations for the find-rate approach. The new field and reserve growth resource base is documented and compared to other published estimates. The report has six sections. Section 1 provides background information and an overview of the model. Sections 2, 3, and 4 describe the theoretical foundations of the model, the databases, and the specific techniques used. Section 5 presents the new field resource base by region and depth. Section 6 documents the reserve growth model components.

  11. The Leeb Hardness Test for Rock: An Updated Methodology and UCS Correlation

    Science.gov (United States)

    Corkum, A. G.; Asiri, Y.; El Naggar, H.; Kinakin, D.

    2018-03-01

    The Leeb hardness test (LHT, with test value L_D) is a rebound hardness test, originally developed for metals, that has been correlated with the Unconfined Compressive Strength (test value σ_c) of rock by several authors. The tests can be carried out rapidly, conveniently and nondestructively on core and block samples or on rock outcrops. This makes the relatively small LHT device convenient for field tests. The present study compiles test data from literature sources and presents new laboratory testing carried out by the authors to develop a substantially expanded database with wide-ranging rock types. In addition, the number of impacts that should be averaged to comprise a "test result" was revisited along with the issue of test specimen size. Correlation for L_D and σ_c for various rock types is provided along with recommended testing methodology. The accuracy of correlated σ_c estimates was assessed and reasonable correlations were observed between L_D and σ_c. The study findings show that LHT can be useful particularly for field estimation of σ_c and offers a significant improvement over the conventional field estimation methods outlined by the ISRM (e.g., hammer blows). This test is rapid and simple, with relatively low equipment costs, and provides a reasonably accurate estimate of σ_c.
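
    A minimal sketch of how such an L_D-σ_c correlation can be derived from paired field and laboratory measurements is shown below. The power-law form and the numbers are assumptions for illustration only; they are not the regression reported in the study.

      import numpy as np

      # Paired observations: mean Leeb hardness (L_D) and laboratory UCS (MPa).
      # Placeholder values, not the database compiled in the study.
      L_D = np.array([350., 450., 520., 610., 700.])
      ucs = np.array([ 20.,  45.,  70., 120., 180.])

      # Fit sigma_c = a * L_D**b by linear regression in log-log space.
      b, log_a = np.polyfit(np.log(L_D), np.log(ucs), 1)
      a = np.exp(log_a)
      print(f"sigma_c ≈ {a:.3g} * L_D^{b:.2f}")

      # Field estimate for a new outcrop reading (average of several impacts).
      print(f"estimated UCS at L_D=500: {a * 500.0**b:.1f} MPa")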

  12. Integrated leak rate testing of the fast flux test facility reactor containment building

    International Nuclear Information System (INIS)

    James, E.B.; Farabee, O.A.; Bliss, R.J.

    1978-01-01

    The initial Integrated Leak Rate Test (ILRT) of the Fast Flux Test Facility containment building was performed from May 27 to June 2, 1978. The test was conducted in air with systems vented and with the containment recirculating coolers in operation. 10 psig and 5 psig tests were run using the absolute pressure test method. The measured leakage rates were 0.033% Vol/24 h and -0.0015% Vol/24 h, respectively. Subsequent verification tests at both 10 psig and 5 psig proved that the test equipment was operating properly and was sensitive enough to detect leaks at low pressures. This ILRT was performed at a lower pressure than any previous ILRT on a reactor containment structure in the United States. While the initial design requirements for ice condenser containments called for a part-pressure test at 6 psig, those tests were waived due to the apparent statistical problems of data analysis and the repeatability of the data itself at such low pressure. In contrast to this belief, both the 5 and 10 psig ILRTs at FFTF were performed successfully.
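
    In the absolute pressure method, the contained air mass is tracked from pressure and temperature samples and the leakage rate is the fitted rate of mass loss expressed as a percentage of contained air per 24 h. The sketch below shows that reduction in its simplest mass-point form; the variable names, the dry-air correction, and the synthetic data are illustrative assumptions, not the FFTF test procedure itself.

      import numpy as np

      def mass_point_leak_rate(hours, p_abs_kpa, t_kelvin, pv_kpa=0.0):
          """Leakage rate in %Vol/24 h from the mass-point (absolute pressure) method.

          hours     : sample times in hours from test start
          p_abs_kpa : containment absolute pressure samples (kPa)
          t_kelvin  : volume-weighted containment temperature samples (K)
          pv_kpa    : water-vapour partial pressure samples (kPa), optional
          """
          hours = np.asarray(hours, dtype=float)
          # Dry-air mass is proportional to (P - Pv)/T for a fixed free volume.
          mass = (np.asarray(p_abs_kpa, dtype=float) - pv_kpa) / np.asarray(t_kelvin, dtype=float)
          slope, intercept = np.polyfit(hours, mass, 1)   # least-squares line through mass history
          return -slope / intercept * 24.0 * 100.0        # % of contained air lost per day

      # Example with synthetic data: a slow pressure decay at constant temperature.
      t = np.linspace(0, 24, 49)
      p = 170.0 - 0.0005 * t                              # kPa
      print(round(mass_point_leak_rate(t, p, np.full_like(t, 300.0)), 4))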

  13. High rate tests of the LHCb RICH Upgrade system

    CERN Multimedia

    Blago, Michele Piero

    2016-01-01

    One of the biggest challenges for the upgrade of the LHCb RICH detectors from 2020 is to read out the photon detectors at the full 40 MHz rate of the LHC proton-proton collisions. A test facility has been set up at CERN with the purpose of investigating the behaviour of the Multi Anode PMTs (MaPMTs), which have been proposed for the upgrade, and their readout electronics at high trigger rates. The MaPMTs are illuminated with a monochromatic laser that can be triggered independently of the readout electronics. A first series of tests, including threshold scans, is performed at low trigger rates (20 kHz) for both the readout and the laser with the purpose of characterising the behaviour of the system under test. Then the trigger rate is increased in two separate steps. First the MaPMTs are exposed to high illumination by triggering the pulsed laser at a high (20 MHz) repetition rate while the DAQ is read out at the same low rate as before. In this way the performance of the MaPMTs and the attached electronics can be evaluated ...

  14. Comparison of methodological quality rating of systematic reviews on neuropathic pain using AMSTAR and R-AMSTAR.

    Science.gov (United States)

    Dosenovic, Svjetlana; Jelicic Kadic, Antonia; Vucic, Katarina; Markovina, Nikolina; Pieper, Dawid; Puljak, Livia

    2018-05-08

    quality ratings. Our results point to weaknesses in the methodology of existing SRs on interventions for the management of NeuP and call for future improvement through better adherence to the analyzed quality checklists, either AMSTAR or R-AMSTAR.

  15. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

    Full Text Available According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc.) that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red); furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient); finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond).

  16. Physiological and methodological aspects of rate of force development assessment in human skeletal muscle.

    Science.gov (United States)

    Rodríguez-Rosell, David; Pareja-Blanco, Fernando; Aagaard, Per; González-Badillo, Juan José

    2017-12-20

    Rate of force development (RFD) refers to the ability of the neuromuscular system to increase contractile force from a low or resting level when muscle activation is performed as quickly as possible, and it is considered an important muscle strength parameter, especially for athletes in sports requiring high-speed actions. The assessment of RFD has been used for strength diagnosis, to monitor the effects of training interventions in both healthy populations and patients, discriminate high-level athletes from those of lower levels, evaluate the impairment in mechanical muscle function after acute bouts of eccentric muscle actions and estimate the degree of fatigue and recovery after acute exhausting exercise. Notably, the evaluation of RFD in human skeletal muscle is a complex task as influenced by numerous distinct methodological factors including mode of contraction, type of instruction, method used to quantify RFD, devices used for force/torque recording and ambient temperature. Another important aspect is our limited understanding of the mechanisms underpinning rapid muscle force production. Therefore, this review is primarily focused on (i) describing the main mechanical characteristics of RFD; (ii) analysing various physiological factors that influence RFD; and (iii) presenting and discussing central biomechanical and methodological factors affecting the measurement of RFD. The intention of this review is to provide more methodological and analytical coherency on the RFD concept, which may aid to clarify the thinking of coaches and sports scientists in this area. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
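
    Operationally, RFD is reported as the slope of the force-time curve over defined time windows after the onset of contraction, which is exactly where several of the methodological factors listed above (onset criterion, window length, filtering) enter. A minimal sketch of that calculation is given below; the onset threshold, the window set, and the synthetic force trace are illustrative assumptions, not a recommendation from the review.

      import numpy as np

      def rfd(force_n, fs_hz, onset_threshold_n=5.0, windows_ms=(50, 100, 200)):
          """Average rate of force development (N/s) over windows after contraction onset.

          Onset is the first sample exceeding onset_threshold_n above baseline;
          both the threshold and the window set are illustrative choices.
          """
          force = np.asarray(force_n, dtype=float)
          baseline = force[: int(0.1 * fs_hz)].mean()           # rest period assumed at start
          onset = int(np.argmax(force > baseline + onset_threshold_n))
          out = {}
          for w in windows_ms:
              n = int(round(w / 1000.0 * fs_hz))
              out[f"RFD_0_{w}ms"] = (force[onset + n] - force[onset]) / (w / 1000.0)
          return out

      # Example: a synthetic trace sampled at 1 kHz rising 2000 N over 0.3 s.
      fs = 1000
      f = np.concatenate([np.zeros(200), np.linspace(0, 2000, 300), np.full(500, 2000)])
      print(rfd(f, fs))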

  17. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural and archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view when compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral staircase located in the Duomo di Milano, with a total height of 25 meters, characterized by a narrow walkable space about 70 centimetres wide.

  18. RDANN a new methodology to solve the neutron spectra unfolding problem

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R. [UAZ, Av. Ramon Lopez Velarde No. 801, 98000 Zacatecas (Mexico)

    2006-07-01

    The optimization processes known as the Taguchi method and DOE methodology are applied to the design, training and testing of Artificial Neural Networks in the neutron spectrometry field, which offer potential benefits in the evaluation of the behavior of the net as well as the ability to examine the interaction of the weights and neurons inside it. In this work, the Robust Design of Artificial Neural Networks methodology is used to solve the neutron spectra unfolding problem, designing, training and testing an ANN using a set of 187 neutron spectra compiled by the International Atomic Energy Agency, to obtain better unfolded neutron spectra from the Bonner sphere spectrometer's count rates. (Author)

  19. RDANN a new methodology to solve the neutron spectra unfolding problem

    International Nuclear Information System (INIS)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R.

    2006-01-01

    The optimization processes known as the Taguchi method and DOE methodology are applied to the design, training and testing of Artificial Neural Networks in the neutron spectrometry field, which offer potential benefits in the evaluation of the behavior of the net as well as the ability to examine the interaction of the weights and neurons inside it. In this work, the Robust Design of Artificial Neural Networks methodology is used to solve the neutron spectra unfolding problem, designing, training and testing an ANN using a set of 187 neutron spectra compiled by the International Atomic Energy Agency, to obtain better unfolded neutron spectra from the Bonner sphere spectrometer's count rates. (Author)
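
    The unfolding step itself maps a handful of Bonner-sphere count rates onto a binned neutron spectrum with a small feed-forward network. The sketch below shows only that train-and-test step on randomly generated stand-ins for the 187-spectrum IAEA compilation; the network size is fixed a priori here, whereas the RDANN methodology selects such hyperparameters with Taguchi/DOE designs.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      # Hypothetical shapes: 187 spectra, 7 Bonner-sphere count rates each,
      # unfolded onto 31 energy bins (the IAEA compilation is not reproduced here).
      rng = np.random.default_rng(0)
      counts = rng.random((187, 7))
      spectra = rng.random((187, 31))

      X_train, X_test, y_train, y_test = train_test_split(counts, spectra,
                                                          test_size=0.2, random_state=0)

      # A generic feed-forward net; on this random stand-in data the score is
      # meaningless, but with real data it tracks unfolding quality.
      net = MLPRegressor(hidden_layer_sizes=(24,), max_iter=5000, random_state=0)
      net.fit(X_train, y_train)
      print("test R^2:", round(net.score(X_test, y_test), 3))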

  20. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  1. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  2. Combined approach to reduced duration integrated leakage rate testing

    International Nuclear Information System (INIS)

    Galanti, P.J.

    1987-01-01

    Even though primary reactor containment allowable leakage rates are expressed in weight percent per day of contained air, engineers have been attempting to define acceptable methods to test in < 24 h for as long as these tests have been performed. The reasons to reduce testing duration are obvious, because time not generating electricity is time not generating revenue for the utilities. The latest proposed revision to 10CFR50 Appendix J, concerning integrated leakage rate tests (ILRTs), was supplemented with a draft regulatory guide proposing yet another method. This paper proposes a method that includes elements of currently accepted concepts for short duration testing with a standard statistical check for criteria acceptance. Following presentation of the method, several cases are presented showing the results of these combined criteria.

  3. Study on the leak rate test for HANARO reactor building

    International Nuclear Information System (INIS)

    Choi, Y. S.; Kim, Y. K.; Kim, M. J.; Park, J. M.; Woo, J. S.

    2002-01-01

    The reactor building of HANARO adopts the confinement concept, which allows a certain amount of air leakage. In order to restrict the air leakage through the confinement boundary, a negative pressure of at least 2.5 mmWG is maintained in normal operating conditions; in abnormal conditions a negative pressure of 25 mmWG is maintained and the inside air, filtered by a train of charcoal filters, is released to the atmosphere through the stack. In this situation, if the emergency ventilation system is not operable, the reactor building is isolated from the outside and the air trapped inside will leak out of the building by the ground-release concept. As the leak rate may be affected by the wind velocity outside the reactor building, the air tightness of the confinement should be maintained to limit the leak rate below the allowable value. The local leak rate test method was used from the beginning of commissioning until July 1999. However, it has been pointed out as a defect that the method is susceptible to changes of temperature and atmospheric pressure during testing. For more accurate leak rate testing, we have introduced a new test method. We have periodically carried out the new leak rate testing, and the results indicate that the adverse effect of temperature and atmospheric pressure changes is considerably reduced, which gives a more stable leak rate measurement.

  4. Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models

    Science.gov (United States)

    Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock

    2017-05-01

    Global economic growth has been decreasing in recent years, manifested by greater exchange rate volatility on the international commodity market. This study attempts to analyze some prominent exchange rate forecasting models for Malaysian commodity trading: univariate ARIMA, ARCH and GARCH models, in conjunction with a stationarity test on residual diagnosis and direct testing of heteroskedasticity. All forecasting models utilized monthly data from 1990 to 2015. Given a total of 312 observations, the data were used to forecast both short-term and long-term exchange rates. The forecasting power statistics suggested that the forecasting performance of the ARIMA (1, 1, 1) model is more efficient than that of the ARCH (1) and GARCH (1, 1) models. For the ex-post forecast, the exchange rate increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For the short-term ex-ante forecast, the analysis results indicate a decrease in the exchange rate in June 2016 (RM 4.27 per USD) as compared with December 2015. A more appropriate forecasting method for the exchange rate is vital to aid the decision-making process and planning for sustainable commodity production in the world economy.
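
    A minimal sketch of fitting the two model families mentioned in the abstract to a monthly exchange-rate series is shown below; the series is synthetic and the package choices (statsmodels for ARIMA, the arch package for GARCH) are assumptions, not those of the study.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA
      from arch import arch_model

      # Synthetic monthly MYR/USD-like series standing in for the 1990-2015 data.
      rng = np.random.default_rng(1)
      rate = pd.Series(3.0 + np.cumsum(rng.normal(0, 0.02, 312)),
                       index=pd.date_range("1990-01-31", periods=312, freq="M"))

      # ARIMA(1,1,1) on the level series, with a 6-step ex-ante forecast.
      arima_fit = ARIMA(rate, order=(1, 1, 1)).fit()
      print(arima_fit.forecast(steps=6))

      # GARCH(1,1) on percentage returns to model the conditional volatility.
      returns = 100 * rate.pct_change().dropna()
      garch_fit = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
      print(garch_fit.params)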

  5. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    Science.gov (United States)

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  6. Efficient testing methodologies for microcameras in a gigapixel imaging system

    Science.gov (United States)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging--based on a monocentric optical design--promises revolutionary advances in diverse imaging applications by enabling high resolution, real-time image capture over a wide field-of-view (FOV), including sports broadcasting, wide-field microscopy, astronomy, and security surveillance. The recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over a FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is key to the successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate the imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference: Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386-389 (2012).

  7. Non-animal methodologies within biomedical research and toxicity testing.

    Science.gov (United States)

    Knight, Andrew

    2008-01-01

    Laboratory animal models are limited by scientific constraints on human applicability, and increasing regulatory restrictions, driven by social concerns. Reliance on laboratory animals also incurs marked - and in some cases, prohibitive - logistical challenges within high-throughput chemical testing programmes, such as those currently underway within Europe and the US. However, a range of non-animal methodologies is available within biomedical research and toxicity testing. These include: mechanisms to enhance the sharing and assessment of existing data prior to conducting further studies, and physicochemical evaluation and computerised modelling, including the use of structure-activity relationships and expert systems. Minimally sentient animals from lower phylogenetic orders or early developmental vertebrate stages may be used, as well as microorganisms and higher plants. A variety of tissue cultures, including immortalised cell lines, embryonic and adult stem cells, and organotypic cultures, are also available. In vitro assays utilising bacterial, yeast, protozoal, mammalian or human cell cultures exist for a wide range of toxic and other endpoints. These may be static or perfused, and may be used individually, or combined within test batteries. Human hepatocyte cultures and metabolic activation systems offer potential assessment of metabolite activity and organ-organ interaction. Microarray technology may allow genetic expression profiling, increasing the speed of toxin detection, well prior to more invasive endpoints. Enhanced human clinical trials utilising microdosing, staggered dosing, and more representative study populations and durations, as well as surrogate human tissues, advanced imaging modalities and human epidemiological, sociological and psychological studies, may increase our understanding of illness aetiology and pathogenesis, and facilitate the development of safe and effective pharmacologic interventions. Particularly when human tissues

  8. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of the nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of accident. The basic purpose of such activities is the early detection of any failure and degradation, and the timely correction of deteriorations. Because of the large number of such activities, keeping the emphasis on plant safety and allocating resources becomes difficult. The probabilistic model and methodology can be effectively used to obtain the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety-related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) has been performed and is now being revised to take into account as-built information, it is recommended to implement into the model the features necessary to support further PSA applications, especially those related to Test and Maintenance optimization. Methods need to be developed in order to apply the PSA model, including risk information together with other needed information, to Test and Maintenance optimization. Also, in parallel with the CPSE study update, the software interface for the PSA model is under development (Risk Monitor Software class); methods and models need to be developed so that it can be used for qualified monitoring of the efficiency of the Test and Maintenance strategy. Similarly, the Data Collection System needs to be appropriate for the purpose of an ongoing implementation of a risk-based Test and Maintenance strategy. (author). 4 refs, 1 fig

  9. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers, and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping-rate operation, with test results over-predicted within a reasonable margin of less than a factor of two. Some limitations of the methodology have also been identified for sources operating at a fixed rate or coming with a small number of dominant harmonics, and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  10. Standardized Testing Practices: Effect on Graduation and NCLEX® Pass Rates.

    Science.gov (United States)

    Randolph, Pamela K

    The use of standardized testing in pre-licensure nursing programs has been accompanied by conflicting reports of effective practices. The purpose of this project was to describe standardized testing practices in one state's nursing programs and discover whether the use of a cut score or oversight of remediation had any effect on (a) first-time NCLEX® pass rates, (b) on-time graduation (OTG) or (c) the combination of (a) and (b). Administrators of 38 nursing programs in one Southwest state were sent surveys; surveys were returned by 34 programs (89%). Survey responses were compared to each program's NCLEX pass rate and on-time graduation rate; t-tests were conducted for significant differences associated with a required minimum score (cut score) and oversight of remediation. There were no significant differences in NCLEX pass or on-time graduation rates related to establishment of a cut score. There was a significant difference when the NCLEX pass rate and on-time graduation rate were combined (Outcome Index "OI"), with significantly higher program outcomes (P = .02) for programs without cut scores. There were no differences associated with faculty oversight of remediation. The results of this study do not support establishment of a cut score when implementing standardized testing. Copyright © 2016. Published by Elsevier Inc.

  11. Testing jumps via false discovery rate control.

    Science.gov (United States)

    Yen, Yu-Min

    2013-01-01

    Many recently developed nonparametric jump tests can be viewed as multiple hypothesis testing problems. For such multiple hypothesis tests, it is well known that controlling type I error often makes a large proportion of erroneous rejections, and such situation becomes even worse when the jump occurrence is a rare event. To obtain more reliable results, we aim to control the false discovery rate (FDR), an efficient compound error measure for erroneous rejections in multiple testing problems. We perform the test via the Barndorff-Nielsen and Shephard (BNS) test statistic, and control the FDR with the Benjamini and Hochberg (BH) procedure. We provide asymptotic results for the FDR control. From simulations, we examine relevant theoretical results and demonstrate the advantages of controlling the FDR. The hybrid approach is then applied to empirical analysis on two benchmark stock indices with high frequency data.
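
    The FDR-controlling step named above is the standard Benjamini-Hochberg procedure: sort the per-test p-values, compare them against a rising threshold, and reject every hypothesis up to the largest rank that passes. A minimal sketch follows; the p-values are placeholders, and computing them from the BNS jump statistic is not shown.

      import numpy as np

      def benjamini_hochberg(p_values, alpha=0.05):
          """Return a boolean mask of rejected hypotheses under BH FDR control at level alpha."""
          p = np.asarray(p_values, dtype=float)
          m = p.size
          order = np.argsort(p)
          passes = p[order] <= alpha * np.arange(1, m + 1) / m
          reject = np.zeros(m, dtype=bool)
          if passes.any():
              k = np.nonzero(passes)[0].max()   # largest rank whose p-value meets its threshold
              reject[order[: k + 1]] = True     # reject everything up to (and including) that rank
          return reject

      # Hypothetical p-values from a per-day jump test; True marks days flagged as jump days.
      print(benjamini_hochberg([0.001, 0.20, 0.03, 0.004, 0.60, 0.045, 0.75, 0.01]))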

  12. The UCERF3 grand inversion: Solving for the long‐term rate of ruptures in a fault system

    Science.gov (United States)

    Page, Morgan T.; Field, Edward H.; Milner, Kevin; Powers, Peter M.

    2014-01-01

    We present implementation details, testing, and results from a new inversion‐based methodology, known colloquially as the “grand inversion,” developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long‐term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip‐rate, paleoseismic event‐rate, and magnitude‐distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude‐distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude–frequency distributions and, most importantly, hazard metrics, are much more robust.
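
    At its core, the grand inversion solves for nonnegative long-term rupture rates that jointly reproduce slip-rate and paleoseismic event-rate data. The sketch below shows that constrained least-squares idea on a toy fault system; the matrices, weights, and the use of scipy's nonnegative least squares (rather than UCERF3's parallel simulated annealing over a far larger rupture set) are illustrative assumptions.

      import numpy as np
      from scipy.optimize import nnls

      # Toy system: 3 fault sections, 4 candidate ruptures.
      # A_slip[i, j] = average slip on section i per occurrence of rupture j (meters).
      A_slip = np.array([
          [1.0, 0.0, 1.2, 0.8],
          [0.0, 1.5, 1.2, 0.8],
          [0.0, 0.0, 0.0, 0.8],
      ])
      target_slip_rate = np.array([0.010, 0.012, 0.004])   # m/yr on each section

      # Paleoseismic constraint: total event rate observed at a trench on section 1.
      A_paleo = np.array([[1.0, 0.0, 1.0, 1.0]])
      target_event_rate = np.array([0.009])                 # events/yr

      # Stack the equation sets with relative weights and solve for rates >= 0.
      w_slip, w_paleo = 1.0, 10.0
      A = np.vstack([w_slip * A_slip, w_paleo * A_paleo])
      d = np.concatenate([w_slip * target_slip_rate, w_paleo * target_event_rate])
      rates, residual = nnls(A, d)
      print("rupture rates (1/yr):", rates.round(5), "residual:", round(residual, 5))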

  13. Auditing HIV Testing Rates across Europe: Results from the HIDES 2 Study.

    Directory of Open Access Journals (Sweden)

    D Raben

    Full Text Available European guidelines recommend the routine offer of an HIV test in patients with a number of AIDS-defining and non-AIDS conditions believed to share an association with HIV; so-called indicator conditions (ICs). Adherence to this guidance across Europe is not known. We audited HIV testing behaviour in patients accessing care for a number of ICs. Participating centres reviewed the case notes of either 100 patients or of all consecutive patients in one year, presenting for each of the following ICs: tuberculosis, non-Hodgkin lymphoma, anal and cervical cancer, hepatitis B and C and oesophageal candidiasis. Observed HIV-positive rates were applied by region and IC to estimate the number of HIV diagnoses potentially missed. Outcomes examined were: HIV test rate (% of total patients with an IC), HIV test accepted (% of tests performed/% of tests offered) and new HIV diagnosis rate (%). There were 49 audits from 23 centres, representing 7037 patients. The median test rate across audits was 72% (IQR 32-97), lowest in Northern Europe (median 44%, IQR 22-68%) and highest in Eastern Europe (median 99%, IQR 86-100). Uptake of testing was close to 100% in all regions. The median HIV+ rate was 0.9% (IQR 0.0-4.9), with 29 audits (60.4%) having an HIV+ rate >0.1%. After adjustment, there were no differences between regions of Europe in the proportion with >0.1% testing positive (global p = 0.14). A total of 113 patients tested HIV+. Applying the observed rates of testing HIV+ within individual ICs and regions to all persons presenting with an IC suggested that 105 diagnoses were potentially missed. Testing rates in well-established HIV ICs remained low across Europe, despite high prevalence rates, reflecting missed opportunities for earlier HIV diagnosis and care. Significant numbers may have had an opportunity for HIV diagnosis if all persons included in IC audits had been tested.

  14. The evaluation of a rapid in situ HIV confirmation test in a programme with a high failure rate of the WHO HIV two-test diagnostic algorithm.

    Directory of Open Access Journals (Sweden)

    Derryck B Klarkowski

    Full Text Available BACKGROUND: Concerns about false-positive HIV results led to a review of testing procedures used in a Médecins Sans Frontières (MSF) HIV programme in Bukavu, eastern Democratic Republic of Congo. In addition to the WHO HIV rapid diagnostic test (RDT) algorithm (two positive RDTs alone for HIV diagnosis) used in voluntary counselling and testing (VCT) sites, we evaluated in situ a practical field-based confirmation test against western blot (WB). In addition, we aimed to determine the false-positive rate of the WHO two-test algorithm compared with our adapted protocol including confirmation testing, and whether weakly reactive compared with strongly reactive rapid test results were more likely to be false positives. METHODOLOGY/PRINCIPAL FINDINGS: 2864 clients presenting to MSF VCT centres in Bukavu during January to May 2006 were tested using Determine HIV-1/2 and UniGold HIV rapid tests in parallel by nurse counsellors. Plasma samples on 229 clients confirmed as double RDT positive by laboratory retesting were further tested using both WB and the Orgenics Immunocomb Combfirm HIV confirmation test (OIC-HIV). Of these, 24 samples were negative or indeterminate by WB, representing a false-positive rate of the WHO two-test algorithm of 10.5% (95%CI 6.6-15.2). 17 of the 229 samples were weakly positive on rapid testing and all were negative or indeterminate by WB. The false-positive rate fell to 3.3% (95%CI 1.3-6.7) when only strong-positive rapid test results were considered. Agreement between OIC-HIV and WB was 99.1% (95%CI 96.9-99.9%) with no false OIC-HIV positives if stringent criteria for positive OIC-HIV diagnoses were used. CONCLUSIONS: The WHO HIV two-test diagnostic algorithm produced an unacceptably high level of false-positive diagnoses in our setting, especially if results were weakly positive. The most probable causes of the false-positive results were serological cross-reactivity or non-specific immune reactivity. Our findings show that the OIC

  15. Simplified Abrasion Test Methodology for Candidate EVA Glove Lay-Ups

    Science.gov (United States)

    Rabel, Emily; Aitchison, Lindsay

    2015-01-01

    During the Apollo Program, space suit outer-layer fabrics were badly abraded after performing just a few extravehicular activities (EVAs). For example, the Apollo 12 commander reported abrasive wear on the boots that penetrated the outer-layer fabric into the thermal protection layers after less than 8 hrs of surface operations. Current plans for exploration planetary space suits require the suits to support hundreds of hours of EVA on a lunar or Martian surface, creating a challenge for space suit designers to utilize materials advances made over the last 40 years and improve on the space suit fabrics used in the Apollo Program. Over the past 25 years, the NASA Johnson Space Center Crew and Thermal Systems Division has focused on tumble testing as a means of simulating wear on the outer layer of the space suit fabric. Most recently, in 2009, testing was performed on 4 different candidate outer layers to gather baseline data for future use in the design of planetary space suit outer layers. In support of the High Performance EVA Glove Element of the Next Generation Life Support Project, testing of a new configuration was recently attempted which requires only 10% of the fabric per replicate compared to that needed in 2009. The smaller fabric samples allowed for reduced per-sample cost and the flexibility to test small samples from manufacturers without the overhead of a completed production run. Data collected from this iteration were compared to those taken in 2009 to validate the new test method. In addition, the method also evaluated the fabrics and fabric lay-ups used in a prototype thermal micrometeoroid garment (TMG) developed for EVA gloves under the NASA High Performance EVA Glove Project. This paper provides a review of previous abrasion studies on space suit fabrics, details the methodologies used for abrasion testing in this particular study, results of the validation study, and results of the TMG testing.

  16. A Methodology for the Optimization of Flow Rate Injection to Looped Water Distribution Networks through Multiple Pumping Stations

    Directory of Open Access Journals (Sweden)

    Christian León-Celi

    2016-12-01

    Full Text Available The optimal operation of a water distribution network is reached when the consumer demands are satisfied using the lowest quantity of energy, while maintaining the minimal pressure required at the same time. One way to achieve this is through optimization of the flow rate injection based on the use of the setpoint curve concept. In order to achieve that, a methodology is proposed. It allows for the assessment of the flow rate and pressure head that each pumping station has to provide for the proper functioning of the network while the minimum power consumption is kept. The methodology can be addressed in two ways: the discrete method and the continuous method. In the first method, a finite set of combinations between pumping stations is evaluated. In the continuous method, the search for the optimal solution is performed using optimization algorithms. In this paper, the Hooke–Jeeves and Nelder–Mead algorithms are used. Both the hydraulics and the objective function used by the optimization are solved through EPANET and its Toolkit. Two case studies are evaluated, and the results of the application of the different methods are discussed.
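
    A minimal sketch of the continuous-method idea is shown below: search over the flow split between two pumping stations with a derivative-free optimizer, penalizing solutions that violate the minimum pressure. The hydraulic model here is a stand-in function with made-up coefficients; in the methodology it would be an EPANET simulation called through the Toolkit.

      import numpy as np
      from scipy.optimize import minimize

      TOTAL_DEMAND = 100.0   # L/s, hypothetical network demand
      P_MIN = 20.0           # m, required minimum pressure

      def simulate(q1):
          """Stand-in for an EPANET run: returns (pump power, minimum network pressure)."""
          q2 = TOTAL_DEMAND - q1
          head1, head2 = 45.0 - 0.002 * q1**2, 50.0 - 0.003 * q2**2
          power = 9.81 * (q1 * head1 + q2 * head2) / 1000.0 / 0.75   # rough pump power (kW)
          min_pressure = min(head1, head2) - 18.0                    # toy network head loss
          return power, min_pressure

      def objective(x):
          q1 = float(x[0])
          power, p_min = simulate(q1)
          penalty = 1e3 * max(0.0, P_MIN - p_min) ** 2   # soft minimum-pressure constraint
          return power + penalty

      res = minimize(objective, x0=[50.0], method="Nelder-Mead")
      print("optimal flow from station 1 (L/s):", round(res.x[0], 2))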

  17. Uniaxial tension test on Rubber at constant true strain rate

    Directory of Open Access Journals (Sweden)

    Sourne H.L.

    2012-08-01

    Full Text Available Elastomers are widely used for damping parts in different industrial contexts because of their remarkable dissipation properties. Indeed, they can undergo severe mechanical loading conditions, i.e., high strain rates and large strains. Nevertheless, the mechanical response of these materials can vary from purely rubber-like to glassy depending on the strain rate undergone. Classically, uniaxial tension tests are performed in order to find a relation between the stress and the strain in the material at various strain rates. However, even when the strain rate is intended to be constant, it is the nominal strain rate that is considered. Here we develop a test at constant true strain rate, i.e. the strain rate that is actually experienced by the material. In order to perform such a test, the displacement imposed by the machine is an exponential function of time. This test has been performed with a high-speed hydraulic machine for strain rates between 0.01/s and 100/s. A specific specimen has been designed, yielding a uniform strain field (and so a uniform stress field). Furthermore, an instrumented aluminum bar has been used to take dynamic effects into account in the measurement of the applied force. A high-speed camera enables the determination of strain in the sample using a point-tracking technique. Using this method, the stress-strain curve of a rubber-like material during a loading-unloading cycle has been determined, up to a stretch ratio λ = 2.5. The influence of the true strain rate both on the stiffness and on the dissipation of the material is then discussed.
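
    The exponential displacement command follows directly from the definition of true strain. Writing the gauge length as l(t) with initial value l_0 and true strain ε = ln(l/l_0), a constant true strain rate requires (generic notation, not the paper's symbols):

      \dot{\varepsilon} = \frac{d}{dt}\,\ln\!\left(\frac{l(t)}{l_0}\right) = \text{const}
      \;\;\Rightarrow\;\;
      l(t) = l_0\, e^{\dot{\varepsilon}\, t},
      \qquad
      u(t) = l(t) - l_0 = l_0\left(e^{\dot{\varepsilon}\, t} - 1\right)

    so the crosshead velocity grows exponentially during the test, whereas a constant-velocity ramp would only keep the nominal (engineering) strain rate constant.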

  18. Testing jumps via false discovery rate control.

    Directory of Open Access Journals (Sweden)

    Yu-Min Yen

    Full Text Available Many recently developed nonparametric jump tests can be viewed as multiple hypothesis testing problems. For such multiple hypothesis tests, it is well known that controlling type I error often makes a large proportion of erroneous rejections, and such situation becomes even worse when the jump occurrence is a rare event. To obtain more reliable results, we aim to control the false discovery rate (FDR), an efficient compound error measure for erroneous rejections in multiple testing problems. We perform the test via the Barndorff-Nielsen and Shephard (BNS) test statistic, and control the FDR with the Benjamini and Hochberg (BH) procedure. We provide asymptotic results for the FDR control. From simulations, we examine relevant theoretical results and demonstrate the advantages of controlling the FDR. The hybrid approach is then applied to empirical analysis on two benchmark stock indices with high frequency data.

  19. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

    As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, the bearing torque (which is induced by the hydrodynamic force), and other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shape and aspect ratio, upstream elbow orientation and proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper.

  20. Development of Wireless System for Containment Integrated Leakage Rate Test

    International Nuclear Information System (INIS)

    Lee, Kwang-Dae; Oh, Eung-Se; Yang, Seung-Ok

    2006-01-01

    The containment system leakage rate should be estimated periodically with reliable test equipment. In light-water reactor nuclear power plants, ANSI/ANS-56.8 is the basis for determining leakage rates. Two types of data acquisition system, the centralized type and the networked type, have been used. In the centralized type, all sensors in the containment are connected directly to the measuring equipment outside the building. On the other hand, the networked type has several branch chains, each of which connects one group of networked sensors together. For the leakage rate test, more than 20 temperature sensors and 6 humidity sensors, whose numbers differ for each plant, should be installed at specific levels in the containment. Wireless technology gives benefits such as reduced installation effort and easier pretesting, so it is used more and more in plant monitoring. As the containment has many kinds of complex barriers to radio propagation, the radio power and frequency band needed for a good transmission rate, as well as radio frequency interference, should be considered. An overview of the wireless sensor system for the containment leakage rate test is described here, and the test results from the Yonggwang unit 4 PWR plant are presented.

  1. Humidity Testing for Human Rated Spacecraft

    Science.gov (United States)

    Johnson, Gary B.

    2009-01-01

    Determination that equipment can operate in and survive exposure to the humidity environments unique to human-rated spacecraft presents widely varying challenges. Equipment may need to operate in habitable volumes where the atmosphere contains perspiration, exhalation, and residual moisture. Equipment located outside the pressurized volumes may be exposed to repetitive diurnal cycles that may result in moisture absorption and/or condensation. Equipment may be thermally affected by conduction to coldplate or structure, by forced or ambient air convection (hot/cold or wet/dry), or by radiation to space through windows or hatches. The equipment's on/off state also contributes to the equipment's susceptibility to humidity. Like equipment is sometimes used in more than one location and under varying operational modes. Due to these challenges, developing a test scenario that bounds all physical, environmental and operational modes for both pressurized and unpressurized volumes requires an integrated assessment to determine the "worst-case combined conditions." Such an assessment was performed for the Constellation program, considering all of the aforementioned variables, and a test profile was developed based on approximately 300 variable combinations. The test profile has been vetted by several subject matter experts and partially validated by testing. Final testing to determine the efficacy of the test profile on actual space hardware is in the planning stages. When validation is completed, the test profile will be formally incorporated into NASA document CxP 30036, "Constellation Environmental Qualification and Acceptance Testing Requirements (CEQATR)."

  2. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul S.; Morgan, Keith S.; Caffrey, Michael P.

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-tolerant memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.

  3. Measurements of integrated components' parameters versus irradiation doses gamma radiation (60Co) dosimetry-methodology-tests

    International Nuclear Information System (INIS)

    Fuan, J.

    1991-01-01

    This paper describes the methodology used for the irradiation of the integrated components and the measurement of their parameters, with quality assurance of the dosimetry: - Measurement of the integrated dose using the expertise of the Laboratoire Central des Industries Electriques (LCIE): - Measurement of the irradiation dose versus source/component distance, using calibrated equipment. - Use of ALANINE dosimeters placed on the support of the irradiated components. - Assembly and polarization of the components during the irradiations; selection of the irradiator. - Measurement of the irradiated components' parameters, using the expertise of the following companies: - GenRad: GR130 test equipment located at DEIN/SIR, CEN Saclay. - Laboratoire Central des Industries Electriques (LCIE): GR125 test equipment and the associated test programmes [fr

  4. Feed Preparation for Source of Alkali Melt Rate Tests

    International Nuclear Information System (INIS)

    Stone, M. E.; Lambert, D. P.

    2005-01-01

    The purpose of the Source of Alkali testing was to prepare feed for melt rate testing in order to determine the maximum melt-rate for a series of batches where the alkali was increased from 0% Na2O in the frit (low washed sludge) to 16% Na2O in the frit (highly washed sludge). This document summarizes the feed preparation for the Source of Alkali melt rate testing. The Source of Alkali melt rate results will be issued in a separate report. Five batches of Sludge Receipt and Adjustment Tank (SRAT) product and four batches of Slurry Mix Evaporator (SME) product were produced to support Source of Alkali (SOA) melt rate testing. Sludge Batch 3 (SB3) simulant and frit 418 were used as targets for the 8% Na2O baseline run. For the other four cases (0% Na2O, 4% Na2O, 12% Na2O, and 16% Na2O in frit), special sludge and frit preparations were necessary. The sludge preparations mimicked washing of the SB3 baseline composition, while frit adjustments consisted of increasing or decreasing Na and then re-normalizing the remaining frit components. For all batches, the target glass compositions were identical. The five SRAT products were prepared for testing in the dry fed melt-rate furnace and the four SME products were prepared for the Slurry-fed Melt-Rate Furnace (SMRF). At the same time, the impacts of washing on a baseline composition from a Chemical Process Cell (CPC) perspective could also be investigated. Five process simulations (0% Na2O in frit, 4% Na2O in frit, 8% Na2O in frit or baseline, 12% Na2O in frit, and 16% Na2O in frit) were completed in three identical 4-L apparatus to produce the five SRAT products. The SRAT products were later dried and combined with the complementary frits to produce identical glass compositions. All five batches were produced with identical processing steps, including off-gas measurement using online gas chromatographs. Two slurry-fed melter feed batches, a 4% Na2O in frit run (less washed sludge combined with

  5. Methodology for assessing the impacts of alternative rate designs on industrial energy use. Draft report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-01-11

    A task was undertaken to develop a method for analyzing industrial user responses to alternative rate designs. The method described considers the fuel switching and conservation responses of industrial users and the impact on a hypothetical utility in terms of revenue stability, annual gas demand, and seasonal fluctuations. Twenty-seven hypothetical industrial plant types have been specified. For each combustor in the plant, the fuel consumption by season, initial fuel type, fuel switching costs, conservation costs, and amount of fuel conservable are provided. The decision making takes place at the plant level and is aggregated to determine the impact on the utility. Section 2 discusses the factors affecting an industrial user's response to alternative rate designs. Section 3 describes the methodology, including an overview of the model and an example of an industrial user's response to a set of fuel prices. The data describing the 27 hypothetical firms are given in an appendix.

  6. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on the measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology, based on signal processing principles, was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material ... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period.

  7. G-control fatigue testing for cyclic crack propagation in composite structures

    DEFF Research Database (Denmark)

    Manca, Marcello; Berggreen, Christian; Carlsson, Leif A.

    2015-01-01

    This paper presents a computer controlled testing methodology called “The G-control Method” which allows cyclic crack growth testing using real-time control of the cyclic energy release rate. The advantages of using this approach are described and compared with traditional fatigue testing methods...... that the G-control method allows fatigue testing at a constant range of energy release rates leading to a constant crack propagation rate....

  8. 77 FR 2606 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2012-01-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0004] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  9. 75 FR 9018 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2010-02-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2010-0034] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  10. Comparison Of Irms Delhi Methodology With Who Methodology On Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of the IRMS model over the WHO model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study Design: Cross-sectional. Setting: Urban and rural. Participants: Mothers and children. Sample Size: 300 children between 1-2 years and 300 mothers in rural areas, and 75 children and 75 mothers in urban areas. Study Variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome Variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-sized villages with socio-economic segregation, which remains a main characteristic of Indian villages.

  11. Evaluation of strain-rate sensitivity of ion-irradiated austenitic steel using strain-rate jump nanoindentation tests

    Energy Technology Data Exchange (ETDEWEB)

    Kasada, Ryuta, E-mail: r-kasada@iae.kyoto-u.ac.jp [Institute of Advanced Energy, Kyoto University Gokasho, Uji 611-0011, Kyoto (Japan); Konishi, Satoshi [Institute of Advanced Energy, Kyoto University Gokasho, Uji 611-0011, Kyoto (Japan); Hamaguchi, Dai; Ando, Masami; Tanigawa, Hiroyasu [Japan Atomic Energy Agency, Rokkasho, Aomori (Japan)

    2016-11-01

    Highlights: • We examined strain-rate jump nanoindentation on ion-irradiated stainless steel. • We observed irradiation hardening of the ion-irradiated stainless steel. • We found that the strain-rate sensitivity parameter was slightly decreased after the ion-irradiation. - Abstract: The present study investigated the strain-rate sensitivity (SRS) of a single crystal Fe–15Cr–20Ni austenitic steel before and after 10.5 MeV Fe³⁺ ion-irradiation up to 10 dpa at 300 °C using a strain-rate jump (SRJ) nanoindentation test. It was found that the SRJ nanoindentation test is suitable for evaluating the SRS at strain-rates from 0.001 to 0.2 s⁻¹. An indentation size effect was observed in the depth dependence of nanoindentation hardness but not in the SRS. The ion-irradiation increased the hardness in the shallow depth region but decreased the SRS slightly.

  12. Measurement of leach rates: a review

    International Nuclear Information System (INIS)

    Mendel, J.E.

    1982-01-01

    A historical perspective of the techniques that can be used to measure the leach rate of radioactive waste forms is presented. The achievement of leach rates that are as low as possible has been an important goal ever since the development of solidification processes for liquid radioactive wastes began in the 1950's. Leach tests can be divided into two major categories, dynamic and static, based on whether or not the leachant in contact with the test specimen is changed during the course of the test. Both types of tests have been used extensively. The results of leach tests can be used to compare waste forms, and that has been a major purpose of leach data heretofore; increasingly, however, the data now are needed for predicting long-term leaching behavior during geologic disposal. This requirement is introducing new complexities into leach testing methodology. 3 figures, 2 tables

  13. Methodology for estimating radiation dose rates to freshwater biota exposed to radionuclides in the environment

    International Nuclear Information System (INIS)

    Blaylock, B.G.; Frank, M.L.; O'Neal, B.R.

    1993-08-01

    The purpose of this report is to present a methodology for evaluating the potential for aquatic biota to incur effects from exposure to chronic low-level radiation in the environment. Aquatic organisms inhabiting an environment contaminated with radioactivity receive external radiation from radionuclides in water, sediment, and from other biota such as vegetation. Aquatic organisms receive internal radiation from radionuclides ingested via food and water and, in some cases, from radionuclides absorbed through the skin and respiratory organs. Dose rate equations, which have been developed previously, are presented for estimating the radiation dose rate to representative aquatic organisms from alpha, beta, and gamma irradiation from external and internal sources. Tables containing parameter values for calculating radiation doses from selected alpha, beta, and gamma emitters are presented in the appendix to facilitate dose rate calculations. The risk of detrimental effects to aquatic biota from radiation exposure is evaluated by comparing the calculated radiation dose rate to biota to the U.S. Department of Energy's (DOE's) recommended dose rate limit of 0.4 mGy h⁻¹ (1 rad d⁻¹). A dose rate no greater than 0.4 mGy h⁻¹ to the most sensitive organisms should ensure the protection of populations of aquatic organisms. DOE's recommended dose rate is based on a number of published reviews on the effects of radiation on aquatic organisms that are summarized in the National Council on Radiation Protection and Measurements Report No. 109 (NCRP 1991). DOE recommends that if the results of radiological models or dosimetric measurements indicate that a radiation dose rate of 0.1 mGy h⁻¹ will be exceeded, then a more detailed evaluation of the potential ecological consequences of radiation exposure to endemic populations should be conducted.
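
    As a concrete illustration of the kind of dose rate equation referred to above, the short Python sketch below evaluates the internal dose rate to an organism from a single radionuclide as activity concentration times mean decay energy times an absorbed fraction, and compares it with the 0.4 mGy/h limit cited in the abstract. The concentration, energy and absorbed fraction used in the example are hypothetical placeholders, not values from the report's appendix tables.

        MEV_TO_J = 1.602e-13       # joules per MeV
        SECONDS_PER_HOUR = 3600.0
        DOE_LIMIT_MGY_PER_H = 0.4  # DOE recommended dose-rate limit cited in the abstract

        def internal_dose_rate_mgy_per_h(conc_bq_per_kg, mean_energy_mev, absorbed_fraction=1.0):
            """Absorbed dose rate (mGy/h) in tissue from an internally deposited emitter.

            conc_bq_per_kg    -- activity concentration in the organism (Bq/kg wet weight)
            mean_energy_mev   -- mean energy emitted per decay (MeV)
            absorbed_fraction -- fraction of the emitted energy absorbed in the organism
            """
            gy_per_s = conc_bq_per_kg * mean_energy_mev * absorbed_fraction * MEV_TO_J
            return gy_per_s * SECONDS_PER_HOUR * 1000.0  # Gy/s -> mGy/h

        if __name__ == "__main__":
            # Hypothetical example: 5000 Bq/kg of a beta emitter with 0.19 MeV mean energy.
            dose = internal_dose_rate_mgy_per_h(5000.0, 0.19)
            print(f"Internal dose rate: {dose:.2e} mGy/h")
            print("Below DOE limit" if dose < DOE_LIMIT_MGY_PER_H else "Exceeds DOE limit")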

  14. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    Science.gov (United States)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

    We report on a new method and on the first demonstration to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) Lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budgets. A significant fraction of the total anthropogenic emission of CO2 and CH4 originates from localized strong point sources such as large energy production sites or landfills. Neither is monitored with sufficient accuracy by the current observation system. There is a debate whether airborne remote sensing could fill this gap by inferring emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft are used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA Lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has successfully been tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography such as the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr⁻¹. We discuss the methodology of our point-source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017 for the measurement of anthropogenic and natural GHG emissions by a combination of active and passive remote sensing instruments on research aircraft.
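
    The cross-wind mass-balance idea behind such a point-source estimate can be sketched in a few lines of Python: integrate the column enhancement measured along a transect downwind of the source and multiply by the mean wind speed. The Gaussian plume shape, the wind speed and the enhancement amplitude below are illustrative assumptions; the paper's actual retrieval and plume inversion are considerably more involved.

        import numpy as np

        def emission_rate_kt_per_yr(column_enhancement_kg_m2, cross_track_m, wind_speed_m_s):
            """Source strength from a cross-wind transect of column enhancements (kg/m^2)."""
            # Trapezoidal integration of the enhancement across the plume.
            c = column_enhancement_kg_m2
            integral_kg_m = np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(cross_track_m))
            flux_kg_s = wind_speed_m_s * integral_kg_m
            return flux_kg_s * 3.15576e7 / 1.0e6  # kg/s -> kt/yr

        if __name__ == "__main__":
            y = np.linspace(-1500.0, 1500.0, 61)                    # cross-wind coordinate (m)
            sigma = 300.0                                           # hypothetical plume width (m)
            enhancement = 1.0e-4 * np.exp(-0.5 * (y / sigma) ** 2)  # hypothetical CH4 column excess (kg/m^2)
            print(f"{emission_rate_kt_per_yr(enhancement, y, wind_speed_m_s=4.0):.1f} kt CH4/yr")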

  15. Novel methodology for pharmaceutical expenditure forecast.

    Science.gov (United States)

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time

  16. Heart rate-based lactate minimum test: a reproducible method.

    NARCIS (Netherlands)

    Strupler, M.; Muller, G.; Perret, C.

    2009-01-01

    OBJECTIVE: To find the individual intensity for aerobic endurance training, the lactate minimum test (LMT) seems to be a promising method. LMTs described in the literature consist of speed or work rate-based protocols, but for training prescription in daily practice mostly heart rate is used. The

  17. Effects of various methodologic strategies: survey response rates among Canadian physicians and physicians-in-training.

    Science.gov (United States)

    Grava-Gubins, Inese; Scott, Sarah

    2008-10-01

    To increase the overall 2007 response rate of the National Physician Survey (NPS) from the survey's 2004 rate of response with the implementation of various methodologic strategies. Physicians were stratified to receive either a long version (12 pages) or a short version (6 pages) of the survey (38% and 62%, respectively). Mixed modes of contact were used-58% were contacted by e-mail and 42% by regular mail-with multiple modes of contact attempted for nonrespondents. The self-administered, confidential surveys were distributed in either English or French. Medical residents and students received e-mail surveys only and were offered a substantial monetary lottery incentive for completing their surveys. A professional communications firm assisted in marketing the survey and delivered advance notification of its impending distribution. Canada. A total of 62 441 practising physicians, 2627 second-year medical residents, and 9162 medical students in Canada. Of the practising physicians group, 60 811 participants were eligible and 19 239 replied, for an overall 2007 study response rate of 31.64% (compared with 35.85% in 2004). No difference in rate of response was found between the longer and shorter versions of the survey. If contacted by regular mail, the response rate was 34.1%; the e-mail group had a response rate of 29.9%. Medical student and resident response rates were 30.8% and 27.9%, respectively (compared with 31.2% and 35.6% in 2004). Despite shortening the questionnaires, contacting more physicians by e-mail, and enhancing marketing and follow-up, the 2007 NPS response rate for practising physicians did not surpass the 2004 NPS response rate. Offering a monetary lottery incentive to medical residents and students was also unsuccessful in increasing their response rates. The role of surveys in gathering information from physicians and physicians-in-training remains problematic. Researchers need to investigate alternative strategies for achieving higher rates of

  18. Cognitive tests predict real-world errors: the relationship between drug name confusion rates in laboratory-based memory and perception tests and corresponding error rates in large pharmacy chains.

    Science.gov (United States)

    Schroeder, Scott R; Salomon, Meghan M; Galanter, William L; Schiff, Gordon D; Vaida, Allen J; Gaunt, Michael J; Bryson, Michelle L; Rash, Christine; Falck, Suzanne; Lambert, Bruce L

    2017-05-01

    Drug name confusion is a common type of medication error and a persistent threat to patient safety. In the USA, roughly one per thousand prescriptions results in the wrong drug being filled, and most of these errors involve drug names that look or sound alike. Prior to approval, drug names undergo a variety of tests to assess their potential for confusability, but none of these preapproval tests has been shown to predict real-world error rates. We conducted a study to assess the association between error rates in laboratory-based tests of drug name memory and perception and real-world drug name confusion error rates. Eighty participants, comprising doctors, nurses, pharmacists, technicians and lay people, completed a battery of laboratory tests assessing visual perception, auditory perception and short-term memory of look-alike and sound-alike drug name pairs (eg, hydroxyzine/hydralazine). Laboratory test error rates (and other metrics) significantly predicted real-world error rates obtained from a large, outpatient pharmacy chain, with the best-fitting model accounting for 37% of the variance in real-world error rates. Cross-validation analyses confirmed these results, showing that the laboratory tests also predicted errors from a second pharmacy chain, with 45% of the variance being explained by the laboratory test data. Across two distinct pharmacy chains, there is a strong and significant association between drug name confusion error rates observed in the real world and those observed in laboratory-based tests of memory and perception. Regulators and drug companies seeking a validated preapproval method for identifying confusing drug names ought to consider using these simple tests. By using a standard battery of memory and perception tests, it should be possible to reduce the number of confusing look-alike and sound-alike drug name pairs that reach the market, which will help protect patients from potentially harmful medication errors. Published by the BMJ
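
    The core of such an analysis is an ordinary regression of real-world confusion error rates on laboratory error rates, with R² as the share of variance explained. The Python sketch below shows that calculation on made-up data; the numbers are not the study's measurements, and the published model also drew on additional predictors.

        import numpy as np

        # Hypothetical data: laboratory error rate per drug-name pair versus the
        # corresponding real-world error rate per prescription (both invented).
        lab_error_rate = np.array([0.02, 0.05, 0.08, 0.11, 0.15, 0.20, 0.26, 0.31])
        real_world_rate = np.array([0.4, 0.7, 0.9, 1.4, 1.6, 2.3, 2.6, 3.1]) / 1000.0

        # Ordinary least squares fit: real_world ~ slope * lab + intercept.
        slope, intercept = np.polyfit(lab_error_rate, real_world_rate, deg=1)
        predicted = slope * lab_error_rate + intercept

        ss_res = np.sum((real_world_rate - predicted) ** 2)
        ss_tot = np.sum((real_world_rate - real_world_rate.mean()) ** 2)
        r_squared = 1.0 - ss_res / ss_tot

        print(f"slope = {slope:.4f}, intercept = {intercept:.6f}, R^2 = {r_squared:.2f}")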

  19. Simultaneous monitoring of ice accretion and thermography of an airfoil: an IR imaging methodology

    International Nuclear Information System (INIS)

    Mohseni, M; Frioult, M; Amirfazli, A

    2012-01-01

    A novel image analysis methodology based on infrared (IR) imaging was developed for simultaneous monitoring of ice accretion and thermography of airfoils. In this study, an IR camera was calibrated and used to measure the surface temperature of the energized airfoils, and monitor the ice accretion and growth pattern on the airfoils’ surfaces. The methodology comprises the automatic processing of a series of IR video frames with the purpose of detecting ice pattern evolution during the icing test period. A specially developed MATLAB code was used to detect the iced areas in the IR images, and simultaneously monitor surface temperature evolution of the airfoil during an icing test. Knowing the correlation between the icing pattern and surface temperature changes during an icing test is essential for energy efficient design of thermal icing mitigation systems. Processed IR images were also used to determine the ice accumulation rate on the airfoil's surface in a given icing test. The proposed methodology has been demonstrated to work successfully, since the optical images taken at the end of icing tests from the airfoils’ surfaces compared well with the processed IR images detecting the ice grown outward from the airfoils’ leading edge area. (paper)
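
    A stripped-down Python version of that processing chain is sketched below under simple assumptions: each IR frame is thresholded on temperature to segment 'iced' pixels, the iced-area fraction is tracked frame by frame, and the accretion rate is taken as the slope of that fraction over time. The threshold, frame size and synthetic frames are illustrative stand-ins for the study's calibrated camera data and MATLAB code.

        import numpy as np

        def iced_area_fraction(frame_temperature_c, ice_threshold_c=-1.0):
            """Fraction of pixels colder than the threshold, taken here as 'iced'."""
            return np.mean(frame_temperature_c < ice_threshold_c)

        def accretion_rate(times_s, area_fractions):
            """Slope of iced-area fraction versus time (fraction per second), by least squares."""
            slope, _ = np.polyfit(times_s, area_fractions, deg=1)
            return slope

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            times = np.arange(0.0, 300.0, 30.0)   # a 5-minute icing test, one frame every 30 s
            fractions = []
            for t in times:
                frame = 5.0 + rng.normal(0.0, 0.5, (64, 64))   # warm airfoil surface (deg C)
                n_iced = int(4096 * min(0.6, 0.002 * t))       # hypothetical growing iced region
                frame.ravel()[:n_iced] = -5.0                  # iced pixels read cold in the IR frame
                fractions.append(iced_area_fraction(frame))
            print(f"accretion rate ~ {accretion_rate(times, fractions):.5f} area fraction per second")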

  20. Decision Analysis and Its Application to the Frequency of Containment Integrated Leakage Rate Tests

    International Nuclear Information System (INIS)

    Apostolakis, George E.; Koser, John P.; Sato, Gaku

    2004-01-01

    For nuclear utilities to become competitive in a deregulated electricity market, costs must be reduced, safety must be maintained, and interested stakeholders must remain content with the decisions being made. One way to reduce costs is to reduce the frequency of preventive maintenance and testing. However, these changes must be weighed against their impact on safety and stakeholder relations. We present a methodology that allows the evaluation of decision options using a number of objectives that include safety, economics, and stakeholder relations. First, the candidate decision options are screened to make sure that they satisfy the relevant regulatory requirements. The remaining options are evaluated using multiattribute utility theory. The results of the formal analysis include a ranking of the options according to their desirability as well as the major reasons that explain this ranking. These results are submitted to a deliberative process in which the decision makers scrutinize the results to ensure that they are meaningful. During the deliberation, new decision options may be formulated based on the insights that the formal analysis provides, as happened in the case study of this paper. This case study deals with the reduction in frequency of the containment integrated leak rate test of a boiling water reactor

  1. 75 FR 1547 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2010

    Science.gov (United States)

    2010-01-12

    ...-11213, Notice No. 13] RIN 2130-AA81 Alcohol and Drug Testing: Determination of Minimum Random Testing... percent for alcohol. Because the industry-wide random drug testing positive rate has remained below 1.0... effective upon publication. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  2. Winding up the molecular clock in the genus Carabus (Coleoptera: Carabidae: assessment of methodological decisions on rate and node age estimation

    Directory of Open Access Journals (Sweden)

    Andújar Carmelo

    2012-03-01

    Full Text Available Abstract Background Rates of molecular evolution are known to vary across taxa and among genes, and this requires rate calibration for each specific dataset based on external information. Calibration is sensitive to evolutionary model parameters, partitioning schemes and clock model. However, the way in which these and other analytical aspects affect both the rates and the resulting clade ages from calibrated phylogenies is not yet well understood. To investigate these aspects we have conducted calibration analyses for the genus Carabus (Coleoptera, Carabidae) on five mitochondrial and four nuclear DNA fragments with 7888 nt total length, testing different clock models and partitioning schemes to select the most suitable using Bayes Factors comparisons. Results We used these data to investigate the effect of ambiguous character and outgroup inclusion on both the rates of molecular evolution and the TMRCA of Carabus. We found considerable variation in rates of molecular evolution depending on the fragment studied (ranging from 5.02% in cob to 0.26% divergence/My in LSU-A), but also on analytical conditions. Alternative choices of clock model, partitioning scheme, treatment of ambiguous characters, and outgroup inclusion resulted in rate increments ranging from 28% (HUWE1) to 1000% (LSU-B and ITS2) and increments in the TMRCA of Carabus ranging from 8.4% (cox1-A) to 540% (ITS2). Results support an origin of the genus Carabus during the Oligocene in the Eurasian continent followed by a Miocene differentiation that originated all main extant lineages. Conclusions The combination of several genes is proposed as the best strategy to minimise both the idiosyncratic behaviors of individual markers and the effect of analytical aspects in rate and age estimations. Our results highlight the importance of estimating rates of molecular evolution for each specific dataset, selecting for optimal clock and partitioning models as well as other methodological issues.

  3. Acceptance test procedure for the 105-KW isolation barrier leak rate

    International Nuclear Information System (INIS)

    McCracken, K.J.

    1995-01-01

    This acceptance test procedure shall be used to: first, establish a basin water loss rate prior to installation of the two isolation barriers between the main basin and the discharge chute in K-Basin West; and second, perform an acceptance test to verify an acceptable leakage rate through the barrier seals. This Acceptance Test Procedure (ATP) has been prepared in accordance with CM-6-1 EP 4.2, Standard Engineering Practices.

  4. In-core flow rate distribution measurement test of the JOYO irradiation core

    International Nuclear Information System (INIS)

    Suzuki, Toshihiro; Isozaki, Kazunori; Suzuki, Soju

    1996-01-01

    A flow rate distribution measurement test was carried out for the JOYO irradiation core (the MK-II core) after the 29th duty-cycle operation. The main objective of the test was to confirm a proper flow rate distribution in the final phase of the MK-II core. The flow rate at the outlet of each subassembly was measured by a permanent-magnet flowmeter inserted through the fuel exchange hole in the rotating plug. This is the third such test in the MK-II core, performed 10 years after the previous test (1985). A total of 550 subassemblies were exchanged and the accumulated reactor operation time reached 38,000 hours since the previous test. In conclusion, it was confirmed that the flow rate distribution has remained suitable in the final phase of the MK-II core. (author)

  5. Inverse methods for the mechanical characterization of materials at high strain rates

    Directory of Open Access Journals (Sweden)

    Casas-Rodriguez J.P.

    2012-08-01

    Full Text Available Mechanical material characterization represents a research challenge. Furthermore, special attention is directed to material characterization at high strain rates, as the mechanical properties of some materials are influenced by the rate of loading. Diverse experimental techniques at high strain rates are available, such as the drop test, the Taylor impact test and the split Hopkinson pressure bar, among others. However, the determination of the material parameters associated with a given mathematical constitutive model from the experimental data is a complex and indirect problem. This paper presents a material characterization methodology to determine the material parameters of a given constitutive model from a given high strain rate experiment. The characterization methodology is based on an inverse technique in which an inverse problem is formulated and solved as an optimization procedure. The input of the optimization procedure is the characteristic signal from the high strain rate experiment. The output of the procedure is the optimum set of material parameters, determined by fitting a numerical simulation to the high strain rate experimental signal.
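
    The sketch below illustrates the inverse loop in Python under stated assumptions: a closed-form Johnson-Cook-type hardening law stands in for the forward simulation, a synthetic noisy stress-strain signal plays the role of the experiment, and scipy's least-squares optimizer recovers the parameters. The law, its parameter values and the noise level are illustrative choices, not the paper's model.

        import numpy as np
        from scipy.optimize import least_squares

        def flow_stress(params, strain, strain_rate, ref_rate=1.0):
            """Johnson-Cook-type flow stress (MPa) without thermal softening."""
            A, B, n, C = params
            return (A + B * strain**n) * (1.0 + C * np.log(strain_rate / ref_rate))

        strain = np.linspace(0.01, 0.3, 50)
        strain_rate = 1.0e3                         # s^-1, a typical Hopkinson-bar order of magnitude
        true_params = (350.0, 600.0, 0.35, 0.02)    # hypothetical A (MPa), B (MPa), n, C
        rng = np.random.default_rng(1)
        measured = flow_stress(true_params, strain, strain_rate) + rng.normal(0.0, 5.0, strain.size)

        def residuals(params):
            # Difference between the forward model and the "experimental" signal.
            return flow_stress(params, strain, strain_rate) - measured

        fit = least_squares(residuals, x0=(200.0, 400.0, 0.5, 0.01),
                            bounds=([0.0, 0.0, 0.01, 0.0], [1.0e3, 2.0e3, 1.0, 0.2]))
        print("identified A, B, n, C:", np.round(fit.x, 3))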

  6. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
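
    A minimal Python sketch of the conjugate update behind such a procedure is given below: a gamma prior on the CCF rate is updated with the effective number of events and effective observation time obtained from the impact-vector assessment, and the posterior mean rate is converted into a time-average basic-event probability for a chosen test interval. The prior, the evidence and the lambda*tau/2 standby-unavailability approximation are illustrative assumptions rather than the paper's recommended values.

        def posterior_ccf_rate(prior_alpha, prior_beta_h, effective_events, effective_time_h):
            """Posterior gamma parameters and mean rate (1/h) after observing the evidence."""
            alpha = prior_alpha + effective_events
            beta = prior_beta_h + effective_time_h
            return alpha, beta, alpha / beta

        def mean_unavailability(rate_per_h, test_interval_h):
            """Time-average unavailability of a standby CCF event, q ~ lambda * tau / 2."""
            return 0.5 * rate_per_h * test_interval_h

        if __name__ == "__main__":
            # Hypothetical generic prior and plant-specific evidence: 2.4 effective
            # double-failure events in 1.2e6 component-hours of observation.
            alpha, beta, rate = posterior_ccf_rate(0.5, 1.0e5, 2.4, 1.2e6)
            print(f"posterior mean CCF rate = {rate:.2e} /h")
            print(f"basic-event probability (quarterly test, 2190 h) = {mean_unavailability(rate, 2190.0):.2e}")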

  7. Heart Rate Measures of Flight Test and Evaluation

    National Research Council Canada - National Science Library

    Bonner, Malcolm A; Wilson, Glenn F

    2001-01-01

    .... Because flying is a complex task, several measures are required to derive the best evaluation. This article describes the use of heart rate to augment the typical performance and subjective measures used in test and evaluation...

  8. Sulfonylurea herbicides – methodological challenges in setting aquatic limit values

    DEFF Research Database (Denmark)

    Rosenkrantz, Rikke Tjørnhøj; Baun, Anders; Kusk, Kresten Ole

    according to the EU Water Framework Directive, the resulting Water Quality Standards (WQSs) are below the analytical quantification limit, making it difficult to verify compliance with the limit values. However, several methodological concerns may be raised in relation to the very low effect concentrations...... and rimsulfuron. The following parameters were varied during testing: pH, exposure duration, temperature and light/dark cycle. Preliminary results show that a decrease in pH causes an increase in toxicity for all compounds. Exposure to a high concentration for 24 hours caused a reduction in growth rate, from...... for setting limit values for SUs or if more detailed information should be gained by taking methodological considerations into account....

  9. Open-Rate Controlled Experiment in E-Mail Marketing Campaigns

    Directory of Open Access Journals (Sweden)

    Antun Biloš

    2016-06-01

    Full Text Available Purpose – The main purpose of this paper is to test the controlled experiment (A/B split) methodology in B2C oriented e-mail marketing campaigns. Design/Methodology/Approach – E-mail marketing techniques have been a substantial part of e-marketing methodology since the early Internet days of the mid-1990s. From the very beginning of Internet utilization for business purposes, e-mail was one of the most widely used communication techniques in B2B and B2C markets alike. Due to high volumes of spamming and progression of online communication clutter, some practitioners began to question the usability of e-mail as a marketing communication channel, while others embarked on working on improving the message itself. Efforts were invested into improving message quality, as well as into better understanding user expectations. One of the most commonly used techniques to test specific e-mail message elements is the controlled experiment. Findings and implications – This paper explores several types of controlled experiments in a specific Croatian B2C market. Tests were run to determine subscriber behavior towards several newsletter components, including sending time, sending day, sender’s name, and subject line. Open and click rates for the tested campaigns, as well as several other metrics, were investigated using MailChimp software. An N − 1 two-proportion test using an adjusted Wald confidence interval around the difference in the proportions was used for comparing the open-rate measure in the controlled experiments between subjects. Limitation – Controlled experiments (A/B split tests) showed a lot of potential as a way of measuring behavior and preferences of subscribers, although several apparent limitations (the data-set scope, comparability issues) indicated a clear need for standardization on a managerial and scientific level. Originality – This paper provides an up-to-date e-mail marketing effectiveness literature review, describes and tests the
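
    For reference, the two statistics named above can be computed in a few lines of Python: the N−1 two-proportion z statistic for the difference in open rates, and an Agresti-Caffo style adjusted-Wald confidence interval around that difference. The campaign counts in the example are invented for illustration.

        import math

        def n_minus_1_two_proportion_z(opens_a, sent_a, opens_b, sent_b):
            """z statistic for the difference in open rates, with the (N-1)/N correction."""
            n = sent_a + sent_b
            p_pool = (opens_a + opens_b) / n
            se = math.sqrt(p_pool * (1.0 - p_pool) * (1.0 / sent_a + 1.0 / sent_b))
            return (opens_a / sent_a - opens_b / sent_b) / se * math.sqrt((n - 1.0) / n)

        def adjusted_wald_ci(opens_a, sent_a, opens_b, sent_b, z=1.96):
            """Agresti-Caffo interval: add one success and one failure to each group."""
            p1 = (opens_a + 1.0) / (sent_a + 2.0)
            p2 = (opens_b + 1.0) / (sent_b + 2.0)
            se = math.sqrt(p1 * (1.0 - p1) / (sent_a + 2.0) + p2 * (1.0 - p2) / (sent_b + 2.0))
            diff = p1 - p2
            return diff - z * se, diff + z * se

        if __name__ == "__main__":
            # Variant A vs. variant B of a subject line (hypothetical send and open counts).
            print("z =", round(n_minus_1_two_proportion_z(412, 2000, 348, 2000), 2))
            print("95% CI for open-rate difference:", adjusted_wald_ci(412, 2000, 348, 2000))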

  10. Standard test method for measurement of fatigue crack growth rates

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2015-01-01

    1.1 This test method covers the determination of fatigue crack growth rates from near-threshold to Kmax controlled instability. Results are expressed in terms of the crack-tip stress-intensity factor range (ΔK), defined by the theory of linear elasticity. 1.2 Several different test procedures are provided, the optimum test procedure being primarily dependent on the magnitude of the fatigue crack growth rate to be measured. 1.3 Materials that can be tested by this test method are not limited by thickness or by strength so long as specimens are of sufficient thickness to preclude buckling and of sufficient planar size to remain predominantly elastic during testing. 1.4 A range of specimen sizes with proportional planar dimensions is provided, but size is variable to be adjusted for yield strength and applied force. Specimen thickness may be varied independent of planar size. 1.5 The details of the various specimens and test configurations are shown in Annex A1-Annex A3. Specimen configurations other than t...
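
    The data reduction behind such a test can be sketched as follows in Python, under simplifying assumptions: da/dN is obtained from crack-length-versus-cycles data by the secant (point-to-point) method, each rate is paired with a stress-intensity factor range from the elementary center-crack expression ΔK = Δσ√(πa) rather than the standard's specimen-specific K calibrations, and the Paris-law constants are fitted in log-log coordinates. The input record is synthetic.

        import numpy as np

        def secant_rates(cycles, crack_length_m):
            """Secant (point-to-point) da/dN and the mid-interval crack lengths."""
            da_dn = np.diff(crack_length_m) / np.diff(cycles)
            a_mid = 0.5 * (crack_length_m[1:] + crack_length_m[:-1])
            return a_mid, da_dn

        def delta_k(delta_stress_mpa, a_m):
            """Stress-intensity factor range for a center crack: DeltaK = Dsigma * sqrt(pi * a)."""
            return delta_stress_mpa * np.sqrt(np.pi * a_m)

        if __name__ == "__main__":
            # Synthetic test record: crack grows from 10 mm to about 25 mm over 200k cycles.
            cycles = np.linspace(0.0, 2.0e5, 30)
            a = 0.010 * np.exp(cycles / 2.2e5)           # hypothetical crack-length history (m)
            a_mid, rates = secant_rates(cycles, a)
            dk = delta_k(80.0, a_mid)                    # constant applied stress range of 80 MPa

            # Paris law: da/dN = C * (DeltaK)^m  ->  linear in log-log coordinates.
            m, log_c = np.polyfit(np.log(dk), np.log(rates), deg=1)
            print(f"Paris exponent m = {m:.2f}, coefficient C = {np.exp(log_c):.3e}")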

  11. High Strain Rate Testing of Welded DOP-26 Iridium

    Energy Technology Data Exchange (ETDEWEB)

    Schneibel, J. H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, R. G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carmichael, C. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fox, E. E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ulrich, G. B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); George, E. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The iridium alloy DOP-26 is used to produce Clad Vent Set cups that protect the radioactive fuel in radioisotope thermoelectric generators (RTGs) which provide electric power for spacecraft and rovers. In a previous study, the tensile properties of DOP-26 were measured over a wide range of strain rates and temperatures and reported in ORNL/TM-2007/81. While that study established the properties of the base material, the fabrication of the heat sources requires welding, and the mechanical properties of welded DOP-26 have not been extensively characterized in the past. Therefore, this study was undertaken to determine the mechanical properties of DOP-26 specimens containing a transverse weld in the center of their gage sections. Tensile tests were performed at room temperature, 750, 900, and 1090°C and engineering strain rates of 1×10⁻³ and 10 s⁻¹. Room temperature testing was performed in air, while testing at elevated temperatures was performed in a vacuum better than 1×10⁻⁴ Torr. The welded specimens had a significantly higher yield stress, by up to a factor of ~2, than the non-welded base material. The yield stress did not depend on the strain rate except at 1090°C, where it was slightly higher for the faster strain rate. The ultimate tensile stress, on the other hand, was significantly higher for the faster strain rate at temperatures of 750°C and above. At 750°C and above, the specimens deformed at 1×10⁻³ s⁻¹ showed pronounced necking resulting sometimes in perfect chisel-edge fracture. The specimens deformed at 10 s⁻¹ exhibited this fracture behavior only at the highest test temperature, 1090°C. Fracture occurred usually in the fusion zone of the weld and was, in most cases, primarily intergranular.

  12. Methodological aspects of crossover and maximum fat-oxidation rate point determination.

    Science.gov (United States)

    Michallet, A-S; Tonini, J; Regnier, J; Guinot, M; Favre-Juvin, A; Bricout, V; Halimi, S; Wuyam, B; Flore, P

    2008-11-01

    Indirect calorimetry during exercise provides two metabolic indices of substrate oxidation balance: the crossover point (COP) and maximum fat oxidation rate (LIPOXmax). We aimed to study the effects of the analytical device, protocol type and ventilatory response on variability of these indices, and the relationship with lactate and ventilation thresholds. After maximum exercise testing, 14 relatively fit subjects (aged 32+/-10 years; nine men, five women) performed three submaximum graded tests: one was based on a theoretical maximum power (tMAP) reference; and two were based on the true maximum aerobic power (MAP). Gas exchange was measured concomitantly using a Douglas bag (D) and an ergospirometer (E). All metabolic indices were interpretable only when obtained by the D reference method and MAP protocol. Bland and Altman analysis showed overestimation of both indices with E versus D. Despite no mean differences between COP and LIPOXmax whether tMAP or MAP was used, the individual data clearly showed disagreement between the two protocols. Ventilation explained 10-16% of the metabolic index variations. COP was correlated with ventilation (r=0.96, P<0.01) and the rate of increase in blood lactate (r=0.79, P<0.01), and LIPOXmax correlated with the ventilation threshold (r=0.95, P<0.01). This study shows that, in fit healthy subjects, the analytical device, reference used to build the protocol and ventilation responses affect metabolic indices. In this population, and particularly to obtain interpretable metabolic indices, we recommend a protocol based on the true MAP or one adapted to include the transition from fat to carbohydrate. The correlation between metabolic indices and lactate/ventilation thresholds suggests that shorter, classical maximum progressive exercise testing may be an alternative means of estimating these indices in relatively fit subjects. However, this needs to be confirmed in patients who have metabolic defects.

  13. Optical Methods For Automatic Rating Of Engine Test Components

    Science.gov (United States)

    Pritchard, James R.; Moss, Brian C.

    1989-03-01

    In recent years, increasing commercial and legislative pressure on automotive engine manufacturers, including increased oil drain intervals, cleaner exhaust emissions and high specific power outputs, has led to increasing demands on lubricating oil performance. Lubricant performance is defined by bench engine tests run under closely controlled conditions. After the test, engines are dismantled and the parts rated for wear and accumulation of deposit. This rating must be carried out consistently in laboratories throughout the world in order to ensure lubricant quality meeting the specified standards. To this end, rating technicians evaluate components following closely defined procedures. This process is time consuming, inaccurate and subject to drift, requiring regular recalibration of raters by means of international rating workshops. This paper describes two instruments for automatic rating of engine parts. The first uses a laser to determine the degree of polishing of the engine cylinder bore caused by the reciprocating action of the piston. This instrument has been developed to the prototype stage by the NDT Centre at Harwell under contract to Exxon Chemical, and is planned for production within the next twelve months. The second instrument uses red and green filtered light to determine the type, quality and position of deposit formed on the piston surfaces. The latter device has undergone a feasibility study, but no prototype exists.

  14. Errors of car wheels rotation rate measurement using roller follower on test benches

    Science.gov (United States)

    Potapov, A. S.; Svirbutovich, O. A.; Krivtsov, S. N.

    2018-03-01

    The article deals with wheel rotation rate measurement errors on roller test benches, which depend on the speed of the motor vehicle being tested. Monitoring of vehicle performance under operating conditions is performed on roller test benches. Roller test benches are not flawless: they have drawbacks that affect the accuracy of vehicle performance monitoring. An increase in the base velocity of the vehicle requires an increase in the accuracy of wheel rotation rate monitoring, which determines how accurately the operating mode of a wheel of the tested vehicle can be identified. Ensuring measurement accuracy for the rotation velocity of the rollers themselves is not an issue; the problem arises when measuring the rotation velocity of a car wheel. The higher the rotation velocity of the wheel, the lower the accuracy of measurement. At present, wheel rotation frequency on roller test benches is monitored by follower systems whose sensors are rollers that follow wheel rotation. The rollers of this system are not kinematically linked to the supporting rollers of the test bench; the roller follower is pressed against the wheels of the tested vehicle by a spring-lever mechanism. Experience with this test bench equipment has shown that measurement accuracy is satisfactory at the low speeds at which vehicles are usually diagnosed on roller test benches. With rising diagnostic speed, rotation velocity measurement errors occur in both braking and pulling modes because the follower roller slips on the tire tread. The paper shows oscillograms of changes in wheel rotation velocity and of the rotation velocity measurement system's signals when testing a vehicle on roller test benches at specified speeds.

  15. Interest Rate Risk Management using Duration Gap Methodology

    Directory of Open Access Journals (Sweden)

    Dan Armeanu

    2008-01-01

    Full Text Available The world of financial institutions has changed during the last 20 years and become riskier and more competition-driven. After the deregulation of the financial market, banks had to take on extensive risk in order to earn sufficient returns. Interest rate volatility has increased dramatically over the past twenty-five years, so efficient management of this interest rate risk is strongly required. In recent years banks have developed a variety of methods for measuring and managing interest rate risk. Of these, the most frequently used in real banking life and recommended by the Basel Committee are based on the Repricing (Funding Gap) Model, the Maturity Gap Model, the Duration Gap Model, and static and dynamic simulation. The purpose of this article is to give a good understanding of the duration gap model used for managing interest rate risk. The article starts with an overview of interest rate risk and explains how this type of risk should be measured and managed within asset-liability management. It then takes a short look at methods for measuring interest rate risk and, after that, explains and demonstrates how the Duration Gap Model can be used for managing interest rate risk in banks.
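
    The arithmetic at the heart of the duration gap model is compact enough to show directly; the Python sketch below computes the duration gap, DGAP = D_A − (L/A)·D_L, and the approximate change in the economic value of equity for a parallel rate shock, ΔE ≈ −DGAP·A·Δr/(1+r). The balance-sheet figures and durations are illustrative.

        def duration_gap(duration_assets_yr, duration_liabilities_yr, assets, liabilities):
            """Duration gap in years: D_A - (L/A) * D_L."""
            return duration_assets_yr - (liabilities / assets) * duration_liabilities_yr

        def equity_value_change(dgap_yr, assets, rate_shock, current_rate):
            """Approximate change in economic value of equity for a parallel rate shift."""
            return -dgap_yr * assets * rate_shock / (1.0 + current_rate)

        if __name__ == "__main__":
            A, L = 1000.0, 900.0                                 # assets and liabilities, in millions (hypothetical)
            dgap = duration_gap(4.5, 2.8, A, L)                  # years
            d_equity = equity_value_change(dgap, A, 0.01, 0.06)  # +100 bp parallel shift at 6% rates
            print(f"DGAP = {dgap:.2f} years, dE ~ {d_equity:.1f} million for a +100 bp shock")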

  16. A Methodology for Evaluation of Inservice Test Intervals for Pumps and Motor-Operated Valves

    International Nuclear Information System (INIS)

    Cox, D.F.; Haynes, H.D.; McElhaney, K.L.; Otaduy, P.J.; Staunton, R.H.; Vesely, W.E.

    1999-01-01

    Recent nuclear industry reevaluation of component inservice testing (IST) requirements is resulting in requests for IST interval extensions and changes to traditional IST programs. To evaluate these requests, long-term component performance and the methods for mitigating degradation need to be understood. Determining the appropriate IST intervals, along with component testing, monitoring, trending, and maintenance effects, has become necessary. This study provides guidelines to support the evaluation of IST intervals for pumps and motor-operated valves (MOVs). It presents specific engineering information pertinent to the performance and monitoring/testing of pumps and MOVs, provides an analytical methodology for assessing the bounding effects of aging on component margin behavior, and identifies basic elements of an overall program to help ensure component operability. Guidance for assessing probabilistic methods and the risk importance and safety consequences of the performance of pumps and MOVs has not been specifically included within the scope of this report, but these elements may be included in licensee change requests

  17. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    International Nuclear Information System (INIS)

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods, provided in Department of Energy guidelines, and failure criteria, contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical shell specimens, where stresses at structural discontinuities lead to cracking, (2) welded structures, where metallurgical discontinuities play a key role in failures, and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparisons between predicted and measured inelastic responses are generally reasonably good; quantities are sometimes somewhat overpredicted and sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always as closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins.

  18. Improvement in post test accident analysis results prediction for the test no. 2 in PSB test facility by applying UMAE methodology

    International Nuclear Information System (INIS)

    Dubey, S.K.; Petruzzi, A.; Giannotti, W.; D'Auria, F.

    2006-01-01

    This paper mainly deals with the improvement in post-test accident analysis predictions for test no. 2, 'Total loss of feed water with failure of HPIS pumps and operator actions on primary and secondary circuit depressurization', carried out on the PSB integral test facility in May 2005. This is one of the most complicated tests conducted in the PSB test facility. The prime objective of the test is to provide support for the verification of accident management strategies for NPPs and also to verify the correct behavior of some safety systems that operate only during accidents. The objective of this analysis is to assess the capability to reproduce the phenomena occurring during the selected test and to quantify, qualitatively and quantitatively, the accuracy of the calculation performed with the best-estimate code Relap5/mod3.3 by systematically applying all the procedures of the Uncertainty Methodology based on Accuracy Extrapolation (UMAE), developed at the University of Pisa. In order to achieve these objectives, qualification of the test facility nodalisation at both the 'steady state level' and the 'on transient level' is demonstrated. For the 'steady state level' qualification, compliance with the acceptance criteria established in UMAE has been checked for geometrical details and thermal-hydraulic parameters. The following steps have been performed for the qualitative qualification at the 'on transient level': visual comparisons between experimental and calculated time trends of relevant parameters; comparison between the experimental and calculated time sequences of significant events; identification/verification of the CSNI phenomena validation matrix; and use of Phenomenological Windows (PhW) with identification of Key Phenomena and Relevant Thermal-hydraulic Aspects (RTA). A successful application of the qualitative process constitutes a prerequisite to the application of the quantitative analysis. For the quantitative accuracy of the code prediction, the Fast Fourier Transform Based
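
    For orientation, the figure of merit typically used in this quantitative step, the FFT-based average amplitude AA = Σ|FFT(calc − exp)| / Σ|FFT(exp)|, can be sketched in Python as below. The two exponential pressure traces are synthetic stand-ins for a measured and a calculated signal, and the sketch omits the weighting across parameters and the acceptability thresholds used in the full FFT-based procedure.

        import numpy as np

        def average_amplitude(experimental, calculated):
            """FFT-based average amplitude of the code-calculation error for one signal."""
            error_spectrum = np.abs(np.fft.rfft(calculated - experimental))
            exp_spectrum = np.abs(np.fft.rfft(experimental))
            return error_spectrum.sum() / exp_spectrum.sum()

        if __name__ == "__main__":
            t = np.linspace(0.0, 4000.0, 2048)                # transient time (s)
            exp_pressure = 7.0 * np.exp(-t / 1500.0) + 0.80   # MPa, synthetic "measurement"
            calc_pressure = 7.1 * np.exp(-t / 1400.0) + 0.78  # MPa, synthetic "calculation"
            print(f"AA = {average_amplitude(exp_pressure, calc_pressure):.3f}")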

  19. A review on leakage rate tests for containment isolation systems

    International Nuclear Information System (INIS)

    Kim, In Goo; Kim, Hho Jung

    1992-01-01

    Wide experience in operating containment isolation systems has been accumulated in Korea since 1978. Hence, it has become necessary to review the operating data in order to confirm the integrity of containments with about 50 reactor-years of experience and to establish the future direction of the containment test program. The objectives of the present work are to collect, consolidate and assess the leakage rate data, and then to find out the dominant leakage paths and the factors affecting the integrated leakage rate test. General trends of overall leakage show that more careful surveillance during pre-operational tests can reduce the containment leakage. Dominant leakage paths are found to be through air locks and large-sized valves, such as butterfly valves of purge lines, so that weighted surveillance and inspection of these dominant leakage paths can considerably reduce the containment leakage. Atmosphere stabilization is found to be the most important factor in obtaining reliable results. In order to obtain a well-stabilized atmosphere, the temperature and flow rate of the compressed air should be kept constant, and it is preferable not to operate the fan coolers while pressurizing the containment for the test.

  20. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
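
    One simple way such a combination can be expressed, shown in the Python sketch below, is a beta-binomial model: the technical justification is encoded as a Beta prior on the probability of detection (POD), and the successes and failures from the practical trials update it. The prior weights and trial counts are illustrative assumptions, not figures from the pilot study or the referenced quantification model.

        from scipy.stats import beta

        def posterior_pod(prior_alpha, prior_beta, detected, missed):
            """Posterior Beta distribution of the POD after the practical trials."""
            return beta(prior_alpha + detected, prior_beta + missed)

        if __name__ == "__main__":
            # Technical justification judged equivalent to ~18 successes in 20 "virtual" trials,
            # then 28 of 30 defects found in the practical trials (all counts hypothetical).
            post = posterior_pod(18.0, 2.0, 28, 2)
            print(f"posterior mean POD = {post.mean():.3f}")
            print(f"P(POD > 0.80)      = {post.sf(0.80):.3f}")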

  1. Hardware test program for evaluation of baseline range/range rate sensor concept

    Science.gov (United States)

    Pernic, E.

    1985-01-01

    The test program Phase II effort provides additional design information on range and range rate (R/R) sensor performance when observing and tracking a typical spacecraft target. The target used in the test program was a one-third scale model of the Hubble Space Telescope (HST) available at the MSFC test site where the tests were performed. A modified Bendix millimeter-wave radar served as the R/R sensor test bed for evaluation of range and range rate tracking performance and generation of radar signature characteristics of the spacecraft target. A summary of program test results and conclusions is presented, along with a detailed description of the Bendix test-bed radar and accompanying instrumentation. The MSFC test site and facilities are described. The test procedures used to establish background levels, and the calibration procedures used in the range accuracy tests and RCS (radar cross section) signature measurements, are presented, and a condensed version of the daily log kept during the 5 September through 17 September test period is also presented. The test program results are given starting with the RCS signature measurements, then continuing with the range measurement accuracy test results, and finally the range and range rate tracking accuracy test results.

  2. A cointegration approach to forecasting freight rates in the dry bulk shipping sector

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); A.W. Veenstra (Albert)

    1997-01-01

    textabstractIn this paper, a vector autoregressive model is developed for a sample of ocean dry bulk freight rates. Although the series of freight rates are themselves found to be non-stationary, thus precluding the use of many modelling methodologies, evidence provided by cointegration tests points

  3. Convective Heat Transfer Scaling of Ignition Delay and Burning Rate with Heat Flux and Stretch Rate in the Equivalent Low Stretch Apparatus

    Science.gov (United States)

    Olson, Sandra

    2011-01-01

    To better evaluate the buoyant contributions to the convective cooling (or heating) inherent in normal-gravity material flammability test methods, we derive a convective heat transfer correlation that can be used to account for the forced convective stretch effects on the net radiant heat flux for both ignition delay time and burning rate. The Equivalent Low Stretch Apparatus (ELSA) uses an inverted cone heater to minimize buoyant effects while at the same time providing a forced stagnation flow on the sample, which ignites and burns as a ceiling fire. Ignition delay and burning rate data are correlated with incident heat flux and convective heat transfer and compared to results from other test methods and fuel geometries, using similarity to determine the equivalent stretch rates and thus convective cooling (or heating) rates for those geometries. With this correlation methodology, buoyant effects inherent in normal-gravity material flammability test methods can be estimated, to better apply the test results to the low stretch environments relevant to spacecraft material selection.

  4. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    Science.gov (United States)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully, and more importantly systematically, address the contribution of Human and Organizational Factors (HOFs) to accident causation. Meanwhile, the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method used to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT conducted by its crew is discussed. The risk analysis methodology consists of three different approaches whose integration constitutes the overall framework. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors to negative pressure test misinterpretation.

  5. Erythrocyte Sedimentation Rate (ESR): MedlinePlus Lab Test Information

    Science.gov (United States)

    ... K. Brunner & Suddarth's Handbook of Laboratory and Diagnostic Tests. 2nd Ed, Kindle. Philadelphia: Wolters Kluwer Health, Lippincott Williams & Wilkins; c2014. Erythrocyte Sedimentation Rate (ESR); p. 267– ...

  6. Methodology for featuring and assessing extreme climatic events

    International Nuclear Information System (INIS)

    Malleron, N.; Bernardara, P.; Benoit, M.; Parey, S.; Perret, C.

    2013-01-01

    The setting up of a nuclear power plant on a particular site requires the assessment of risks linked to extreme natural events like flooding or earthquakes. As a consequence of the Fukushima accident, EDF proposes to take into account even rarer events in order to improve the robustness of the facility over its whole operating life. This article presents the methodology used by EDF to analyse a data set statistically in order to extract extreme values. This analysis is based on the theory of extreme values and is applied to extreme values of the flow rate in the case of a river overflowing. The methodology consists of 6 steps: 1) selection of the event, of its featuring parameter and of its probability; for instance, what is the flood flow rate that has an annual probability of 10⁻³ of occurring, 2) collection of data over a long period of time (or recovery of data from past periods), 3) extraction of extreme values from the data, 4) identification of an adequate statistical law that fits the spread of the extreme values, 5) validation of the selected statistical law through visual or statistical tests, and 6) computation of the flow rate of the event itself. (A.C.)
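
    Steps 3 to 6 of this methodology can be sketched as follows: annual maxima are fitted with an extreme-value law and the rare quantile is read off the fitted distribution. The synthetic flow record, the choice of the GEV family and the 10⁻³ annual probability are illustrative assumptions only.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Hypothetical record of annual maximum river flow rates (m3/s), steps 2-3.
annual_maxima = rng.gumbel(loc=1500.0, scale=400.0, size=60)

# Step 4: fit a Generalized Extreme Value distribution to the annual maxima.
shape, loc, scale = genextreme.fit(annual_maxima)

# Step 6: flow rate with an annual exceedance probability of 1e-3
# (the 0.999 quantile of the fitted law).
design_flow = genextreme.ppf(1.0 - 1e-3, shape, loc=loc, scale=scale)
print(f"Estimated 1-in-1000-year flow: {design_flow:.0f} m3/s")
```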

  7. SLUDGE BATCH 4 BASELINE MELT RATE FURNACE AND SLURRY-FED MELT RATE FURNACE TESTS WITH FRITS 418 AND 510 (U)

    International Nuclear Information System (INIS)

    Smith, M.; Jones, T.; Miller, D.

    2007-01-01

    Several Slurry-Fed Melt Rate Furnace (SMRF) tests with earlier projections of the Sludge Batch 4 (SB4) composition have been performed.1,2 The first SB4 SMRF test used Frits 418 and 320; however, it was found after the test that the REDuction/OXidation (REDOX) correlation at that time did not have the proper oxidation state for manganese. Because the manganese level in the SB4 sludge was higher than in previous sludge batches tested, the impact of the higher manganese oxidation state was greater. The glasses were highly oxidized and very foamy, and therefore the results were inconclusive. After resolving this REDOX issue, Frits 418, 425, and 503 were tested in the SMRF with the updated baseline SB4 projection. Based on dry-fed Melt Rate Furnace (MRF) tests and the above-mentioned SMRF tests, two previous frit recommendations were made by the Savannah River National Laboratory (SRNL) for processing of SB4 in the Defense Waste Processing Facility (DWPF). The first was Frit 503, based on the June 2006 composition projections.3 The recommendation was changed to Frit 418 as a result of the October 2006 composition projections (after the Tank 40 decant was implemented as part of the preparation plan). However, the start of SB4 processing was delayed due to the control room consolidation outage and the repair of the valve box in the Tank 51 to Tank 40 transfer line. These delays resulted in changes to the projected SB4 composition. Due to the slight change in composition and based on preliminary dry-fed MRF testing, SRNL believed that Frit 510 would increase throughput in processing SB4 in DWPF. Frit 418, which was used in processing Sludge Batch 3 (SB3), was a viable candidate and available in DWPF. Therefore, it was used during the initial SB4 processing. Due to the potential for higher melt rates with Frit 510, SMRF tests with the latest SB4 composition (1298 canisters) and Frits 510 and 418 were performed at a targeted waste loading (WL) of 35%. The '1298 canisters

  8. Temporal Patterns in Sheep Fetal Heart Rate Variability Correlate to Systemic Cytokine Inflammatory Response: A Methodological Exploration of Monitoring Potential Using Complex Signals Bioinformatics.

    Directory of Open Access Journals (Sweden)

    Christophe L Herry

    Full Text Available Fetal inflammation is associated with increased risk for postnatal organ injuries. No means of early detection exist. We hypothesized that systemic fetal inflammation leads to distinct alterations of fetal heart rate variability (fHRV). We tested this hypothesis deploying a novel series of approaches from complex signals bioinformatics. In chronically instrumented near-term fetal sheep, we induced an inflammatory response with lipopolysaccharide (LPS) injected intravenously (n = 10), observing it over 54 hours; seven additional fetuses served as controls. Fifty-one fHRV measures were determined continuously every 5 minutes using Continuous Individualized Multi-organ Variability Analysis (CIMVA). CIMVA creates an fHRV measures matrix across five signal-analytical domains, thus describing complementary properties of fHRV. We implemented, validated and tested methodology to obtain a subset of CIMVA fHRV measures that best matched the temporal profile of the inflammatory cytokine IL-6. In the LPS group, IL-6 peaked at 3 hours. For the LPS, but not the control group, a sharp increase in standardized difference in variability with respect to baseline levels was observed between 3 h and 6 h, abating to baseline levels thereafter, thus tracking closely the IL-6 inflammatory profile. We derived an fHRV inflammatory index (FII) consisting of 15 fHRV measures reflecting the fetal inflammatory response, with a prediction accuracy of 90%. Hierarchical clustering validated the selection of 14 out of 15 fHRV measures comprising the FII. We developed methodology to identify a distinctive subset of fHRV measures that tracks inflammation over time. The broader potential of this bioinformatics approach to detect physiological responses encoded in HRV measures is discussed.

  9. 78 FR 78275 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2014

    Science.gov (United States)

    2013-12-26

    ...-11213, Notice No. 17] Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2014... December 26, 2013. FOR FURTHER INFORMATION CONTACT: Jerry Powers, FRA Drug and Alcohol Program Manager, W38...-493-6313); or Sam Noe, FRA Drug and Alcohol Program Specialist, (telephone 615-719- 2951). Issued in...

  10. 21 CFR 864.6700 - Erythrocyte sedimentation rate test.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Erythrocyte sedimentation rate test. 864.6700 Section 864.6700 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Manual Hematology Devices § 864.6700 Erythrocyte...

  11. Reactor building integrity testing: A novel approach at Gentilly 2 - principles and methodology

    International Nuclear Information System (INIS)

    Collins, N.; Lafreniere, P.

    1991-01-01

    In 1987, Hydro-Quebec embarked on an ambitious development program to provide the Gentilly 2 nuclear power station with an effective, yet practical, reactor building Integrity Test. The Gentilly 2 Integrity Test employs an innovative approach based on the reference volume concept, identified as the Temperature Compensation Method (TCM) System. This configuration has been demonstrated at both high and low test pressure and has achieved extraordinary precision in the leak rate measurement. The Gentilly 2 design allows the Integrity Test to be performed at a nominal 3 kPa(g) test pressure during an 11-hour period with the reactor at full power. The reactor building Pressure Test, by comparison, is typically performed at high pressure (124 kPa(g)) in a 7-day window during an annual outage. The Integrity Test was developed with the goal of demonstrating containment availability. Specifically, it was intended to detect a leak or hole in the 'bottled-up' reactor building greater in magnitude than an equivalent pipe of 25 mm diameter. However, it is considered feasible that the high precision of the Gentilly 2 TCM System Integrity Test and a stable reactor building leak characteristic will constitute sufficient grounds for a reduction of the Pressure Test frequency. It is noted that only the TCM System has, to this date, allowed a relevant determination of the reactor building leak rate at a nominal test pressure of 3 kPa(g). Classical method tests at low pressure have led to inconclusive results due to their lack of precision.

  12. Performance-based improvement of the leakage rate test program for the reactor containment of HTTR. Adoption of revised test programs containing 'Type A, Type B and Type C tests'

    International Nuclear Information System (INIS)

    Kondo, Masaaki; Emori, Koichi; Sekita, Kenji; Furusawa, Takayuki; Hayakawa, Masato; Kozawa, Takayuki; Aono, Tetsuya; Kuroha, Misao; Ouchi, Hiroshi; Kimishima, Satoru

    2008-10-01

    The reactor containment of HTTR is periodically tested to confirm leak-tight integrity by conducting overall integrated leakage rate tests, so-called 'Type A tests,' in accordance with the standard testing method provided in Japan Electric Association Code (JEAC) 4203. Although a 'Type A test' is the basic means of measuring the whole leakage rate of a reactor containment, it requires considerable cost and time for its preparation, implementation and restoration. Therefore, in order to upgrade the maintenance technology of HTTR, the containment leakage rate test program for HTTR was revised by adopting efficient and economical alternatives, including 'Type B and Type C tests,' which are intended to measure leakage rates for containment penetrations and isolation valves, respectively. In JEAC4203-2004, the following requirements are specified for adopting an alternative program: an upward trend of the overall integrated leakage rate due to aging effects should not be recognized; a performance criterion should be established for the combined leakage rate, that is, the summation of local leakage rates evaluated by Type B and Type C tests and converted to a whole leakage rate; the criterion of the combined leakage rate should be satisfied as well as that of the overall integrated leakage rate; and a correlation between the overall integrated and combined leakage rates should be recognized. Considering the historical performance, policies for conforming to the foregoing requirements and for carrying out the revised test program were developed, which were accepted by the regulatory agency. This report presents an outline of the leakage rate tests for the reactor containment of HTTR, identifies practical issues of conventional Type A tests, and describes the conforming and implementing policies mentioned above. (author)

  13. Methodology for Life Testing of Refractory Metal / Sodium Heat Pipes

    International Nuclear Information System (INIS)

    Martin, James J.; Reid, Robert S.

    2006-01-01

    This work establishes an approach to generate carefully controlled data to find heat pipe operating life with material-fluid combinations capable of extended operation. To accomplish this goal, acceleration is required to compress 10 years of operational life into 3 years of laboratory testing through a combination of increased temperature and mass fluence. Specific test series have been identified, based on American Society for Testing and Materials (ASTM) specifications, to investigate long-term corrosion rates. The refractory metal selected for demonstration purposes is a molybdenum-44.5% rhenium alloy formed by powder metallurgy. The heat pipes each have an annular crescent wick formed by hot isostatic pressing of molybdenum-rhenium wire mesh. The heat pipes are filled by vacuum distillation with purity sampling of the completed assembly. Round-the-clock heat pipe tests with 6-month destructive and non-destructive inspection intervals are conducted to identify the onset and level of corrosion. Non-contact techniques are employed to provide power to the evaporator (radio frequency induction heating at 1 to 5 kW per heat pipe) and calorimetry at the condenser (static gas gap coupled water cooled calorimeter). The planned operating temperature range extends from 1123 to 1323 K. Accomplishments before project cancellation included successful development of the heat pipe wick fabrication technique, establishment of all engineering designs, baseline operational test requirements, and procurement/assembly of supporting test hardware systems. (authors)

  14. LHC-rate beam test of CMS pixel barrel modules

    International Nuclear Information System (INIS)

    Erdmann, W.; Hoermann, Ch.; Kotlinski, D.; Horisberger, R.; Kaestli, H. Chr.; Gabathuler, K.; Bertl, W.; Meier, B.; Langenegger, U.; Trueeb, P.; Rohe, T.

    2007-01-01

    Modules for the CMS pixel barrel detector have been operated in a high rate pion beam at PSI in order to verify, under LHC-like conditions, the final module design for production. The test beam provided charged particle rates up to 10⁸ cm⁻² s⁻¹ over the full module area. Bunch structure and randomized high trigger rates simulated realistic operation. A four-layer telescope made of single pixel readout chip assemblies provided the tracking needed for the determination of the modules' hit reconstruction efficiency. The performance of the modules has been shown to be adequate for the CMS pixel barrel.

  15. A Physics-Based Engineering Methodology for Calculating Soft Error Rates of Bulk CMOS and SiGe Heterojunction Bipolar Transistor Integrated Circuits

    Science.gov (United States)

    Fulkerson, David E.

    2010-02-01

    This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
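
    SPICE-level single-event methodologies of this kind commonly represent the ion strike as a current pulse injected at the struck node; a double-exponential pulse is a frequent choice. The sketch below is illustrative only; the collected charge and time constants are assumptions, not values from the paper.

```python
import numpy as np

def double_exponential_pulse(t, q_coll, tau_rise, tau_fall):
    """Ion-strike current pulse I(t) delivering total charge q_coll (C).

    I(t) = q_coll / (tau_fall - tau_rise) * (exp(-t/tau_fall) - exp(-t/tau_rise))
    """
    return q_coll / (tau_fall - tau_rise) * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

# Hypothetical parameters: 50 fC collected charge, 5 ps rise, 150 ps fall time constants.
t = np.linspace(0.0, 1e-9, 2001)                 # 0 to 1 ns
i_strike = double_exponential_pulse(t, 50e-15, 5e-12, 150e-12)

# Sanity check: the integrated current should return the collected charge.
print(f"Peak current: {i_strike.max() * 1e3:.2f} mA")
print(f"Integrated charge: {np.trapz(i_strike, t) * 1e15:.1f} fC")
```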

  16. Dynamic Brazilian Test of Rock Under Intermediate Strain Rate: Pendulum Hammer-Driven SHPB Test and Numerical Simulation

    Science.gov (United States)

    Zhu, W. C.; Niu, L. L.; Li, S. H.; Xu, Z. H.

    2015-09-01

    The tensile strength of rock subjected to dynamic loading is relevant to many engineering applications such as rock drilling and blasting. The dynamic Brazilian test of rock specimens was conducted with a split Hopkinson pressure bar (SHPB) driven by a pendulum hammer, in order to determine the indirect tensile strength of rock under intermediate strain rates ranging from 5.2 to 12.9 s⁻¹, which are achieved when the incident bar is impacted by the pendulum hammer at different velocities. The incident wave excited by the pendulum hammer is triangular in shape, featuring a long rising time, which is considered helpful for achieving a constant strain rate in the rock specimen. The dynamic indirect tensile strength of rock increases with strain rate. The numerical simulator RFPA-Dynamics, a well-recognized code for simulating rock failure under dynamic loading, is then validated by reproducing the Brazilian test when the incident stress wave retrieved at the incident bar is input as the boundary condition, and it is subsequently employed to study the Brazilian test of rock under higher strain rates. Based on the numerical simulation, the strain-rate dependency of the tensile strength and the failure pattern of the Brazilian disc specimen under intermediate strain rates are numerically simulated, and the associated failure mechanism is clarified. Material heterogeneity is deemed to be a reason for the strain-rate dependency of rock.

  17. [*C]octanoic acid breath test to measure gastric emptying rate of solids.

    Science.gov (United States)

    Maes, B D; Ghoos, Y F; Rutgeerts, P J; Hiele, M I; Geypens, B; Vantrappen, G

    1994-12-01

    We have developed a breath test to measure solid gastric emptying using a standardized scrambled egg test meal (250 kcal) labeled with [14C]octanoic acid or [13C]octanoic acid. In vitro incubation studies showed that octanoic acid is a reliable marker of the solid phase. The breath test was validated in 36 subjects by simultaneous radioscintigraphic and breath test measurements. Nine healthy volunteers were studied after intravenous administration of 200 mg erythromycin and peroral administration of 30 mg propantheline, respectively. Erythromycin significantly enhanced gastric emptying, while propantheline significantly reduced gastric emptying rates. We conclude that the [*C]octanoic breath test is a promising and reliable test for measuring the gastric emptying rate of solids.
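
    A hedged sketch of how a gastric half-emptying time might be extracted from such breath-test data, assuming a cumulative excretion curve of the form m(1 - e^(-kt))^beta; the measurement values below are invented for illustration and are not from the cited validation study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative 13CO2 excretion (% of administered dose) versus time (h).
t_h = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0, 6.0])
cum_pct = np.array([2.1, 6.5, 12.0, 17.5, 22.3, 26.2, 29.3, 31.7, 35.0, 36.9])

def cumulative_excretion(t, m, k, beta):
    """Assumed cumulative excretion model m * (1 - exp(-k t))**beta."""
    return m * (1.0 - np.exp(-k * t)) ** beta

(m, k, beta), _ = curve_fit(cumulative_excretion, t_h, cum_pct, p0=(40.0, 0.5, 2.0))

# Half-emptying time: time at which half of the total excretion m has appeared.
t_half = -np.log(1.0 - 0.5 ** (1.0 / beta)) / k
print(f"Gastric half-emptying time: {t_half:.2f} h")
```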

  18. A Methodology to Detect and Characterize Uplift Phenomena in Urban Areas Using Sentinel-1 Data

    Directory of Open Access Journals (Sweden)

    Roberta Bonì

    2018-04-01

    Full Text Available This paper presents a methodology to exploit the Persistent Scatterer Interferometry (PSI) time series acquired by Sentinel-1 sensors for the detection and characterization of uplift phenomena in urban areas. The methodology has been applied to the Tower Hamlets Council area of London (United Kingdom) using Sentinel-1 data covering the period 2015–2017. The test area is a representative high-urbanized site affected by geohazards due to natural processes such as compaction of recent deposits, and also anthropogenic causes due to groundwater management and engineering works. The methodology has allowed the detection and characterization of a 5 km² area recording average uplift rates of 7 mm/year and a maximum rate of 18 mm/year in the period May 2015–March 2017. Furthermore, the analysis of the Sentinel-1 time series highlights that starting from August 2016 uplift rates began to decrease. A comparison between the uplift rates and urban developments as well as geological, geotechnical, and hydrogeological factors suggests that the ground displacements occur in a particular geological context and are mainly attributed to the swelling of clayey soils. The detected uplift could be attributed to a transient effect of the groundwater rebound after completion of dewatering works for the recent underground constructions.

  19. High Strain Rate Tensile Testing of Silver Nanowires: Rate-Dependent Brittle-to-Ductile Transition.

    Science.gov (United States)

    Ramachandramoorthy, Rajaprakash; Gao, Wei; Bernal, Rodrigo; Espinosa, Horacio

    2016-01-13

    The characterization of nanomaterials under high strain rates is critical to understand their suitability for dynamic applications such as nanoresonators and nanoswitches. It is also of great theoretical importance to explore nanomechanics with dynamic and rate effects. Here, we report in situ scanning electron microscope (SEM) tensile testing of bicrystalline silver nanowires at strain rates up to 2/s, which is 2 orders of magnitude higher than previously reported in the literature. The experiments are enabled by a microelectromechanical system (MEMS) with fast response time. It was identified that the nanowire plastic deformation has a small activation volume. A brittle-to-ductile failure mode transition was observed at a threshold strain rate of 0.2/s. Transmission electron microscopy (TEM) revealed that along the nanowire, dislocation density and the spatial distribution of plastic regions increase with increasing strain rate. Furthermore, molecular dynamics (MD) simulations show that deformation mechanisms such as grain boundary migration and dislocation interactions are responsible for such ductility. Finally, the MD and experimental results were interpreted using dislocation nucleation theory. The predicted yield stress values are in agreement with the experimental results for strain rates above 0.2/s, when ductility is pronounced. At low strain rates, random imperfections on the nanowire surface trigger localized plasticity, leading to a brittle-like failure.

  20. Comparison of heat-testing methodology.

    Science.gov (United States)

    Bierma, Mark M; McClanahan, Scott; Baisden, Michael K; Bowles, Walter R

    2012-08-01

    Patients with irreversible pulpitis occasionally present with a chief complaint of sensitivity to heat. To appropriately diagnose the offending tooth, a variety of techniques have been developed to reproduce this chief complaint. Such techniques cause temperature increases that are potentially damaging to the pulp. Newer electronic instruments control the temperature of a heat-testing tip that is placed directly against a tooth. The aim of this study was to determine which method produced the most consistent and safe temperature increase within the pulp. This consistency facilitates the clinician's ability to differentiate between a normal pulp and irreversible pulpitis. Four operators applied the following methods to each of 4 extracted maxillary premolars (for a total of 16 trials per method): heated gutta-percha, heated ball burnisher, hot water, and a System B unit or Elements unit with a heat-testing tip. Each test was performed for 60 seconds, and the temperatures were recorded via a thermocouple in the pulp chamber. Analysis of the data was performed by using the intraclass correlation coefficient. The least consistent warming was found with hot water. The heat-testing tip also demonstrated greater consistency between operators compared with the other methods. Hot water and the heated ball burnisher caused temperature increases high enough to damage pulp tissue. The Elements unit with a heat-testing tip provides the most consistent warming of the dental pulp. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  1. The specification of cross exchange rate equations used to test Purchasing Power Parity

    OpenAIRE

    Hunter, J; Simpson, M

    2004-01-01

    The article considers the specification of models used to test Purchasing Power Parity when applied to cross exchange rates. Specifically, conventional dynamic models used to test stationarity of the real exchange rate are likely to be misspecified, except when the parameters of each exchange rate equation are the same.

  2. Relative User Ratings of MMPI-2 Computer-Based Test Interpretations

    Science.gov (United States)

    Williams, John E.; Weed, Nathan C.

    2004-01-01

    There are eight commercially available computer-based test interpretations (CBTIs) for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), of which few have been empirically evaluated. Prospective users of these programs have little scientific data to guide choice of a program. This study compared ratings of these eight CBTIs. Test users…

  3. Soft error rate analysis methodology of multi-Pulse-single-event transients

    International Nuclear Information System (INIS)

    Zhou Bin; Huo Mingxue; Xiao Liyi

    2012-01-01

    As transistor feature sizes scale down, soft errors in combinational logic caused by high-energy particle radiation are gaining more and more concern. In this paper, a combinational logic soft error analysis methodology considering multi-pulse-single-event transients (MPSETs) and re-convergence with multiple transient pulses is proposed. In the proposed approach, the voltage pulse produced at the standard cell output is approximated by a triangular waveform, characterized by three parameters: the pulse width, the transition time of the first edge, and the transition time of the second edge. For pulses with an amplitude smaller than the supply voltage, an edge extension technique is proposed. Moreover, an efficient electrical masking model comprehensively considering transition time, delay, width and amplitude is proposed, together with an approach that uses the transition times of the two edges and the pulse width to compute the pulse amplitude. Finally, the proposed firstly-independently-propagating-secondly-mutually-interacting (FIP-SMI) scheme is used to handle the more practical case of re-convergent gates with multiple transient pulses. A random generation model of MPSETs is also proposed. Compared to estimates obtained using circuit-level simulations in HSpice, the proposed soft error rate analysis algorithm has a 10% error in SER estimation with a speedup of 300 when the single-pulse-single-event transient (SPSET) is considered. We have also demonstrated that the runtime and SER decrease as P0 increases, using designs from the ISCAS-85 benchmarks. (authors)

  4. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  5. Methodology of diagnostic tests in hepatology

    DEFF Research Database (Denmark)

    Christensen, Erik

    2009-01-01

    The performance of diagnostic tests can be assessed by a number of methods. These include sensitivity, specificity, positive and negative predictive values, likelihood ratios and receiver operating characteristic (ROC) curves. This paper describes the methods and explains which information they provide. Sensitivity and specificity provide measures of the diagnostic accuracy of a test in diagnosing the condition. The positive and negative predictive values estimate the probability of the condition from the test outcome and the condition's prevalence. The likelihood ratios combine the information provided by sensitivity and specificity. The ROC curve is constructed by varying the test's cut-off value and plotting sensitivity as a function of 1-specificity. The ROC curve can be used to define optimal cut-off values for a test, to assess the diagnostic accuracy of the test, and to compare the usefulness of different tests in the same patients. Under certain conditions it may be possible to utilize a test...

  6. Split-Hopkinson Pressure Bar: an experimental technique for high strain rate tests

    International Nuclear Information System (INIS)

    Sharma, S.; Chavan, V.M.; Agrawal, R.G.; Patel, R.J.; Kapoor, R.; Chakravartty, J.K.

    2011-06-01

    Mechanical properties of materials are, in general, strain rate dependent, i.e. they respond differently under quasi-static and higher strain rate conditions. The Split-Hopkinson Pressure Bar (SHPB), also referred to as the Kolsky bar, is a commonly used setup for high strain rate testing. The SHPB is suitable for tests in the strain rate range of 10² to 10⁴ s⁻¹. Such high strain rate data are required for safety and structural integrity assessment of structures subjected to dynamic loading. As high strain rate data are not easily available in the open literature, the need was felt to set up such a high strain rate testing machine. The SHPB at BARC was designed and set up in-house jointly by the Refuelling Technology Division and the Mechanical Metallurgy Division, at Hall no. 3, BARC. A number of conceptual designs for the SHPB were considered and an optimized design was worked out. The challenges of precision tolerances, bar straightness, and the design and proper functioning of the pneumatic gun were met. This setup has been used extensively to study high strain rate material behavior. This report introduces the SHPB in general and the setup at BARC in particular. The history of development of the SHPB, the basic formulations of one-dimensional wave propagation, the relations between the wave velocity, particle velocity and elastic strain in a one-dimensional bar, and the equations used to obtain the final stress vs. strain curves are described. The calibration of the present setup, the pre-test calculations and the post-test analysis of data are described. Finally, some of the experimental results on different materials such as Cu, SS305, SA516 and Zr, at room temperature and elevated temperatures, are presented. (author)
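
    The data reduction referred to above rests on the classical one-dimensional (Kolsky) relations: the transmitted pulse gives the specimen stress, the reflected pulse gives the specimen strain rate, and the strain follows by time integration. A minimal sketch with synthetic gauge signals and assumed bar and specimen properties (not the BARC setup parameters):

```python
import numpy as np

# Assumed bar and specimen properties.
E_bar = 200e9          # bar Young's modulus, Pa
c0 = 5000.0            # bar elastic wave speed, m/s
d_bar, d_spec = 20e-3, 10e-3
A_bar = np.pi * d_bar**2 / 4
A_spec = np.pi * d_spec**2 / 4
L_spec = 5e-3          # specimen length, m

# Synthetic reflected and transmitted strain-gauge signals (would come from the test).
t = np.linspace(0.0, 200e-6, 2001)                     # s
eps_reflected = -1.5e-3 * np.exp(-((t - 100e-6) / 40e-6) ** 2)
eps_transmitted = 0.8e-3 * np.exp(-((t - 100e-6) / 40e-6) ** 2)

# Classical one-wave Kolsky relations.
stress = E_bar * (A_bar / A_spec) * eps_transmitted          # specimen stress, Pa
strain_rate = -2.0 * c0 / L_spec * eps_reflected             # specimen strain rate, 1/s
strain = np.cumsum(strain_rate) * (t[1] - t[0])              # simple rectangle-rule integration

print(f"Peak stress: {stress.max() / 1e6:.0f} MPa")
print(f"Peak strain rate: {strain_rate.max():.0f} 1/s")
print(f"Final strain: {strain[-1]:.3f}")
```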

  7. Predictability of Exchange Rates in Sri Lanka: A Test of the Efficient Market Hypothesis

    OpenAIRE

    Guneratne B Wickremasinghe

    2007-01-01

    This study examined the validity of the weak and semi-strong forms of the efficient market hypothesis (EMH) for the foreign exchange market of Sri Lanka. Monthly exchange rates for four currencies during the floating exchange rate regime were used in the empirical tests. Using a battery of tests, empirical results indicate that the current values of the four exchange rates can be predicted from their past values. Further, the tests of semi-strong form efficiency indicate that exchange rate pa...

  8. A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates.

    Science.gov (United States)

    An, Qian; Kang, Jian; Song, Ruiguang; Hall, H Irene

    2016-04-30

    Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV infected person seeks a test for HIV during a particular time interval, given that no previous positive test has been obtained prior to the start of the interval, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnoses data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases, stratified by the HIV infections at different years, are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate taking into account the temporal dependence of these parameters to improve the estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection metropolis sampling technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. Copyright © 2015 John Wiley & Sons, Ltd.
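
    A forward-simulation sketch of the two-level structure described above, with invented incidence, testing-rate and progression values; the authors' priors and their adaptive rejection Metropolis sampler are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

years = np.arange(2005, 2015)
incidence = np.full(years.size, 40000.0)      # assumed mean new infections per year
testing_rate = 0.25                           # assumed annual probability of seeking a test
aids_onset_rate = 0.08                        # assumed annual probability of progressing to AIDS

diagnosed_hiv = np.zeros(years.size, dtype=int)
diagnosed_aids = np.zeros(years.size, dtype=int)

for i in range(years.size):
    # Level 1: latent number of new infections in year i.
    undiagnosed = rng.poisson(incidence[i])
    # Level 2: in each subsequent year an undiagnosed person is either tested
    # (AIDS-free HIV diagnosis), progresses to AIDS (AIDS diagnosis), or stays undiagnosed.
    for j in range(i, years.size):
        tested = rng.binomial(undiagnosed, testing_rate)
        progressed = rng.binomial(undiagnosed - tested, aids_onset_rate)
        diagnosed_hiv[j] += tested
        diagnosed_aids[j] += progressed
        undiagnosed -= tested + progressed

print("AIDS-free HIV diagnoses by year:", diagnosed_hiv)
print("AIDS diagnoses by year:         ", diagnosed_aids)
```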

  9. Audit of Trichomonas vaginalis test requesting by community referrers after a change from culture to molecular testing, including a cost analysis.

    Science.gov (United States)

    Bissessor, Liselle; Wilson, Janet; McAuliffe, Gary; Upton, Arlo

    2017-06-16

    Trichomonas vaginalis (TV) prevalence varies among different communities and peoples. The availability of robust molecular platforms for the detection of TV has advanced diagnosis; however, molecular tests are more costly than phenotypic methodologies, and testing all urogenital samples would be expensive. We recently replaced culture methods with the Aptima Trichomonas vaginalis nucleic acid amplification test, on specific request and as reflex testing by the laboratory, and have audited this change. Data were collected from August 2015 (microbroth culture and microscopy) and August 2016 (Aptima TV assay), including referrer, testing volumes, results and test cost estimates. In August 2015, 10,299 vaginal swabs, and in August 2016, 2,189 specimens (urogenital swabs and urines), were tested. The positivity rate went from 0.9% to 5.3%, and overall more TV infections were detected in 2016. The number needed to test and the cost for one positive TV result, respectively, were 111 and $902.55 in 2015, and 19 and $368.92 in 2016. Request volumes and positivity rates differed among referrers. The methodology change was associated with higher overall detection of TV, and reductions in the number needed to test and the cost for one TV diagnosis. Our audit suggests that there is room for improvement in TV test requesting in our community.

  10. Inverse modeling of emissions for local photooxidant pollution: Testing a new methodology with kriging constraints

    Directory of Open Access Journals (Sweden)

    I. Pison

    2006-07-01

    Full Text Available A new methodology for the inversion of anthropogenic emissions at a local scale is tested. The inversion constraints are provided by a kriging technique used in air quality forecasting in the Paris area, which computes an analyzed concentration field from network measurements and the first-guess simulation of a CTM. The inversion scheme developed here is based on the CHIMERE model and its adjoint to perform 4-D integration. The methodology is validated on synthetic cases inverting emission fluxes. It is shown that the information provided by the analyzed concentrations is sufficient to reach a mathematically acceptable solution to the optimization, even when little information is available in the measurements. As compared to the use of measurements alone, or of measurements and a background matrix, the use of kriging leads to a more homogeneous distribution of the corrections, both in space and time. Moreover, it is then possible to double the accuracy of the inversion by performing two kriging-optimization cycles. Nevertheless, kriging analysis cannot compensate for a very important lack of information in the measurements.

  11. Study of creep behaviour in P-doped copper with slow strain rate tensile tests

    International Nuclear Information System (INIS)

    Xuexing Yao; Sandstroem, Rolf

    2000-08-01

    Pure copper with an addition of phosphorus is planned to be used to construct the canisters for spent nuclear fuel. The copper canisters can be exposed to creep deformation of up to 2-4% at service temperatures. Ordinary creep strain tests with dead-weight loading are generally employed to study creep behaviour; however, it is reported that an initial plastic deformation of 5-15% takes place when loading the creep specimens at lower temperatures. The slow strain rate tensile test is an alternative way to study the creep deformation behaviour of materials. Ordinary creep tests and slow strain rate tensile tests can give the same information in the secondary creep stage. The advantage of the tensile test is that the starting phase is much more controlled than in a creep test. In a tensile test the initial deformation behaviour can be determined and the initial strain of less than 5% can be modelled. In this study, slow strain rate tensile tests at strain rates of 10⁻⁴, 10⁻⁵, 10⁻⁶ and 10⁻⁷ s⁻¹ at 75, 125 and 175 degrees C have been performed on P-doped pure Cu to supplement creep data from conventional creep tests. The deformation behaviour has successfully been modelled. It is shown that slow strain rate tensile tests can be used to study the creep deformation behaviour of pure Cu.
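
    Steady-state flow stresses taken from slow strain rate tensile tests at several rates can be reduced to a Norton-type power law for the secondary creep stage; a minimal fitting sketch with invented data points (the stresses, rates and power-law form are assumptions for illustration, not the Cu-P data):

```python
import numpy as np

# Hypothetical steady-state (flow) stress measured at each imposed strain rate.
strain_rate = np.array([1e-7, 1e-6, 1e-5, 1e-4])      # 1/s
flow_stress = np.array([62.0, 75.0, 91.0, 110.0])     # MPa

# Norton law: strain_rate = A * stress**n  ->  a log-log fit gives the stress exponent n.
n, log_A = np.polyfit(np.log(flow_stress), np.log(strain_rate), 1)
A = np.exp(log_A)

print(f"Stress exponent n = {n:.1f}")
print(f"Predicted secondary creep rate at 50 MPa: {A * 50.0**n:.2e} 1/s")
```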

  12. From SNOMED CT to Uberon: Transferability of evaluation methodology between similarly structured ontologies.

    Science.gov (United States)

    Elhanan, Gai; Ochs, Christopher; Mejino, Jose L V; Liu, Hao; Mungall, Christopher J; Perl, Yehoshua

    2017-06-01

    To examine whether disjoint partial-area taxonomy, a semantically-based evaluation methodology that has been successfully tested in SNOMED CT, will perform with similar effectiveness on Uberon, an anatomical ontology that belongs to a structurally similar family of ontologies as SNOMED CT. A disjoint partial-area taxonomy was generated for Uberon. One hundred randomly selected test concepts that overlap between partial-areas were matched to a same size control sample of non-overlapping concepts. The samples were blindly inspected for non-critical issues and presumptive errors first by a general domain expert whose results were then confirmed or rejected by a highly experienced anatomical ontology domain expert. Reported issues were subsequently reviewed by Uberon's curators. Overlapping concepts in Uberon's disjoint partial-area taxonomy exhibited a significantly higher rate of all issues. Clear-cut presumptive errors trended similarly but did not reach statistical significance. A sub-analysis of overlapping concepts with three or more relationship types indicated a much higher rate of issues. Overlapping concepts from Uberon's disjoint abstraction network are quite likely (up to 28.9%) to exhibit issues. The results suggest that the methodology can transfer well between same family ontologies. Although Uberon exhibited relatively few overlapping concepts, the methodology can be combined with other semantic indicators to expand the process to other concepts within the ontology that will generate high yields of discovered issues. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Poster - 44: Development and implementation of a comprehensive end-to-end testing methodology for linac-based frameless SRS QA using a modified commercial stereotactic anthropomorphic phantom

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Derek; Mutanga, Theodore [University of Toronto, Carlo Fidani Peel Regional Cancer Center (Canada)

    2016-08-15

    Purpose: An end-to-end testing methodology was designed to evaluate the overall SRS treatment fidelity, incorporating all steps in the linac-based frameless radiosurgery treatment delivery process. The study details our commissioning experience with the Steev (CIRS, Norfolk, VA) stereotactic anthropomorphic head phantom, including modification, test design, and baseline measurements. Methods: Repeated MR and CT scans were performed with interchanging inserts. MR-CT fusion accuracy was evaluated and the insert spatial coincidence was verified on CT. Five non-coplanar arcs delivered a prescription dose to a 15 mm spherical CTV with a 2 mm PTV margin. Following setup, CBCT-based shifts were applied as per protocol. Sequential measurements were performed by interchanging inserts without disturbing the setup. Spatial and dosimetric accuracy was assessed by a combination of CBCT hidden-target, radiochromic film, and ion chamber measurements. To facilitate film registration, the film insert was modified in-house by etching marks. Results: MR fusion error and insert spatial coincidences were within 0.3 mm. Both CBCT and film measurements showed spatial displacements of 1.0 mm in similar directions. Both coronal and sagittal films reported a 2.3% higher target dose relative to the treatment plan. The corrected ion chamber measurement was similarly greater, by 1.0%. The 3%/2 mm gamma pass rate was 99% for both films. Conclusions: A comprehensive end-to-end testing methodology was implemented for our SRS QA program. The Steev phantom enabled realistic evaluation of the entire treatment process. Overall spatial and dosimetric accuracy of the delivery were 1 mm and 3%, respectively.

  14. Methodology for testing a system for remote monitoring and control on auxiliary machines in electric vehicles

    Directory of Open Access Journals (Sweden)

    Dimitrov Vasil

    2017-01-01

    Full Text Available A laboratory system for remote monitoring and control of an asynchronous motor controlled by a soft starter and contemporary measuring and control devices has been developed and built. This laboratory system is used for research and in teaching. A study of the principles of operation, setup and examination of intelligent energy meters, soft starters and PLCs has been made, as knowledge of the relevant software products is necessary. This is of great importance because systems for remote monitoring and control of energy consumption, efficiency and proper operation of the controlled objects are very often used in different spheres of industry, in building automation, transport, the electricity distribution network, etc. Their implementation in electric vehicles for remote monitoring and control of auxiliary machines is also possible and very useful. In this paper, a test methodology is developed and some experiments are presented. Thus, an experimental verification of the developed methodology is made.

  15. Refinement of the wedge bar technique for compression tests at intermediate strain rates

    Directory of Open Access Journals (Sweden)

    Stander M.

    2012-08-01

    Full Text Available A refined development of the wedge-bar technique [1] for compression tests at intermediate strain rates is presented. The concept uses a wedge mechanism to compress small cylindrical specimens at strain rates in the order of 10 s⁻¹ to strains of up to 0.3. Co-linear elastic impact principles are used to accelerate the actuation mechanism from rest to test speed in under 300 μs while maintaining near-uniform strain rates for up to 30 ms, i.e. the transient phase of the test is less than 1% of the total test duration. In particular, new load frame, load cell and sliding anvil designs are presented and shown to significantly reduce the noise generated during testing. Typical dynamic test results for a selection of metals and polymers are reported and compared with quasi-static and split Hopkinson pressure bar results.

  16. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univ., Geumsan (Korea, Republic of); Kim, Si Won [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of); Kang, Hyun Gook [Rensselaer Polytechnic Institute, Troy (United States)

    2016-10-15

    Penetration testing is a method to evaluate the cyber security of NPPs, and this approach has been applied in some studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in penetration testing because they were based on past cyber-attack incidents at NPPs. That is, they only covered events that had happened before, so they could not cover other possible scenarios or reflect them in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, penetration testing can be a possible way. In this study, a method to develop a penetration test scenario was explained. In particular, the goal of the hacker was assumed to be deterioration of nuclear fuel integrity. Accordingly, in the methodology, Level 1 PSA results were utilized to reflect plant safety in the security assessment. From the PSA results, basic events were post-processed and possible cyber-attacks were reviewed against vulnerabilities of the digital control system.

  17. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    International Nuclear Information System (INIS)

    Lee, In Hyo; Son, Han Seong; Kim, Si Won; Kang, Hyun Gook

    2016-01-01

    Penetration testing is a method to evaluate the cyber security of NPPs, and this approach has been applied in some studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in penetration testing because they were based on past cyber-attack incidents at NPPs. That is, they only covered events that had happened before, so they could not cover other possible scenarios or reflect them in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, penetration testing can be a possible way. In this study, a method to develop a penetration test scenario was explained. In particular, the goal of the hacker was assumed to be deterioration of nuclear fuel integrity. Accordingly, in the methodology, Level 1 PSA results were utilized to reflect plant safety in the security assessment. From the PSA results, basic events were post-processed and possible cyber-attacks were reviewed against vulnerabilities of the digital control system.

  18. Monitoring HIV Testing in the United States: Consequences of Methodology Changes to National Surveys.

    Directory of Open Access Journals (Sweden)

    Michelle M Van Handel

    Full Text Available In 2011, the National Health Interview Survey (NHIS), an in-person household interview, revised the human immunodeficiency virus (HIV) section of the survey, and the Behavioral Risk Factor Surveillance System (BRFSS), a telephone-based survey, added cellphone numbers to its sampling frame. We sought to determine how these changes might affect assessment of HIV testing trends. We used linear regression with pairwise contrasts with 2003-2013 data from NHIS and BRFSS to compare percentages of persons aged 18-64 years who reported HIV testing in landline versus cellphone-only households before and after 2011, when NHIS revised its in-person questionnaire and BRFSS added cellphone numbers to its telephone-based sample. In NHIS, the percentage of persons in cellphone-only households increased 13-fold from 2003 to 2013. The percentage ever tested for HIV was 6%-10% higher among persons in cellphone-only than landline households. The percentage ever tested for HIV increased significantly from 40.2% in 2003 to 45.0% in 2010, but was significantly lower in 2011 (40.6%) and 2012 (39.7%). In BRFSS, the percentage ever tested decreased significantly from 45.9% in 2003 to 40.2% in 2010, but increased to 42.9% in 2011 and 43.5% in 2013. HIV testing estimates were lower after the NHIS questionnaire changes but higher after the BRFSS methodology changes. Data before and after 2011 are not comparable, complicating assessment of trends.

  19. Systematic review of communication partner training in aphasia: methodological quality.

    Science.gov (United States)

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.

  20. Interest Rate Risk Management using Duration Gap Methodology

    Directory of Open Access Journals (Sweden)

    Dan Armeanu

    2008-01-01

    should be measured and managed within asset-liability management. The article then takes a short look at methods for measuring interest rate risk, and after that explains and demonstrates how the Duration Gap Model can be used for managing interest rate risk in banks.
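
    A sketch of the kind of calculation at the heart of a duration-gap analysis, using a made-up balance sheet; the figures and the first-order approximation of the equity-value change are illustrative assumptions, not taken from the article.

```python
# Hypothetical balance sheet (market values) and durations, in years.
assets = {"loans": (600.0, 3.5), "bonds": (300.0, 6.0), "cash": (100.0, 0.0)}
liabilities = {"deposits": (700.0, 0.8), "borrowings": (200.0, 2.5)}

total_assets = sum(v for v, _ in assets.values())
total_liabilities = sum(v for v, _ in liabilities.values())

# Value-weighted durations of assets and liabilities.
dur_assets = sum(v * d for v, d in assets.values()) / total_assets
dur_liabilities = sum(v * d for v, d in liabilities.values()) / total_liabilities

# Duration gap and first-order change in economic value of equity for a rate shock.
duration_gap = dur_assets - (total_liabilities / total_assets) * dur_liabilities
yield_level, rate_shock = 0.05, 0.01          # current rate level and +100 bp shock (assumed)
delta_equity = -duration_gap * total_assets * rate_shock / (1.0 + yield_level)

print(f"Duration gap: {duration_gap:.2f} years")
print(f"Approx. change in equity value for +100 bp: {delta_equity:.1f}")
```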

  1. Development of methodology for evaluating and monitoring steam generator feedwater nozzle cracking in PWRs

    International Nuclear Information System (INIS)

    Shvarts, S.; Gerber, D.A.; House, K.; Hirschberg, P.

    1994-01-01

    The objective of this paper is to describe a methodology for evaluating and monitoring steam generator feedwater nozzle cracking in PWR plants. This methodology is based in part on plant test data obtained from a recent Diablo Canyon Power Plant (DCPP) Unit 1 heatup. Temperature sensors installed near the nozzle-to-pipe weld were monitored during the heatup, along with operational parameters such as auxiliary feedwater (AFW) flow rate and steam generator temperature. A thermal stratification load definition was developed from this data. Steady-state characteristics of the data were used in a finite element analysis to develop a relationship between AFW flow and stratification interface level. Fluctuating characteristics of the data were used to determine transient parameters through the application of a Green's Function approach. The thermal stratification load definition from the test data was used in a three-dimensional thermal stress analysis to determine stress cycling and the consequent fatigue damage or crack growth during AFW flow fluctuations. The implementation of the developed methodology in the DCPP and Sequoyah Nuclear Plant (SNP) fatigue monitoring systems is described.
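
    A Green's Function approach of the kind mentioned above superposes the stress response to unit temperature changes through Duhamel's integral; a minimal discrete-convolution sketch with an assumed kernel and a synthetic fluid temperature history (neither taken from the plant data):

```python
import numpy as np

# Time base and a synthetic fluid temperature history at the nozzle (assumed).
dt = 1.0                                        # s
t = np.arange(0.0, 600.0, dt)
fluid_temp = 280.0 - 80.0 / (1.0 + np.exp(-(t - 200.0) / 20.0))   # step-like cooldown, deg C

# Assumed Green's function: stress response to a unit (1 deg C) temperature step,
# decaying as the wall equilibrates. In practice this would come from one FE run.
def unit_step_response(tau):
    return 2.5 * np.exp(-tau / 60.0) * (tau >= 0.0)     # MPa per deg C

# Duhamel superposition: convolve the kernel with the rate of temperature change.
dT_dt = np.gradient(fluid_temp, dt)
kernel = unit_step_response(t)
stress = np.convolve(dT_dt, kernel)[: t.size] * dt      # MPa

print(f"Peak thermal stress magnitude: {np.abs(stress).max():.0f} MPa")
```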

  2. Parents rate the ratings: a test of the validity of the American movie, television, and video game ratings.

    Science.gov (United States)

    Walsh, D A; Gentile, D A; Van Brederode, T M

    2002-02-01

    Numerous studies have documented the potential effects on young audiences of violent content in media products, including movies, television programs, and computer and video games. Similar studies have evaluated the effects associated with sexual content and messages. Cumulatively, these effects represent a significant public health risk for increased aggressive and violent behavior, spread of sexually transmitted diseases, and pediatric pregnancy. In partial response to these risks and to public and legislative pressure, the movie, television, and gaming industries have implemented ratings systems intended to provide information about the content and appropriate audiences for different films, shows, and games. We conducted a panel study to test the validity of the current movie, television, and video game rating systems. Participants used the KidScore media evaluation tool, which evaluates films, television shows, and video and computer games on 10 aspects, including the appropriateness of the media product for children on the basis of age. Results revealed that when an entertainment industry rates a product as inappropriate for children, parent raters agree that it is inappropriate for children. However, parent raters disagree with industry usage of many of the ratings designating material suitable for children of different ages. Products rated as appropriate for adolescents are of the greatest concern. The level of disagreement varies from industry to industry and even from rating to rating. Analysis indicates that the amount of violent content and portrayals of violence are the primary markers for disagreement between parent raters and industry ratings. Short-term and long-term recommendations are suggested.

  3. A dose to curie conversion methodology

    International Nuclear Information System (INIS)

    Stowe, P.A.

    1987-01-01

    Development of the computer code RadCAT (Radioactive waste Classification And Tracking) has led to the development of a simple dose rate to curie content conversion methodology for containers with internally distributed radioactive material. It was determined early on that, if possible, the computerized dose rate to curie evaluation model employed in RadCAT should yield the same results as the hand method utilized and specified in plant procedures. A review of current industry practices indicated two distinct types of computational methodologies are presently in use. The most common methods are computer-based calculations utilizing complex mathematical models specifically established for various container geometries. This type of evaluation is tedious, however, and does not lend itself to repetition by hand. The second method of evaluation, therefore, is simplified expressions that sacrifice accuracy for ease of computation and generally overestimate container curie content. To meet the aforementioned criterion, current computer-based models were deemed unacceptably complex and hand computational methods too inaccurate for serious consideration. The contact dose rate/curie content analysis methodology presented herein provides an equation that is easy to use in hand calculations yet provides accuracy equivalent to other computer-based computations.
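
    For the simplest possible case, an unshielded point-source approximation, a hand-friendly dose-to-activity relation reduces to inverting the dose-rate equation. The sketch below illustrates that inversion; the gamma constant, distance and dose-rate reading are assumptions, and it does not reproduce the RadCAT model for internally distributed sources.

```python
# Point-source approximation: dose_rate [R/h] = gamma * activity [Ci] / distance[m]**2,
# so activity = dose_rate * distance**2 / gamma.

gamma_cs137 = 0.33          # assumed gamma-ray constant for Cs-137, R*m^2/(Ci*h)
dose_rate = 0.050           # measured dose rate, R/h (assumed reading)
distance = 1.0              # detector-to-source distance, m (assumed)

activity_ci = dose_rate * distance**2 / gamma_cs137
print(f"Inferred activity: {activity_ci:.3f} Ci")
```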

  4. Variation of strain rate sensitivity index of a superplastic aluminum alloy in different testing methods

    Science.gov (United States)

    Majidi, Omid; Jahazi, Mohammad; Bombardier, Nicolas; Samuel, Ehab

    2017-10-01

    The strain rate sensitivity index, m-value, is being applied as a common tool to evaluate the impact of the strain rate on the viscoplastic behaviour of materials. The m-value, as a constant number, has been frequently taken into consideration for modeling material behaviour in the numerical simulation of superplastic forming processes. However, the impact of the testing variables on the measured m-values has not been investigated comprehensively. In this study, the m-value for a superplastic grade of an aluminum alloy (i.e., AA5083) has been investigated. The conditions and the parameters that influence the strain rate sensitivity for the material are compared with three different testing methods, i.e., monotonic uniaxial tension test, strain rate jump test and stress relaxation test. All tests were conducted at elevated temperature (470°C) and at strain rates up to 0.1 s-1. The results show that the m-value is not constant and is highly dependent on the applied strain rate, strain level and testing method.
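
    A minimal sketch of the quantity under discussion: the strain rate sensitivity index m is the slope of log flow stress versus log strain rate, which in a strain rate jump test reduces to m = ln(sigma2/sigma1)/ln(rate2/rate1). The flow stresses and strain rates below are hypothetical illustrative values, not data from the AA5083 tests.

        import numpy as np

        # Minimal sketch: the strain rate sensitivity index from a strain rate jump,
        # m = d(ln sigma)/d(ln strain_rate). Stresses and rates are hypothetical
        # illustrative values, not data from the AA5083 study.

        def m_value(stress_before, stress_after, rate_before, rate_after):
            return np.log(stress_after / stress_before) / np.log(rate_after / rate_before)

        # Hypothetical jump from 1e-3 1/s to 1e-2 1/s at elevated temperature
        print(round(m_value(10.0, 17.0, 1e-3, 1e-2), 2))  # ~0.23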

  5. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). This methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model and the EPRI Model for dose rate buildup with reactor operating life. The ARP provides a screening test to determine if there is a need for ALARA expenditures based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine if ALARA expenditures are justified and their value. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, and incorporating consideration of administrative exposure controls which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis.
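
    The ARP/AI formulation of NUREG/CR-0446 is not reproduced in the abstract. The sketch below only illustrates the underlying cost/benefit comparison of dollars against person-rem averted; the monetary value per person-rem, the dose savings and the service life are assumed example values.

        # Illustrative sketch only: a generic ALARA cost/benefit screen comparing the
        # cost of a modification against the monetary value of the collective dose it
        # is expected to avert. The dollars-per-person-rem figure and dose estimates
        # are assumptions; the ARP/AI formulation itself is not reproduced here.

        DOLLARS_PER_PERSON_REM = 1000.0  # assumed monetary equivalent of collective dose

        def alara_justified(cost_dollars, dose_averted_person_rem_per_yr, service_years):
            benefit = dose_averted_person_rem_per_yr * service_years * DOLLARS_PER_PERSON_REM
            return benefit >= cost_dollars, benefit

        # Example: a $150,000 shielding upgrade averting 12 person-rem/yr over 20 years
        print(alara_justified(150_000.0, 12.0, 20))  # (True, 240000.0)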

  6. WTP Waste Feed Qualification: Hydrogen Generation Rate Measurement Apparatus Testing Report

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Newell, J. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Smith, T. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-06-01

    The generation rate of hydrogen gas in the Hanford tank waste will be measured during the qualification of the staged tank waste for processing in the Hanford Tank Waste Treatment and Immobilization Plant. Based on a review of past practices in measurement of the hydrogen generation, an apparatus to perform this measurement has been designed and tested for use during waste feed qualification. The hydrogen generation rate measurement apparatus (HGRMA) described in this document utilized a 100 milliliter sample in a continuously-purged, continuously-stirred vessel, with measurement of hydrogen concentration in the vent gas. The vessel and lid had a combined 220 milliliters of headspace. The vent gas system included a small condenser to prevent excessive evaporative losses from the sample during the test, as well as a demister and filter to prevent particle migration from the sample to the gas chromatography system. The gas chromatograph was an on-line automated instrument with a large-volume sample-injection system to allow measurement of very low hydrogen concentrations. This instrument automatically sampled the vent gas from the hydrogen generation rate measurement apparatus every five minutes and performed data regression in real time. The fabrication of the hydrogen generation rate measurement apparatus was in accordance with twenty-three (23) design requirements documented in the conceptual design package, as well as seven (7) required developmental activities documented in the task plan associated with this work scope. The HGRMA was initially tested for proof of concept with physical simulants, and a remote demonstration of the system was performed in the Savannah River National Laboratory Shielded Cells Mockup Facility. Final verification testing was performed using non-radioactive simulants of the Hanford tank waste. Three different simulants were tested to bound the rheological properties expected during waste feed qualification testing.
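
    A minimal sketch, not the SRNL procedure, of how a vent-gas hydrogen concentration and purge flow can be converted into a hydrogen generation rate per liter of sample, assuming steady state and ideal gas behaviour; the purge rate, concentration and sample volume below are hypothetical.

        # Minimal sketch, not the SRNL procedure: converting a measured vent-gas
        # hydrogen concentration and purge flow into a hydrogen generation rate per
        # liter of waste sample, assuming steady state and ideal gas behaviour.

        R_L_ATM_PER_MOL_K = 0.082057

        def hgr_mol_per_l_h(h2_ppm_vol, purge_l_per_min, sample_l,
                            temp_k=298.15, pressure_atm=1.0):
            """Hydrogen generation rate in mol H2 per liter of sample per hour."""
            h2_l_per_h = h2_ppm_vol * 1e-6 * purge_l_per_min * 60.0
            h2_mol_per_h = pressure_atm * h2_l_per_h / (R_L_ATM_PER_MOL_K * temp_k)
            return h2_mol_per_h / sample_l

        # Example: 5 ppm H2 in a 20 sccm purge over a 0.1 L (100 milliliter) sample
        print(hgr_mol_per_l_h(h2_ppm_vol=5.0, purge_l_per_min=0.020, sample_l=0.1))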

  7. Methodology for Speech Assessment in the Scandcleft Project-An International Randomized Clinical Trial on Palatal Surgery

    DEFF Research Database (Denmark)

    Willadsen, Elisabeth

    2009-01-01

    Objective: To present the methodology for speech assessment in the Scandcleft project and discuss issues from a pilot study. Design: Description of methodology and blinded test for speech assessment. Speech samples and instructions for data collection and analysis for comparisons of speech outcomes...... across five included languages were developed and tested. Participants and Materials: Randomly selected video recordings of 10 5-year-old children from each language (n = 50) were included in the project. Speech material consisted of test consonants in single words, connected speech, and syllable chains......-sum and the overall rating of VPC was 78%. Conclusions: Pooling data of speakers of different languages in the same trial and comparing speech outcome across trials seems possible if the assessment of speech concerns consonants and is confined to speech units that are phonetically similar across languages. Agreed...

  8. Test Method for High β Particle Emission Rate of 63Ni Source Plate

    OpenAIRE

    ZHANG Li-feng

    2015-01-01

    To address the difficulty of measuring the β particle emission rate of the Ni-63 source plates used in Ni-63 betavoltaic batteries, a relative test method based on the scintillation current technique was established according to the measurement principle of the scintillation detector. The β particle emission rate of a homemade Ni-63 source plate was tested by this method, and the test results were analysed and evaluated; it was initially concluded that the scintillation current method is a feasible way of testing β particle emi...

  9. Dosimetric methodology for extremities of individuals occupationally exposed to beta radiation using the optically stimulated luminescence technique

    International Nuclear Information System (INIS)

    Pinto, Teresa Cristina Nathan Outeiro

    2010-01-01

    A dosimetric methodology was established for the determination of extremity doses of individuals occupationally exposed to beta radiation, using Al2O3:C detectors and the optically stimulated luminescence (OSL) reader system microStar, Landauer. The main parts of the work were: characterization of the dosimetric material Al2O3:C using the OSL technique; establishment of the dose evaluation methodology; dose rate determination of beta radiation sources; application of the established method in a practical test with individuals occupationally exposed to beta radiation during a calibration simulation of clinical applicators; validation of the methodology by the comparison between the dose results of the practical test using the OSL and the thermoluminescence (TL) techniques. The results show that both the OSL Al2O3:C detectors and the technique may be utilized for individual monitoring of extremities exposed to beta radiation. (author)

  10. Implementation of Prognostic Methodologies to Cryogenic Propellant Loading Test-bed

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics methodologies determine the health state of a system and predict the end of life and remaining useful life. This information enables operators to take...

  11. 42 CFR 413.312 - Methodology for calculating rates.

    Science.gov (United States)

    2010-10-01

    ... Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning... routine service cost limits; (ii) A wage index to adjust for area wage differences; and (iii) The most... of rates published in the Federal Register under the authority of § 413.320, CMS announces the wage...

  12. Methodology for the Study of the Envelope Airtightness of Residential Buildings in Spain: A Case Study

    Directory of Open Access Journals (Sweden)

    Feijó-Muñoz Jesús

    2018-03-01

    Air leakage and its impact on the energy performance of dwellings have been broadly studied in countries with cold climates in Europe, the US, and Canada. However, there is a lack of knowledge in this field in Mediterranean countries. Current Spanish building regulations establish ventilation rates based on ideal airtight envelopes, causing problems of over-ventilation and substantial energy losses. The aim of this paper is to develop a methodology that allows the characterization of the envelope of the housing stock in Spain in order to adjust ventilation rates taking air leakage into consideration. The methodology is easily applicable to other countries interested in studying envelope airtightness and improving its energy performance. A statistical sampling method has been established to determine the dwellings to be tested, considering relevant variables concerning airtightness: climate zone, year of construction, and typology. The air leakage rate is determined using a standardized building pressurization technique according to European Standard EN 13829. A representative case study is presented as an example of the implementation of the designed methodology, and results are compared to preliminary values obtained from the database.
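
    A brief sketch of the fan-pressurization data reduction implied by EN 13829: the leakage power law V = C*dP^n is fitted to measured flow/pressure pairs and the air change rate at 50 Pa (n50) is reported; the pressure and flow readings and the internal volume below are hypothetical, not data from the Spanish case study.

        import numpy as np

        # Minimal sketch of the EN 13829 data reduction: fit the leakage power law
        # V = C * dP**n to measured flow/pressure pairs, then report the air change
        # rate at 50 Pa (n50). Pressure/flow readings and the internal volume are
        # hypothetical, not data from the Spanish case study.

        def fit_leakage_law(dp_pa, flow_m3_h):
            n_exp, log_c = np.polyfit(np.log(dp_pa), np.log(flow_m3_h), 1)
            return np.exp(log_c), n_exp

        def n50(c_coeff, n_exp, internal_volume_m3):
            return c_coeff * 50.0 ** n_exp / internal_volume_m3

        dp = np.array([20.0, 30.0, 40.0, 50.0, 60.0])         # Pa
        flow = np.array([210.0, 270.0, 330.0, 380.0, 420.0])  # m3/h
        c_coeff, n_exp = fit_leakage_law(dp, flow)
        print(round(n50(c_coeff, n_exp, internal_volume_m3=190.0), 2))  # air changes per hour at 50 Pa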

  13. A proposed hardness assurance test methodology for bipolar linear circuits and devices in a space ionizing radiation environment

    International Nuclear Information System (INIS)

    Pease, R.L.; Brown, D.B.; Cohn, L.

    1997-01-01

    A hardness assurance test approach has been developed for bipolar linear circuits and devices in space. It consists of a screen for dose rate sensitivity and a characterization test method to develop the conditions for a lot acceptance test at high dose rate

  14. Testing the Control of Mineral Supply Rates on Chemical Erosion Rates in the Klamath Mountains

    Science.gov (United States)

    West, N.; Ferrier, K.

    2016-12-01

    The relationship between rates of chemical erosion and mineral supply is central to many problems in Earth science, including how tightly Earth's climate should be coupled to tectonics, how strongly nutrient supply to soils and streams depends on soil production, and how much lithology affects landscape evolution. Despite widespread interest in this relationship, there remains no consensus on how closely coupled chemical erosion rates should be to mineral supply rates. To address this, we have established a network of field sites in the Klamath Mountains along a latitudinal transect that spans an expected gradient in mineral supply rates associated with the geodynamic response to the migration of the Mendocino Triple Junction. Here, we present new measurements of regolith geochemistry and topographic analyses that will be compared with cosmogenic 10Be measurements to test hypotheses about supply-limited and kinetically-limited chemical erosion on granodioritic ridgetops. Previous studies in this area suggest a balance between rock uplift rates and basin wide erosion rates, implying the study ridgetops may have adjusted to an approximate steady state. Preliminary data are consistent with a decrease in chemical depletion fraction (CDF) with increasing ridgetop curvature. To the extent that ridgetop curvature reflects ridgetop erosion rates, this implies that chemical erosion rates at these sites are influenced by both mineral supply rates and dissolution kinetics.

  15. Characterization of strain rate sensitivity and activation volume using the indentation relaxation test

    International Nuclear Information System (INIS)

    Xu Baoxing; Chen Xi; Yue Zhufeng

    2010-01-01

    We present the possibility of extracting the strain rate sensitivity, activation volume and Helmholtz free energy (for dislocation activation) using just one indentation stress relaxation test, and the approach is demonstrated with polycrystalline copper. The Helmholtz free energy measured from indentation relaxation agrees well with that from the conventional compression relaxation test, which validates the proposed approach. From the indentation relaxation test, the measured indentation strain rate sensitivity exponent is found to be slightly larger, and the indentation activation volume much smaller, than their counterparts from the compression test. The results indicate the involvement of multiple dislocation mechanisms in the indentation test.

  16. Specificity and false positive rates of the Test of Memory Malingering, Rey 15-item Test, and Rey Word Recognition Test among forensic inpatients with intellectual disabilities.

    Science.gov (United States)

    Love, Christopher M; Glassmire, David M; Zanolini, Shanna Jordan; Wolf, Amanda

    2014-10-01

    This study evaluated the specificity and false positive (FP) rates of the Rey 15-Item Test (FIT), Word Recognition Test (WRT), and Test of Memory Malingering (TOMM) in a sample of 21 forensic inpatients with mild intellectual disability (ID). The FIT demonstrated an FP rate of 23.8% with the standard quantitative cutoff score. Certain qualitative error types on the FIT showed promise and had low FP rates. The WRT obtained an FP rate of 0.0% with previously reported cutoff scores. Finally, the TOMM demonstrated low FP rates of 4.8% and 0.0% on Trial 2 and the Retention Trial, respectively, when applying the standard cutoff score. FP rates are reported for a range of cutoff scores and compared with published research on individuals diagnosed with ID. Results indicated that although the quantitative variables on the FIT had unacceptably high FP rates, the TOMM and WRT had low FP rates, increasing the confidence clinicians can place in scores reflecting poor effort on these measures during ID evaluations. © The Author(s) 2014.

  17. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    Science.gov (United States)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensuring their optimum power production. Those properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements and international consensus for the appropriate techniques are in preparation. The reference materials used as a standard for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of optical properties of solar mirrors and absorbers.

  18. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    OpenAIRE

    Matha, Denis; Sandner, Frank; Molins i Borrell, Climent; Campos Hortigüela, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provide...

  19. Field Test Evaluation of Effect on Cone Resistance Caused by Change in Penetration Rate

    DEFF Research Database (Denmark)

    Poulsen, Rikke; Nielsen, Benjaminn Nordahl; Ibsen, Lars Bo

    2012-01-01

    This paper presents how a change in cone penetration rate affects the measured cone resistance during cone penetration testing in silty soils. Regardless of soil type, the standard rate of penetration is 20 mm/s and it is generally accepted that undrained penetration occurs in clay while drained penetration occurs in sand. In intermediate soils such as silty soils, the standard cone penetration rate may result in drainage conditions varying from undrained to partially or fully drained conditions. Field cone penetration tests have been conducted with different penetration rates on a test site ... in the laboratory. A change in the measured cone resistance occurs by lowering the penetration rate. This is caused by the changes in drainage conditions. Compared to the normal penetration rate of 20 mm/s, this paper illustrates that lowering the penetration rate leads to an increase in the cone resistance from 1 ...

  20. Performance and Abuse Testing of 5 Year Old Low Rate and Medium Rate Lithium Thionyl Chloride Cells

    Science.gov (United States)

    Frerker, Rick; Zhang, Wenlin; Jeevarajan, Judith; Bragg, Bobby J.

    2001-01-01

    Most cells survived the 3 amp (A) over-discharge at room temperature for 2 hours. The cell that failed was the LTC-114 after high rate discharge of 500 mA, similar to the results of the 1 A over-discharge test. Most cells opened during the 0.05 Ohm short circuit test without incident, but three LTC-111 cells exploded, apparently due to a lack of a thermal cutoff switch. The LTC-114 cells exposed to a hard short of 0.05 Ohms recovered but the LTC-114 cells exposed to a soft short of 1 Ohm did not. This is probably due to the activation of a resettable fuse during a hard short. Fresh cells tend to survive exposure to higher temperatures than cells previously discharged at high rate (1 Amp). LTC-111 cells tend to vent at lower temperatures than all the LTC-114 cells and the LTC-115 cells that were previously discharged at rates exceeding 1 Amp.

  1. Inner volume leakage during integrated leakage rate testing

    International Nuclear Information System (INIS)

    Glover, J.P.

    1987-01-01

    During an integrated leak rate test (ILRT), the containment structure is maintained at test pressure with most penetrations isolated. Since penetrations typically employ dual isolation, the possibility exists for the inner isolation to leak while the outer holds. In this case, the ILRT instrumentation system would indicate containment out-leakage when, in fact, only the inner volume between closures is being pressurized. The problem is compounded because this false leakage is not readily observable outside of containment by standard leak inspection techniques. The inner volume leakage eventually subsides after the affected volumes reach test pressure. Depending on the magnitude of leakage and the size of the volumes, equalization could occur prior to the end of the pretest stabilization period, or significant false leakages may persist throughout the entire test. Two simple analyses were performed to quantify the effects of inside volume leakages. First, a lower bound for the equalization time was found. A second analysis was performed to find an approximate upper bound for the stabilization time. The results of both analyses are shown

  2. Testing linear growth rate formulas of non-scale endogenous growth models

    NARCIS (Netherlands)

    Ziesemer, Thomas

    2017-01-01

    Endogenous growth theory has produced formulas for steady-state growth rates of income per capita which are linear in the growth rate of the population. Depending on the details of the models, slopes and intercepts are positive, zero or negative. Empirical tests have taken over the assumption of

  3. Testing a SEA methodology for the energy sector: a waste incineration tax proposal

    International Nuclear Information System (INIS)

    Nilsson, Maans; Bjoerklund, Anna; Finnveden, Goeran; Johansson, Jessica

    2005-01-01

    Most Strategic Environmental Assessment (SEA) research has been preoccupied with SEA as a procedure and there are relatively few developments and tests of analytical methodologies. This paper applies and tests an analytical framework for an energy sector SEA. In a case study on a policy proposal for waste-to-energy taxation in Sweden, it studies changes in the energy system as a result of implementing the suggested tax by testing three analytical pathways: an LCA pathway, a site-dependent pathway, and a qualitative pathway. In addition, several valuation methods are applied. The assessment indicates that there are some overall environmental benefits to introducing a tax, but that benefits are modest compared to the potential. The methods are discussed in relation to characteristics for effective policy learning and knowledge uptake. The application shows that in many ways they complement each other rather than substitute for each other. The qualitative pathway is useful for raising awareness and getting a comprehensive view of environmental issues, but has limited potential for decision support. The precision increased as we went to LCA and to site-dependent analysis, and a hierarchy emerged in which the qualitative pathway filled rudimentary functions whereas the site-dependent analysis gave more advanced decision support. All methods had limited potential in supporting a choice between alternatives unless data was aggregated through a valuation exercise

  4. Testing for long-range dependence in the Brazilian term structure of interest rates

    International Nuclear Information System (INIS)

    Cajueiro, Daniel O.; Tabak, Benjamin M.

    2009-01-01

    This paper presents empirical evidence of fractional dynamics in interest rates for different maturities for Brazil. A variation of a newly developed test for long-range dependence, the V/S statistic, with a post-blackening bootstrap is employed. Results suggest that Brazilian interest rates possess strong long-range dependence in volatility, even when considering the structural break in 1999. These findings imply that the development of policy models that give rise to long-range dependence in interest rates' volatility could be very useful. The long-short-term interest rates spread has strong long-range dependence, which suggests that traditional tests of expectation hypothesis of the term structure of interest rates may be misspecified.
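
    The V/S statistic and post-blackening bootstrap used in the paper are not reproduced here. As a simpler, hedged illustration of detecting long-range dependence, the sketch below estimates the classical rescaled-range (R/S) Hurst exponent, where values well above 0.5 suggest long memory; the input series is synthetic, not Brazilian interest rate data.

        import numpy as np

        # Simplified illustration of long-range dependence detection. The paper's
        # V/S statistic with post-blackening bootstrap is not reproduced; instead,
        # this sketch estimates the classical rescaled-range (R/S) Hurst exponent,
        # where values well above 0.5 suggest long memory. The series is synthetic.

        def rs_hurst(series, window_sizes):
            log_n, log_rs = [], []
            for n in window_sizes:
                rs_vals = []
                for start in range(0, len(series) - n + 1, n):
                    chunk = series[start:start + n]
                    dev = np.cumsum(chunk - chunk.mean())
                    spread = dev.max() - dev.min()
                    sd = chunk.std(ddof=1)
                    if sd > 0:
                        rs_vals.append(spread / sd)
                if rs_vals:
                    log_n.append(np.log(n))
                    log_rs.append(np.log(np.mean(rs_vals)))
            return np.polyfit(log_n, log_rs, 1)[0]  # slope is the Hurst estimate

        rng = np.random.default_rng(0)
        white_noise = rng.normal(size=4096)
        # Roughly 0.5-0.6 for white noise (small-sample bias pushes it slightly high)
        print(round(rs_hurst(white_noise, [16, 32, 64, 128, 256, 512]), 2))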

  5. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  6. Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development

    Science.gov (United States)

    1986-10-01

    parameter, sample size and fatigue test duration. The required inputs are: 1. residual strength Weibull shape parameter (ALPR); 2. fatigue life Weibull shape parameter (ALPL); 3. sample size (N); 4. fatigue test duration (T). (The remainder of the excerpt is a fragment of the interactive Fortran input routine that prompts for these values.)

  7. Failure modes induced by natural radiation environments on DRAM memories: study, test methodology and mitigation technique

    International Nuclear Information System (INIS)

    Bougerol, A.

    2011-05-01

    DRAMs are frequently used in space and aeronautic systems. Their sensitivity to cosmic radiations have to be known in order to satisfy reliability requirements for critical applications. These evaluations are traditionally done with particle accelerators. However, devices become more complex with technology integration. Therefore new effects appear, inducing longer and more expensive tests. There is a complementary solution: the pulsed laser, which triggers similar effects as particles. Thanks to these two test tools, main DRAM radiation failure modes were studied: SEUs (Single Event Upset) in memory blocks, and SEFIs (Single Event Functional Interrupt) in peripheral circuits. This work demonstrates the influence of test patterns on SEU and SEFI sensitivities depending on technology used. In addition, this study identifies the origin of the most frequent type of SEFIs. Moreover, laser techniques were developed to quantify sensitive surfaces of the different effects. This work led to a new test methodology for industry, in order to optimize test cost and efficiency using both pulsed laser beams and particle accelerators. Finally, a new fault tolerant technique is proposed: based on DRAM cell radiation immunity when discharged, this technique allows to correct all bits of a logic word. (author)

  8. Mirror-mark tests performed on jackdaws reveal potential methodological problems in the use of stickers in avian mark-test studies.

    Directory of Open Access Journals (Sweden)

    Manuel Soler

    Some animals are capable of recognizing themselves in a mirror, which is considered to be demonstrated by passing the mark test. Mirror self-recognition capacity has been found in just a few mammals having very large brains and only in one bird, the magpie (Pica pica). The results obtained in magpies have enormous biological and cognitive implications because the fact that magpies were able to pass the mark test meant that this species is at the same cognitive level as great apes, that mirror self-recognition has evolved independently in the magpie and great apes (which diverged 300 million years ago), and that the neocortex (which is not present in birds' brains) is not a prerequisite for mirror self-recognition as previously believed. Here, we have replicated the experimental design used on magpies to determine whether jackdaws (Corvus monedula) are also capable of mirror self-recognition by passing the mark test. We found that our nine jackdaws showed a very high interest towards the mirror and exhibited self-contingent behavior as soon as mirrors were introduced. However, jackdaws were not able to pass the mark test: both sticker-directed actions and sticker removal were performed with a similar frequency in both the cardboard (control) and the mirror conditions. We conclude that our jackdaws' behaviour raises non-trivial questions about the methodology used in the avian mark test. Our study suggests that the use of self-adhesive stickers on sensitive throat feathers may open the way to artefactual results because birds might perceive the stickers tactilely.

  9. Establishing a Ballistic Test Methodology for Documenting the Containment Capability of Small Gas Turbine Engine Compressors

    Science.gov (United States)

    Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.

    2009-01-01

    A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction and speed that would occur in a blade-out event. High-speed digital video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full-field displacement and strain information at the back side of the impact point.

  10. Demonstration of load rating capabilities through physical load testing : Sioux County bridge case study.

    Science.gov (United States)

    2013-08-01

    The objective of this work, Pilot Project - Demonstration of Capabilities and Benefits of Bridge Load Rating through Physical Testing, was to demonstrate the capabilities for load testing and rating bridges in Iowa, study the economic benefit of perf...

  11. Demonstration of load rating capabilities through physical load testing : Johnson County bridge case study.

    Science.gov (United States)

    2013-08-01

    The objective of this work, Pilot Project - Demonstration of Capabilities and Benefits of Bridge Load Rating through Physical Testing, was to demonstrate the capabilities for load testing and rating bridges in Iowa, study the economic benefit of perf...

  12. Demonstration of load rating capabilities through physical load testing : Ida County bridge case study.

    Science.gov (United States)

    2013-08-01

    The objective of this work, Pilot Project - Demonstration of Capabilities and Benefits of Bridge Load Rating through Physical Testing, was to demonstrate the capabilities for load testing and rating bridges in Iowa, study the economic benefit of perf...

  13. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are denoted
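
    The report's analytical development is not reproduced in the abstract. As a hedged sketch of the two bounding regimes it names, leakage through a viscous laminar path scales roughly linearly with differential pressure while turbulent leakage scales roughly with its square root, so a leak rate measured at test pressure can be bounded at accident pressure under each assumption; the numbers below are hypothetical, not 105-KE Basin values.

        import math

        # Hedged sketch of the two bounding flow regimes named in the abstract:
        # viscous laminar leakage scales roughly linearly with differential pressure,
        # turbulent leakage roughly with its square root, so a measured test leak
        # rate can be bounded at accident pressure under each assumption. Values are
        # hypothetical, not 105-KE Basin results.

        def scale_leak_rate(q_test, dp_test, dp_accident, regime):
            if regime == "laminar":
                return q_test * (dp_accident / dp_test)
            if regime == "turbulent":
                return q_test * math.sqrt(dp_accident / dp_test)
            raise ValueError("regime must be 'laminar' or 'turbulent'")

        q_measured = 2.0  # hypothetical leak rate at test conditions (arbitrary units)
        print(scale_leak_rate(q_measured, dp_test=5.0, dp_accident=20.0, regime="laminar"))    # 8.0
        print(scale_leak_rate(q_measured, dp_test=5.0, dp_accident=20.0, regime="turbulent"))  # 4.0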

  14. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology.

    Science.gov (United States)

    Saadat, Victoria M

    2015-01-01

    The dissolution of the USSR resulted in independence for constituent republics but left them battling an unstable economic environment and healthcare. Increases in injection drug use, prostitution, and migration were all widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. In this systematic review of the literature, Pubmed was searched for English-language studies using the key search terms "HIV", "AIDS", "human immunodeficiency virus", "acquired immune deficiency syndrome", "Central Asia", "Kazakhstan", "Kyrgyzstan", "Uzbekistan", "Tajikistan", "Turkmenistan", "Russia", "Ukraine", "Armenia", "Azerbaijan", and "Georgia". Studies were evaluated against eligibility criteria for inclusion. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as "risk" and "barriers". Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included that testing was inconvenient, and that results would not remain confidential. Frequent barriers to treatment were based on a distrust in the treatment system. The findings of this review reveal methodological limitations that span the existing studies. Small sample size, cross-sectional design, and non-probabilistic sampling methods were frequently

  15. A Methodological Report: Adapting the 505 Change-of-Direction Speed Test Specific to American Football.

    Science.gov (United States)

    Lockie, Robert G; Farzad, Jalilvand; Orjalo, Ashley J; Giuliano, Dominic V; Moreno, Matthew R; Wright, Glenn A

    2017-02-01

    Lockie, RG, Jalilvand, F, Orjalo, AJ, Giuliano, DV, Moreno, MR, and Wright, GA. A methodological report: Adapting the 505 change-of-direction speed test specific to American football. J Strength Cond Res 31(2): 539-547, 2017-The 505 involves a 10-m sprint past a timing gate, followed by a 180° change-of-direction (COD) performed over 5 m. This methodological report investigated an adapted 505 (A505) designed to be football-specific by changing the distances to 10 and 5 yd. Twenty-five high school football players (6 linemen [LM]; 8 quarterbacks, running backs, and linebackers [QB/RB/LB]; 11 receivers and defensive backs [R/DB]) completed the A505 and 40-yd sprint. The difference between A505 and 0 to 10-yd time determined the COD deficit for each leg. In a follow-up session, 10 subjects completed the A505 again and 10 subjects completed the 505. Reliability was analyzed by t-tests to determine between-session differences, typical error (TE), and coefficient of variation. Test usefulness was examined via TE and smallest worthwhile change (SWC) differences. Pearson's correlations calculated relationships between the A505 and 505, and A505 and COD deficit with the 40-yd sprint. A 1-way analysis of variance (p ≤ 0.05) derived between-position differences in the A505 and COD deficit. There were no between-session differences for the A505 (p = 0.45-0.76; intraclass correlation coefficient = 0.84-0.95; TE = 2.03-4.13%). Additionally, the A505 was capable of detecting moderate performance changes (SWC0.5 > TE). The A505 correlated with the 505 and 40-yard sprint (r = 0.58-0.92), suggesting the modified version assessed similar qualities. Receivers and defensive backs were faster than LM in the A505 for both legs, and right-leg COD deficit. Quarterbacks, running backs, and linebackers were faster than LM in the right-leg A505. The A505 is reliable, can detect moderate performance changes, and can discriminate between football position groups.
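
    A minimal sketch of the derived quantities described above, under common definitions: the change-of-direction (COD) deficit is the A505 time minus the 0-10 yd split, the typical error (TE) is the standard deviation of between-session differences divided by the square root of 2, and the smallest worthwhile change (SWC) is a multiple of the between-subject standard deviation; all times below are hypothetical, not the study data.

        import numpy as np

        # Minimal sketch of the derived variables, under common definitions: COD
        # deficit = A505 time minus the 0-10 yd split; typical error (TE) from
        # between-session differences; smallest worthwhile change (SWC) from the
        # between-subject spread. All times are hypothetical, not the study data.

        def cod_deficit(a505_time_s, split_10yd_s):
            return a505_time_s - split_10yd_s

        def typical_error(session1, session2):
            diffs = np.asarray(session1) - np.asarray(session2)
            return diffs.std(ddof=1) / np.sqrt(2)

        def smallest_worthwhile_change(scores, multiplier=0.5):
            return multiplier * np.asarray(scores).std(ddof=1)

        s1 = [2.41, 2.55, 2.38, 2.62, 2.47]  # hypothetical A505 times, session 1 (s)
        s2 = [2.44, 2.52, 2.40, 2.60, 2.50]  # session 2 (s)
        print(round(cod_deficit(2.41, 1.78), 2))                       # one athlete's COD deficit
        print(typical_error(s1, s2) < smallest_worthwhile_change(s1))  # usefulness check: SWC0.5 > TE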

  16. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in course of the project. The discussion and conclusions made in this particular report concern two issues mainly, (i) the use of numerical simulations as a means of gaining creditability, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations

  17. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during product development phase to verify whether its reliability satisfies the predetermined level within feasible test duration. The actual degradation from engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, the method for reliability demonstration by ADT with monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method by converting the required reliability level into allowable cumulative degradation in ADT and comparing the actual accumulative degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of decision variable in reliability demonstration under the constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with example on reliability demonstration of alloy product, and is applied to demonstrate the wear reliability within long service duration of spherical plain bearing in the end. - Highlights: • We present a reliability demonstration method by ADT for products with monotonic degradation process, which may be applied to verify reliability with long service life for products with monotonic degradation process within feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from the existed optimal ADT design for more accurate reliability estimation by different objective function and different constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
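
    A hedged sketch of the core idea, assuming a stationary Gamma process: degradation accumulates through independent Gamma-distributed increments, the reliability requirement is converted into an allowable cumulative degradation at the end of the accelerated test, and the observed cumulative degradation is compared against it. The shape and scale parameters and the allowable limit below are hypothetical, not the paper's bearing-wear values.

        import numpy as np

        # Hedged sketch of the core idea, assuming a stationary Gamma process:
        # degradation grows by independent Gamma-distributed increments, and the
        # observed cumulative degradation at the end of the accelerated test is
        # compared with an allowable limit derived from the reliability requirement.
        # Shape/scale parameters and the limit are hypothetical.

        rng = np.random.default_rng(42)

        def simulate_gamma_degradation(shape_rate_per_h, scale, hours, dt_h=1.0):
            """Cumulative degradation path sampled every dt_h hours."""
            n_steps = int(hours / dt_h)
            increments = rng.gamma(shape_rate_per_h * dt_h, scale, size=n_steps)
            return np.cumsum(increments)

        path = simulate_gamma_degradation(shape_rate_per_h=0.05, scale=0.002, hours=2000)
        allowable_cumulative_degradation = 0.30  # assumed demonstration limit (e.g. mm of wear)
        print(round(path[-1], 3), path[-1] <= allowable_cumulative_degradation)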

  18. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution

    International Nuclear Information System (INIS)

    Tregidgo, Daniel J.; West, Sarah E.; Ashmore, Mike R.

    2013-01-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. -- Highlights: •We investigated the validity of a simplified citizen science methodology. •Lichen abundance data were used to indicate nitrogenous air pollution. •Significant changes were detected beside busy roads with low background pollution. •The methodology detected major, but not subtle, contrasts in pollution. •Sensitivity of citizen science methods to environmental change must be evaluated. -- A simplified lichen biomonitoring method used for citizen science can detect the impact of nitrogenous air pollution from local roads

  19. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1993-01-01

    The use of multidisciplinary teams to develop Type B shipping containers improves the quality and reliability of these reusable packagings. Including the people involved in all aspects of the design, certification and use of the package leads to more innovative, user-friendly containers. Concurrent use of testing and analysis allows engineers to more fully characterize a shipping container's responses to the environments given in the regulations, and provides a strong basis for certification. The combination of the input and output of these efforts should provide a general methodology that designers of Type B radioactive material shipping containers can utilize to optimize and certify their designs. (J.P.N.)

  20. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersoekning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline

  1. Policy Implications for Continuous Employment Decisions of High School Principals: An Alternative Methodological Approach for Using High-Stakes Testing Outcomes

    Science.gov (United States)

    Young, I. Phillip; Fawcett, Paul

    2013-01-01

    Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specially addressed are…

  2. The Term Structure of Interest Rates and its Impact on the Liability Adequacy Test for Insurance Companies in Brazil

    Directory of Open Access Journals (Sweden)

    Antonio Aurelio Duarte

    2015-08-01

    The Brazilian regulation for applying the Liability Adequacy Test (LAT) to technical provisions in insurance companies requires that the current estimate is discounted by a term structure of interest rates (hereafter TSIR). This article aims to analyze the LAT results derived from the use of various models to build the TSIR: the cubic spline interpolation technique, Svensson's model (adopted by the regulator) and Vasicek's model. In order to achieve the proposed objective, the exchange rates of BM&FBOVESPA trading days were used to model the TSIR and, consequently, to discount the cash flow of the insurance company. The results indicate that: (i) LAT is sensitive to the choice of the model used to build the TSIR; (ii) this sensitivity increases with cash flow longevity; (iii) the adoption of an ultimate forward rate (UFR) for the Brazilian insurance market should be evaluated by the regulator, in order to stabilize the trajectory of the yield curve at longer maturities. The technical provision is among the main solvency items of insurance companies and the LAT result is a significant indicator of the quality of this provision, as this evaluates its sufficiency or insufficiency. Thus, this article bridges a gap in the Brazilian actuarial literature, introducing the main methodologies available for modeling the yield curve and a practical application to analyze the impact of its choice on LAT.
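
    A minimal sketch, assuming the Svensson parametric form mentioned above: the spot rate y(t) is built from parameters (b0, b1, b2, b3, tau1, tau2) and used to discount an insurance cash flow, which is the discounting step of the liability adequacy test. All parameter values and cash flows below are hypothetical, not fitted to BM&FBOVESPA data.

        import numpy as np

        # Minimal sketch, assuming the Svensson parametric form of the spot curve:
        # y(t) = b0 + b1*(1-exp(-t/tau1))/(t/tau1)
        #           + b2*((1-exp(-t/tau1))/(t/tau1) - exp(-t/tau1))
        #           + b3*((1-exp(-t/tau2))/(t/tau2) - exp(-t/tau2)),
        # used here to discount a hypothetical insurance cash flow. Parameter values
        # and cash flows are illustrative, not fitted to BM&FBOVESPA data.

        def svensson_rate(t, b0, b1, b2, b3, tau1, tau2):
            x1, x2 = t / tau1, t / tau2
            term1 = (1 - np.exp(-x1)) / x1
            term2 = term1 - np.exp(-x1)
            term3 = (1 - np.exp(-x2)) / x2 - np.exp(-x2)
            return b0 + b1 * term1 + b2 * term2 + b3 * term3

        def present_value(cash_flows, times_yr, params):
            rates = np.array([svensson_rate(t, *params) for t in times_yr])
            return float(np.sum(cash_flows / (1.0 + rates) ** times_yr))

        params = (0.11, -0.02, 0.01, 0.005, 2.0, 10.0)         # hypothetical parameters
        times = np.array([1.0, 5.0, 10.0, 20.0, 30.0])         # years
        flows = np.array([100.0, 100.0, 100.0, 100.0, 100.0])  # hypothetical claims outgo
        print(round(present_value(flows, times, params), 2))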

  3. A facility for the test of large area muon chambers at high rates

    CERN Document Server

    Agosteo, S; Belli, G; Bonifas, A; Carabelli, V; Gatignon, L; Hessey, N P; Maggi, M; Peigneux, J P; Reithler, H; Silari, Marco; Vitulo, P; Wegner, M

    2000-01-01

    Operation of large area muon detectors at the future Large Hadron Collider (LHC) will be characterized by large sustained hit rates over the whole area, reaching the range of kHz/cm². We describe a dedicated test zone built at CERN to test the performance and the aging of the muon chambers currently under development. A radioactive source delivers photons causing the sustained rate of random hits, while a narrow beam of high energy muons is used to directly calibrate the detector performance. A system of remotely controlled lead filters serves to vary the rate of photons over four orders of magnitude, to allow the study of performance as a function of rate.

  4. A facility for the test of large-area muon chambers at high rates

    Energy Technology Data Exchange (ETDEWEB)

    Agosteo, S.; Altieri, S.; Belli, G.; Bonifas, A.; Carabelli, V.; Gatignon, L.; Hessey, N.; Maggi, M.; Peigneux, J.-P.; Reithler, H. E-mail: hans.reithler@cern.ch; Silari, M.; Vitulo, P.; Wegner, M

    2000-09-21

    Operation of large-area muon detectors at the future Large Hadron Collider (LHC) will be characterized by large sustained hit rates over the whole area, reaching the range of kHz cm⁻². We describe a dedicated test zone built at CERN to test the performance and the aging of the muon chambers currently under development. A radioactive source delivers photons causing the sustained rate of random hits, while a narrow beam of high-energy muons is used to directly calibrate the detector performance. A system of remotely controlled lead filters serves to vary the rate of photons over four orders of magnitude, to allow the study of performance as a function of rate. (authors)

  5. Standard practice for measurement of the glass dissolution rate using the single-pass flow-through test method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice describes a single-pass flow-through (SPFT) test method that can be used to measure the dissolution rate of a homogeneous silicate glass, including nuclear waste glasses, in various test solutions at temperatures less than 100°C. Tests may be conducted under conditions in which the effects from dissolved species on the dissolution rate are minimized to measure the forward dissolution rate at specific values of temperature and pH, or to measure the dependence of the dissolution rate on the concentrations of various solute species. 1.2 Tests are conducted by pumping solutions in either a continuous or pulsed flow mode through a reaction cell that contains the test specimen. Tests must be conducted at several solution flow rates to evaluate the effect of the flow rate on the glass dissolution rate. 1.3 This practice excludes static test methods in which flow is simulated by manually removing solution from the reaction cell and replacing it with fresh solution. 1.4 Tests may be conducted wit...
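
    A short sketch of the data reduction typically applied to SPFT results: the normalized elemental dissolution rate is NR_i = (C_i - C_background)*q / (f_i*S), with C the effluent concentration, q the solution flow rate, f_i the mass fraction of element i in the glass, and S the specimen surface area; the numerical values below are hypothetical.

        # Minimal sketch of the normalized dissolution rate commonly computed from
        # SPFT data: NR_i = (C_i - C_background) * q / (f_i * S). The flow rate,
        # concentrations, mass fraction and surface area are hypothetical values.

        def spft_dissolution_rate(c_effluent_g_m3, c_background_g_m3,
                                  flow_m3_s, mass_fraction, surface_area_m2):
            """Normalized dissolution rate in g/(m^2*s) based on element i."""
            return ((c_effluent_g_m3 - c_background_g_m3) * flow_m3_s
                    / (mass_fraction * surface_area_m2))

        # Example: boron release from a borosilicate glass coupon
        rate = spft_dissolution_rate(c_effluent_g_m3=0.50, c_background_g_m3=0.02,
                                     flow_m3_s=2.0e-9, mass_fraction=0.03,
                                     surface_area_m2=4.0e-4)
        print(f"{rate:.2e} g/(m2*s)")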

  6. Reactor building pressure proof test (PPT) and leak rate test (LRT) of Qinshan phase III (CANDU) project

    International Nuclear Information System (INIS)

    Gu Jun; Shi Jinqi; Fan Fuping

    2004-12-01

    As the first reactor building (R/B) in China without a stainless steel liner, the Qinshan Phase III containment has particular characteristics, such as a strong concrete air absorption/release effect and poor containment penetrations. TQNPC studied these containment characteristics, carefully prepared the test scheme and emergency response, creatively introduced an instrument air self-supply system inside the reactor building, developed a special measurement and analysis system for the PPT and LRT, and organized the large-scale, high-pressure work in the test. The containment leak rate result was finally obtained, and the test-cost-time value is the best among all tests of the same type. (authors)

  7. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    International Nuclear Information System (INIS)

    Andersson, Johan; Berglund, Johan; Follin, Sven; Hakami, Eva; Halvarson, Jan; Hermanson, Jan; Laaksoharju, Marcus; Rhen, Ingvar; Wahlgren, C.H.

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline and after this

  8. Effect of the Torrance Creative Thinking Test on Heart Rate Signal Features

    Directory of Open Access Journals (Sweden)

    Zakeri S.

    2016-02-01

    Aims: As a meta-cognitive ability, creativity is related to higher mental processes such as thinking, intelligence, imagination, and information processing. There are many studies on the physiological bases of creativity. The aim of this study was to investigate the effects of creative thinking on the heart rate signal. Materials & Methods: In this semi-experimental study, 52 medical engineering, electrical, and control students of Sahand University were studied in 2012. The subjects were selected via an accessibility (convenience) sampling method. To assess the level of the students' creative thinking, the Torrance Tests of Creative Thinking (B form; figural) were used. Before and during the creative thinking test, the heart signal at rest was recorded with a 1000 Hz sampling frequency. Data were analyzed using the Wilcoxon non-parametric test. Findings: The mean heart power amplitude was higher during creative thinking than in the rest position. However, as time passed and the last sessions of the creativity test were conducted, it decreased. The heart rate was higher in persons with high creativity than in those with low creativity. In addition, based on the test scores, creativity levels were higher in females and trilingual persons than in males and bilingual persons, respectively. The heart rate increase was greater in females than in males (p=0.0398). Nevertheless, there was no significant difference between trilingual and bilingual persons (p>0.05). Conclusion: Creative thinking results in a greater increase in heart rate in persons with high creativity than in persons with low creativity. Therefore, the creativity level can be detected via the heart signal.

  9. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For achievement of reliable results, the methodological process and report quality are crucial. This review provides the first examination of methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer advice for authors, readers, and editors. A total of 242 survival analysis articles were selected for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used the parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% and the lowest was 75.25%, while 55 articles were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of 100 articles which reported a loss to follow-up had stated how to treat it in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption; no report of testing for interactions and collinearity between independent variables; no report of the calculation method of sample size. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make potentially inaccurate

  10. Determining the ventilation and aerosol deposition rates from routine indoor-air measurements.

    Science.gov (United States)

    Halios, Christos H; Helmis, Costas G; Deligianni, Katerina; Vratolis, Sterios; Eleftheriadis, Konstantinos

    2014-01-01

    Measurement of air exchange rate provides critical information in energy and indoor-air quality studies. Continuous measurement of ventilation rates is a rather costly exercise and requires specific instrumentation. In this work, an alternative methodology is proposed and tested, where the air exchange rate is calculated by utilizing routine indoor and outdoor measurements of a common pollutant such as SO2, and the uncertainties induced in the calculations are analytically determined. The application of this methodology is demonstrated for three residential microenvironments in Athens, Greece, and the results are also compared against ventilation rates calculated from differential pressure measurements. The calculated time-resolved ventilation rates were applied to the mass balance equation to estimate the particle loss rate, which was found to agree with literature values at an average of 0.50 h⁻¹. The proposed method was further evaluated by applying a mass balance numerical model for the calculation of the indoor aerosol number concentrations, using the previously calculated ventilation rate, the outdoor measured number concentrations and the particle loss rates as input values. The model results for the indoor concentrations were found to compare well with the experimentally measured values.
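
    A minimal sketch of the mass-balance idea is given below; it assumes no indoor sources and negligible deposition for the gas tracer and is not the paper's exact algorithm or uncertainty treatment.

```python
# Simplified sketch of a mass-balance estimate of the air exchange rate from
# routine indoor/outdoor measurements of a tracer-like pollutant (e.g. SO2).
# Assumes no indoor sources and negligible deposition; the paper's own
# treatment (and its uncertainty analysis) is more elaborate.
import numpy as np

def air_exchange_rate(c_in, c_out, dt_hours):
    """Least-squares fit of dC_in/dt = lambda * (C_out - C_in); returns lambda in 1/h."""
    c_in, c_out = np.asarray(c_in, float), np.asarray(c_out, float)
    dcdt = np.gradient(c_in, dt_hours)            # numerical derivative of indoor series
    driving = c_out - c_in                        # concentration difference driving exchange
    return float(np.sum(driving * dcdt) / np.sum(driving ** 2))

# Synthetic hourly concentrations (micrograms per cubic metre)
t = np.arange(0, 24, 1.0)
c_out = 20 + 5 * np.sin(t / 4)
true_lambda = 0.5                                 # 1/h, used only to generate the data
c_in = np.zeros_like(t)
for i in range(1, t.size):
    c_in[i] = c_in[i-1] + true_lambda * (c_out[i-1] - c_in[i-1]) * (t[i] - t[i-1])

print(f"estimated air exchange rate: {air_exchange_rate(c_in, c_out, 1.0):.2f} 1/h")
```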

  11. Random breath testing in Queensland and Western Australia: examination of how the random breath testing rate influences alcohol related traffic crash rates.

    Science.gov (United States)

    Ferris, Jason; Mazerolle, Lorraine; King, Mark; Bates, Lyndel; Bennett, Sarah; Devaney, Madonna

    2013-11-01

    In this paper we explore the relationship between monthly random breath testing (RBT) rates (per 1000 licensed drivers) and alcohol-related traffic crash (ARTC) rates over time, across two Australian states: Queensland and Western Australia. We analyse the RBT, ARTC and licensed driver rates across 12 years; however, due to administrative restrictions, we model ARTC rates against RBT rates for the period July 2004 to June 2009. The Queensland data reveal that the monthly ARTC rate is almost flat over the five year period. Based on the results of the analysis, an average of 5.5 ARTCs per 100,000 licensed drivers is observed across the study period. For the same period, the monthly rate of RBTs per 1000 licensed drivers is observed to be decreasing across the study, with the results of the analysis revealing no significant variations in the data. The comparison between Western Australia and Queensland shows that Queensland's ARTC monthly percent change (MPC) is 0.014 compared to the MPC of 0.47 for Western Australia. While Queensland maintains a relatively flat ARTC rate, the ARTC rate in Western Australia is increasing. Our analysis reveals an inverse relationship between ARTC and RBT rates: for every 10% increase in the ratio of RBTs to licensed drivers there is a 0.15 decrease in the rate of ARTCs per 100,000 licensed drivers. Moreover, in Western Australia, if the 2011 ratio of 1:2 (RBTs to annual number of licensed drivers) were to double to a ratio of 1:1, we estimate the number of monthly ARTCs would reduce by approximately 15. Based on these findings we believe that as the number of RBTs conducted increases, the number of drivers willing to risk being detected for drink driving decreases, because the perceived risk of being detected is considered greater. This in turn results in the number of ARTCs diminishing. The results of this study provide an important evidence base for policy decisions for RBT operations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Methodological quality of systematic reviews on influenza vaccination.

    Science.gov (United States)

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on AMSTAR-score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR-score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Methodologies for certification of transuranic waste packages

    International Nuclear Information System (INIS)

    Christensen, R.N.; Kok, K.D.

    1980-10-01

    The objective of this study was to postulate methodologies for certification that a waste package is acceptable for disposal in a licensed geologic repository. Within the context of this report, certification means the overall process which verifies that a waste package meets the criteria or specifications established for acceptance for disposal in a repository. The overall methodology for certification will include (1) certifying authorities, (2) tests and procedures, and (3) documentation and quality assurance programs. Each criterion will require a methodology that is specific to that criterion. In some cases, different waste forms will require a different methodology. The purpose of postulating certification methodologies is to provide additional information as to what changes, if any, are needed for the TRU waste in storage.

  14. Regional labour market research on participation rates

    NARCIS (Netherlands)

    Elhorst, J.P.

    1996-01-01

    This article reviews the methodology of 17 empirical studies in which the participation rate has been estimated with the help of regional data. After defining and pointing out the orientation of regional labour market research on participation rates, three methodological issues dominate the discussion.

  15. Pre-service proof pressure and leak rate tests for the Qinshan CANDU project reactor buildings

    International Nuclear Information System (INIS)

    Petrunik, K.J.; Khan, A.; Ricciuti, R.; Ivanov, A.; Chen, S.

    2003-01-01

    The Qinshan CANDU Project Reactor Buildings (Units 1 and 2) have been successfully tested for the Pre-Service Proof Pressure and Integrated Leak Rate Tests. The Unit 1 tests took place from May 3 to May 9, 2002 and from May 22 to May 25, 2002, and the Unit 2 tests took place from January 21 to January 27, 2003. This paper discusses the significant steps taken at minimum cost on the Qinshan CANDU Project, which resulted in a) a very good leak rate (0.21%) for Unit 1 and an excellent leak rate (0.130%) for Unit 2; b) continuous monitoring of the structural behaviour during the Proof Pressure Test, thus eliminating any repeat of the structural test due to lack of data; and c) a significant schedule reduction achieved for these tests in Unit 2. (author)

  16. Portal monitor evaluation and test procedure

    International Nuclear Information System (INIS)

    Johnson, L.O.; Gupta, V.P.; Stevenson, R.L.; Rich, B.L.

    1983-10-01

    The purpose was to develop techniques and procedures to allow users to measure the performance and sensitivity of portal monitors. Additionally, a methodology was developed to assist users in optimizing monitor performance. The two monitors tested utilized thin-window gas-flow proportional counters sensitive to beta and gamma radiation. Various tests were performed: a) background count rate and its statistical variability, b) detector efficiency at different distances, c) moving source sensitivity for various source sizes and speeds, and d) false alarm rates at different background levels. A model of the monitor response to a moving source was developed so that its predictions could be compared with the measured data, and to test whether it is possible to adequately model the behavior of a portal monitor's response to a moving source. The model results were compared with the actual test results. A procedure for testing portal monitors is also given. 1 reference, 9 figures, 8 tables
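
    As a rough orientation (not the report's model), the response to a moving source can be sketched with simple counting statistics; every number below is an assumption.

```python
# Hedged sketch of a simple counting-statistics model for a portal monitor's
# response to a moving source; it is not the model developed in the report.
# The alarm threshold is set from the background count statistics and the
# detection probability is evaluated with Poisson statistics.
from scipy.stats import poisson

background_cps = 400.0      # background count rate (counts/s), assumed
source_cps = 120.0          # net count rate from the source at the portal, assumed
field_width_m = 1.0         # effective detector field of view along the walk path
speed_mps = 1.2             # walking speed
k_sigma = 4.0               # alarm threshold in standard deviations above background

dwell = field_width_m / speed_mps                    # time the source spends in view
b = background_cps * dwell                           # expected background counts
threshold = b + k_sigma * b ** 0.5                   # alarm level in counts

false_alarm_prob = poisson.sf(threshold, b)          # P(background alone exceeds threshold)
detection_prob = poisson.sf(threshold, b + source_cps * dwell)
print(f"dwell {dwell:.2f} s, P(false alarm) {false_alarm_prob:.2e}, P(detect) {detection_prob:.2f}")
```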

  17. Enhanced low dose rate radiation effect test on typical bipolar devices

    International Nuclear Information System (INIS)

    Liu Minbo; Chen Wei; Yao Zhibin; He Baoping; Huang Shaoyan; Sheng Jiangkun; Xiao Zhigang; Wang Zujun

    2014-01-01

    Two types of bipolar transistors and nine types of bipolar integrated circuits were selected for irradiation experiments at different 60Co γ dose rates. The base current of the bipolar transistors and the input bias current of the amplifiers and comparators were measured, and the low dose rate enhancement factor of each test device was obtained. The results show that the bipolar devices exhibit enhanced low dose rate sensitivity, that the enhancement factor of the bipolar integrated circuits was larger than that of the transistors, and that the enhanced low dose rate sensitivity varied greatly with the structure and process of the bipolar devices. (authors)

  18. Tensile strength of concrete under static and intermediate strain rates: Correlated results from different testing methods

    International Nuclear Information System (INIS)

    Wu Shengxing; Chen Xudong; Zhou Jikai

    2012-01-01

    Highlights: ► Tensile strength of concrete increases with increase in strain rate. ► Strain rate sensitivity of tensile strength of concrete depends on test method. ► High stressed volume method can correlate results from various test methods. - Abstract: This paper presents a comparative experiment and analysis of three different methods (direct tension, splitting tension and four-point loading flexural tests) for determination of the tensile strength of concrete under low and intermediate strain rates. In addition, the objective of this investigation is to analyze the suitability of the high stressed volume approach and Weibull effective volume method to the correlation of the results of different tensile tests of concrete. The test results show that the strain rate sensitivity of tensile strength depends on the type of test, splitting tensile strength of concrete is more sensitive to an increase in the strain rate than flexural and direct tensile strength. The high stressed volume method could be used to obtain a tensile strength value of concrete, free from the influence of the characteristics of tests and specimens. However, the Weibull effective volume method is an inadequate method for describing failure of concrete specimens determined by different testing methods.
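
    For orientation, the Weibull effective-volume argument used to compare strengths from different test types can be sketched as below; all numerical values are assumptions for illustration, not data from the paper.

```python
# Hedged illustration of the Weibull size effect used to correlate tensile
# strengths from different test types: strengths from two specimens are
# predicted to satisfy sigma_1 / sigma_2 = (V_eff_2 / V_eff_1)**(1/m),
# where m is the Weibull modulus. Values below are illustrative assumptions.
def scale_strength(sigma_ref, v_eff_ref, v_eff_target, weibull_modulus):
    """Predict the strength of a specimen with effective volume v_eff_target."""
    return sigma_ref * (v_eff_ref / v_eff_target) ** (1.0 / weibull_modulus)

sigma_split = 3.5        # MPa, splitting tensile strength (assumed)
v_eff_split = 2.0e5      # mm^3, effective volume of the splitting specimen (assumed)
v_eff_direct = 1.0e6     # mm^3, effective volume of the direct-tension specimen (assumed)
m = 12.0                 # Weibull modulus, order of magnitude typical of concrete (assumed)

print(f"predicted direct tensile strength: {scale_strength(sigma_split, v_eff_split, v_eff_direct, m):.2f} MPa")
```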

  19. Simplified method of 'push-pull' test data analysis for determining in situ reaction rate coefficients

    International Nuclear Information System (INIS)

    Haggerty, R.; Schroth, M.H.; Istok, J.D.

    1998-01-01

    The single-well 'push-pull' test method is useful for obtaining information on a wide variety of aquifer physical, chemical, and microbiological characteristics. A push-pull test consists of the pulse-type injection of a prepared test solution into a single monitoring well, followed by the extraction of the test solution/ground water mixture from the same well. The test solution contains a conservative tracer and one or more reactants selected to investigate a particular process. During the extraction phase, the concentrations of the tracer, reactants, and possible reaction products are measured to obtain breakthrough curves for all solutes. This paper presents a simplified method of data analysis that can be used to estimate a first-order reaction rate coefficient from these breakthrough curves. Rate coefficients are obtained by fitting a regression line to a plot of normalized concentrations versus elapsed time, requiring no knowledge of aquifer porosity, dispersivity, or hydraulic conductivity. A semi-analytical solution to the advection-dispersion equation is derived and used in a sensitivity analysis to evaluate the ability of the simplified method to estimate reaction rate coefficients in simulated push-pull tests in a homogeneous, confined aquifer with a fully penetrating injection/extraction well and varying porosity, dispersivity, test duration, and reaction rate. A numerical flow and transport code (SUTRA) is used to evaluate the ability of the simplified method to estimate reaction rate coefficients in simulated push-pull tests in a heterogeneous, unconfined aquifer with a partially penetrating well. In all cases the simplified method provides accurate estimates of reaction rate coefficients; estimation errors ranged from 0.1 to 8.9%, with most errors less than 5%.
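
    A minimal sketch of the regression step is shown below, assuming the rate coefficient appears as the slope of ln(normalized reactant / normalized tracer) versus elapsed time; the paper's exact normalization and time convention may differ.

```python
# Hedged sketch: estimating a first-order rate coefficient k from push-pull
# breakthrough data by linear regression of ln(normalized reactant / normalized
# tracer) against elapsed time. Synthetic data; dilution cancels in the ratio.
import numpy as np

t = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 12.0])                 # hours since end of injection
tracer = np.array([0.80, 0.55, 0.30, 0.18, 0.11, 0.05])       # C/C0, conservative tracer
reactant = np.array([0.74, 0.47, 0.23, 0.12, 0.068, 0.026])   # C/C0, reactive solute

y = np.log(reactant / tracer)
slope, intercept = np.polyfit(t, y, 1)
print(f"estimated first-order rate coefficient k = {-slope:.3f} 1/h")
```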

  20. Using An Adapter To Perform The Chalfant-Style Containment Vessel Periodic Maintenance Leak Rate Test

    International Nuclear Information System (INIS)

    Loftin, B.; Abramczyk, G.; Trapp, D.

    2011-01-01

    Recently the Packaging Technology and Pressurized Systems (PT and PS) organization at the Savannah River National Laboratory was asked to develop an adapter for performing the leak-rate test of a Chalfant-style containment vessel. The PT and PS organization collaborated with designers at the Department of Energy's Pantex Plant to develop the adapter currently in use for performing the leak-rate testing on the containment vessels. This paper will give the history of leak-rate testing of the Chalfant-style containment vessels, discuss the design concept for the adapter, give an overview of the design, and will present results of the testing done using the adapter.

  1. Supplement to a Methodology for Succession Planning for Technical Experts

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cain, Ronald A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Agreda, Carla L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-01

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a draft methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, and the methodology is tested through interviews with selected subject matter experts.

  2. To determine the slow shearing rate for consolidated drained shear box tests

    Science.gov (United States)

    Jamalludin, Damanhuri; Ahmad, Azura; Nordin, Mohd Mustaqim Mohd; Hashim, Mohamad Zain; Ibrahim, Anas; Ahmad, Fauziah

    2017-08-01

    Slope failures always occur in Malaysia, especially during the rainy seasons. They cause damage to property and result in fatalities. In this study, a total of 24 one-dimensional consolidation tests were carried out on soil samples taken from 16 slope failures in Penang Island and in Baling, Kedah. The slope failures in Penang Island are within the granitic residual soil while those in Baling, Kedah are situated within the sedimentary residual soil. Most of the disturbed soil samples were taken at a depth of 100 mm from the existing soil surface, while some soil samples were also taken at depths of 400, 700 and 1000 mm from the existing soil surface. They were immediately placed in two layers of plastic bags to prevent moisture loss. Field bulk density tests were also carried out at all the locations where soil samples were taken. The field bulk density results were later used to re-compact the soil samples for the consolidation tests. The objective of the research is to determine the slow shearing rate to be used in consolidated drained shear box tests for residual soils taken from slope failures, so that the effective shear strength parameters can be determined. One-dimensional consolidation tests were used to determine the slow shearing rate. The slow shearing rate found in this study to be used in the consolidated drained shear box tests, especially for Northern Malaysian residual soils, was 0.286 mm/minute.

  3. Smart integrated containment leakage rate test system using wireless communication

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Hwan; Lee, Sang Yong; Kim, Jung Sun; Kim, Gun Soo; Kim, Jong Myeong; Ahn, Jong Han [Research and Development Center, Ulsan (Korea, Republic of)

    2012-10-15

    The Integrated Leakage Rate Test (ILRT) is an important test of the leak-tightness and integrity of the containment building, which is the last barrier when design basis accidents (DBA) occur at a nuclear power plant. Since the result of this test is the basis for guaranteeing the safety of nuclear power plants, the test process, test procedure, and test equipment are required to have high reliability. The test devices previously used have been products of VOLUMERTRICS and GRAFTEL of the USA. These devices have been inconvenient to calibrate and use. Thus, improved devices needed to be developed to remove these inconveniences, to verify the safety of Korean nuclear power plants with Korea's own technology, and to secure core technology. A new leak test system was developed with domestic technology for that purpose and needed to be verified. In this paper, technical details of the newly developed, easy-to-use and highly reliable measuring test device, which is in operation at nuclear power plant sites, are introduced. State-of-the-art technology was applied to the device to address the shortcomings of the previous US-made devices and the difficulties of using them on site.
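
    The "mass point" idea underlying integrated leakage rate testing can be sketched very roughly as below; this is illustrative only and not the algorithm of the system described (which must also treat vapour pressure, sensor weighting and statistical acceptance criteria).

```python
# Highly simplified sketch of the mass-point idea behind an integrated leakage
# rate test: the contained dry-air mass is proportional to P/T (ideal gas), and
# the leakage rate follows from the slope of a linear fit of that mass-like
# quantity against time. All numbers are synthetic and illustrative.
import numpy as np

hours = np.arange(0, 24.0, 1.0)
p_kpa = 446.0 - 0.009 * hours + np.random.default_rng(2).normal(0, 0.003, hours.size)
t_kelvin = 300.0 + 0.02 * np.sin(hours / 3.0)

m_proxy = p_kpa / t_kelvin                       # proportional to contained air mass
slope, intercept = np.polyfit(hours, m_proxy, 1)

leak_rate_pct_per_day = -slope / intercept * 24.0 * 100.0
print(f"estimated leakage rate: {leak_rate_pct_per_day:.3f} % mass per day")
```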

  4. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    Energy Technology Data Exchange (ETDEWEB)

    Anooshehpoor, Rasool; Purvance, Matthew D.; Brune, James N.; Preston, Leiph A.; Anderson, John G.; Smith, Kenneth D.

    2006-09-29

    This report covers the following projects: shake table tests of the precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, a study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of the uncertainty in the mean of the attenuation relations. The study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.

  5. Framed bit error rate testing for 100G ethernet equipment

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Ruepp, Sarah Renée; Berger, Michael Stübert

    2010-01-01

    rate. As the need for 100 Gigabit Ethernet equipment rises, so does the need for equipment, which can properly test these systems during development, deployment and use. This paper presents early results from a work-in-progress academia-industry collaboration project and elaborates on the challenges...
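
    As a generic side note (not from the paper), the amount of error-free traffic a bit error rate test must observe to demonstrate a target BER at a given confidence level can be estimated as sketched below.

```python
# Hedged, generic calculation: with zero errors observed, demonstrating
# BER < target at confidence CL requires roughly n = -ln(1 - CL) / target bits,
# which at a 100 Gbit/s line rate translates into the test times printed below.
import math

line_rate_bps = 100e9          # 100 Gigabit Ethernet
confidence = 0.95

for target_ber in (1e-12, 1e-13, 1e-14):
    bits_needed = -math.log(1.0 - confidence) / target_ber
    seconds = bits_needed / line_rate_bps
    print(f"BER < {target_ber:.0e} at {confidence:.0%} confidence: "
          f"{bits_needed:.2e} bits, about {seconds:.1f} s of error-free traffic")
```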

  6. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for use with systems composed of reconfigurable underwater modules.

  7. Engineering analysis of mass flow rate for turbine system control and design

    International Nuclear Information System (INIS)

    Yoo, Yong H.; Suh, Kune Y.

    2011-01-01

    Highlights: → A computer code is written to predict the steam mass flow rate through valves. → A test device is built to study the steam flow characteristics in the control valve. → Mass flow based methodology eases the programming and experimental procedures. → The methodology helps express the characteristics of each device of a turbine system. → The results can commercially be used for design and operation of the turbine system. - Abstract: The mass flow rate in the steam turbine system is determined by the area formed between the stem disk and the seat of the control valve. For precise control, the steam mass flow rate should be known for a given stem lift. However, since the thermal hydraulic characteristics of steam coming from the generator or boiler are changed going through each device, it is hard to accurately predict the steam mass flow rate. Thus, to precisely determine the steam mass flow rate, a methodology and theory are developed for designing the turbine systems manufactured for nuclear and fossil power plants. From the steam generator or boiler to the first row of turbine blades, the steam passes through a stop valve, a control valve and the first nozzle, each of which is connected with piping. The corresponding steam mass flow rate can ultimately be computed if the thermal and hydraulic conditions are defined at the stop valve, control valve and pipes. The steam properties at the inlet of each device are changed at its outlet due to geometry. The Compressed Adiabatic Massflow Analysis (CAMA) computer code is written to predict the steam mass flow rate through valves. The Valve Engineered Layout Operation (VELO) test device is built to experimentally study the flow characteristics of steam flowing inside the control valve with the CAMA input data. The Widows' Creek type control valve was selected as reference. CAMA is expected to be commercially utilized to accurately design and operate the turbine system for fossil as well as nuclear power plants.
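
    For orientation only, the textbook compressible-flow relation behind this kind of valve calculation is sketched below; it is not the CAMA code, and all numerical values are assumptions (real valve models add discharge coefficients, steam property tables, and losses).

```python
# Hedged sketch: ideal-gas isentropic mass flow through a valve or nozzle area,
# with a choked-flow check. Illustrative of the kind of calculation a code such
# as CAMA performs, but not CAMA itself.
import math

def mass_flow(p0, t0, p_back, area, gamma=1.3, r_gas=461.5, cd=0.85):
    """Isentropic mass flow [kg/s] through `area` [m^2] from stagnation p0 [Pa], t0 [K]."""
    pr_crit = (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))
    pr = p_back / p0
    if pr <= pr_crit:       # choked: downstream pressure no longer matters
        flow_func = math.sqrt(gamma) * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    else:                    # subcritical expansion down to the back pressure
        flow_func = math.sqrt(2.0 * gamma / (gamma - 1.0)
                              * (pr ** (2.0 / gamma) - pr ** ((gamma + 1.0) / gamma)))
    return cd * area * p0 / math.sqrt(r_gas * t0) * flow_func

# Example: 6 MPa, 550 K steam upstream, 4 MPa downstream, 0.01 m^2 flow area (all assumed)
print(f"{mass_flow(6.0e6, 550.0, 4.0e6, 0.01):.1f} kg/s")
```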

  8. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology

    Directory of Open Access Journals (Sweden)

    Victoria M. Saadat

    2016-01-01

    Full Text Available Background. The dissolution of the USSR resulted in independence for constituent republics but left them battling an unstable economic environment and healthcare system. Increases in injection drug use, prostitution, and migration were all widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of the former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. Methods. In this systematic review of the literature, PubMed was searched for English-language studies using the key search terms “HIV”, “AIDS”, “human immunodeficiency virus”, “acquired immune deficiency syndrome”, “Central Asia”, “Kazakhstan”, “Kyrgyzstan”, “Uzbekistan”, “Tajikistan”, “Turkmenistan”, “Russia”, “Ukraine”, “Armenia”, “Azerbaijan”, and “Georgia”. Studies were evaluated against eligibility criteria for inclusion. Results. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as “risk” and “barriers”. Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included the inconvenience of testing and concerns that results would not remain confidential. Frequent barriers to treatment were based on distrust of the treatment system. Conclusion. The findings of this review reveal methodological limitations in the existing literature.

  9. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  10. Effects of strain rate, test temperature and test environment on tensile properties of vanadium alloys

    International Nuclear Information System (INIS)

    Gubbi, A.N.; Rowcliffe, A.F.; Eatherly, W.S.; Gibson, L.T.

    1996-01-01

    Tensile testing was carried out on SS-3 tensile specimens punched from 0.762-mm-thick sheets of the large heat of V-4Cr-4Ti and small heats of V-3Cr-3Ti and V-6Cr-6Ti. The tensile specimens were annealed at 1000°C for 2 h to obtain a fully recrystallized, fine grain microstructure with a grain size in the range of 10-19 μm. Room temperature tests at strain rates ranging from 10⁻³ to 5 x 10⁻¹/s were carried out in air; elevated temperature testing up to 700°C was conducted in a vacuum better than 1 x 10⁻⁵ torr (<10⁻³ Pa). To study the effect of atomic hydrogen on ductility, tensile tests were conducted at room temperature in an ultra high vacuum (UHV) chamber with a hydrogen leak system.

  11. Reporting and Methodology of Multivariable Analyses in Prognostic Observational Studies Published in 4 Anesthesiology Journals: A Methodological Descriptive Review.

    Science.gov (United States)

    Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric

    2015-10-01

    Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and clear reporting to assess its reliability. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analysis used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and the STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items covering the aforementioned points. Second, studies were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). Six items had a low reporting rate. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.

  12. Effects of Classroom Ventilation Rate and Temperature on Students' Test Scores.

    Science.gov (United States)

    Haverinen-Shaughnessy, Ulla; Shaughnessy, Richard J

    2015-01-01

    Using a multilevel approach, we estimated the effects of classroom ventilation rate and temperature on academic achievement. The analysis is based on measurement data from 70 elementary schools in one school district (140 fifth grade classrooms) in the Southwestern United States, and student level data (N = 3109) on socioeconomic variables and standardized test scores. There was a statistically significant association between ventilation rates and mathematics scores, and it was stronger when the six classrooms with high ventilation rates that were indicated as outliers were filtered out (> 7.1 l/s per person). The association remained significant when prior year test scores were included in the model, resulting in less unexplained variability. Students' mean mathematics scores (average 2286 points) were increased by up to eleven points (0.5%) per each liter per second per person increase in ventilation rate within the range of 0.9-7.1 l/s per person (estimated effect size 74 points). There was an additional increase of 12-13 points per each 1°C decrease in temperature within the observed range of 20-25°C (estimated effect size 67 points). Effects of similar magnitude but higher variability were observed for reading and science scores. In conclusion, maintaining adequate ventilation and thermal comfort in classrooms could significantly improve academic achievement of students.

  13. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    Full Text Available The development of new aviation equipment and the modernization of existing aviation equipment of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the challenges in developing automated control systems for aviation equipment test operations are presented. Such automated control systems are, in essence, automated data banks. The key role of the development of a flight test automated control system in the creation of automated control systems for aviation equipment test operations is substantiated. The approach to integrating mobile modular measuring complexes and the need for national methodologies and technological standards for database system design concepts are grounded. The database system, as the central element in this scheme, provides collection, storage and updating of the values of the elements described above at the pace and frequency required for monitoring the state of the controlled object. It is the database system that provides the supervisory unit with actual data corresponding to specific moments in time concerning the state processes and assessments of the progress and results of flight experiments, creating the necessary environment for managing and testing aviation equipment as a whole. The basis for the development of subsystems of automated control systems for aviation equipment test operations is the conceptual design process of the respective database system, the implementation effectiveness of which largely determines the level of success and the capabilities of the systems being created. The conclusions and suggestions introduced can be used in the design of such systems.

  14. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • The uncertainty of physical models is a key issue in best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate the uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences of their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed while the thermal-hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken within the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.

  15. Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.

    Science.gov (United States)

    Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe

    2012-01-01

    Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called Super Learner has been introduced either to compare different methods/learners (discrete Super Learner) or to combine them in a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an on-going failing regimen. Our aim was also to investigate the impact of using different cross-validation strategies and different loss functions. Four different splits between training and validation sets were tested through two loss functions. Six statistical methods were compared. We assess performance by evaluating R² values and accuracy by calculating the rates of patients being correctly classified. Results. Our results indicated that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided similar results to those of this new predictor. Slight discrepancies arise between the two loss functions investigated, and also between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model provided around 80% of patients correctly classified. The difference between the lower and higher rates is around 10 percent. The number of mutations retained in the different learners also varies from one to 41. Conclusions. The more recent Super Learner methodology combining the predictions of many learners provided good performance on our small dataset.
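
    The weighted-combination idea behind the Super Learner can be sketched as follows; the data are synthetic, the candidate learners are arbitrary choices, and this is not the analysis performed on the Jaguar trial.

```python
# Hedged sketch of the Super Learner idea: obtain cross-validated predictions
# from several candidate learners, then find non-negative weights that best
# combine them. Synthetic data; illustrative only.
import numpy as np
from scipy.optimize import nnls
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=(120, 30)).astype(float)          # e.g. presence/absence of mutations
y = X[:, :5].sum(axis=1) * 0.4 + rng.normal(0, 0.3, 120)      # virological response, synthetic

learners = {
    "linear": LinearRegression(),
    "forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "knn": KNeighborsRegressor(n_neighbors=7),
}

# Level-one matrix: 10-fold cross-validated predictions of each learner
Z = np.column_stack([cross_val_predict(m, X, y, cv=10) for m in learners.values()])

# Super Learner weights: non-negative least squares, normalised to sum to one
w, _ = nnls(Z, y)
w = w / w.sum()
for name, weight in zip(learners, w):
    print(f"{name:>7}: weight {weight:.2f}")
```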

  16. TEST OF THE CATCH-UP HYPOTHESIS IN AFRICAN AGRICULTURAL GROWTH RATES

    Directory of Open Access Journals (Sweden)

    Kalu Ukpai IFEGWU

    2015-11-01

    Full Text Available The paper tested the catch-up hypothesis in the agricultural growth rates of twenty-six African countries. The panel data used were drawn from the Food and Agricultural Organization Statistics (FAOSTAT) of the United Nations. The Data Envelopment Analysis method for measuring productivity was used to estimate productivity growth rates. The cross-section framework consisting of sigma-convergence and beta-convergence was employed to test the catching-up process, as sketched below. Catching up is said to exist if the value of beta is negative and significant. Since catching up does not necessarily imply a narrowing of national productivity inequalities, sigma-convergence, which measures inequality, was estimated for the same variables. The results showed evidence of the catch-up process, but failed to find a narrowing of productivity inequalities among countries.
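
    A sketch of the two convergence tests on synthetic data is given below; beta-convergence is the regression of growth on the log of the initial level, and sigma-convergence asks whether cross-country dispersion shrinks over time. None of the numbers relate to the paper's DEA-based estimates.

```python
# Hedged sketch of beta- and sigma-convergence tests on synthetic panel data.
import numpy as np

rng = np.random.default_rng(4)
n_countries, n_years = 26, 20
initial = rng.normal(4.0, 0.5, n_countries)                           # ln(initial productivity)
growth = 0.08 - 0.015 * initial + rng.normal(0, 0.01, n_countries)    # average annual growth

# Beta-convergence: regress growth on ln(initial level); catch-up if beta < 0
X = np.column_stack([np.ones(n_countries), initial])
beta = np.linalg.lstsq(X, growth, rcond=None)[0]
print(f"beta coefficient on initial level: {beta[1]:.4f} (negative suggests catch-up)")

# Sigma-convergence: does cross-country dispersion of ln(productivity) shrink over time?
levels = initial[:, None] + np.cumsum(np.tile(growth[:, None], (1, n_years)), axis=1)
dispersion = levels.std(axis=0)
print(f"dispersion year 1: {dispersion[0]:.3f}, year {n_years}: {dispersion[-1]:.3f}")
```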

  17. Methodology to evaluate the crack growth rate by stress corrosion cracking in dissimilar metals weld in simulated environment of PWR nuclear reactor

    International Nuclear Information System (INIS)

    Paula, Raphael G.; Figueiredo, Celia A.; Rabelo, Emerson G.

    2013-01-01

    Inconel alloy weld metal is widely used to join dissimilar metals in nuclear reactor applications. Failures of welded components have recently been observed in plants, which has triggered an international effort to determine reliable data on the stress corrosion cracking behavior of this material in the reactor environment. The objective of this work is to develop a methodology to determine the crack growth rate caused by stress corrosion in Inconel alloy 182, using compact tensile (CT) specimens in a simulated PWR environment. (author)

  18. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs. 26 references
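
    A spreadsheet-style sketch of such a screening calculation is given below; the structure (concentration × occupancy × exposure years × risk coefficient) is generic, and the risk coefficient and all inputs are illustrative assumptions rather than values from the EPA methodology.

```python
# Hedged, spreadsheet-style sketch of a screening estimate of excess lifetime
# lung cancer risk from indoor radon. All numbers, including the risk
# coefficient, are illustrative assumptions, not values from the methodology.
def excess_lifetime_risk(radon_bq_m3, occupancy_fraction, exposure_years,
                         risk_per_bq_m3_year=2e-6):   # illustrative coefficient only
    """Very rough screening estimate of excess lifetime lung cancer risk."""
    effective_exposure = radon_bq_m3 * occupancy_fraction * exposure_years
    return effective_exposure * risk_per_bq_m3_year

# Example: 150 Bq/m3 indoors, 70% occupancy, 30 years of exposure (all assumed)
risk = excess_lifetime_risk(150.0, 0.70, 30.0)
print(f"screening estimate of excess lifetime risk: {risk:.3%}")
```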

  19. [Regional differences of ADHD diagnosis rates in health insurance data from 2005 to 2015 : Methodological considerations and results].

    Science.gov (United States)

    Grobe, Thomas G

    2017-12-01

    Attention deficit hyperactivity disorders (ADHD) are among the most common mental disorders in children and adolescents. For a number of years there has been evidence of regional differences in Germany. This article provides current results on the frequency of diagnosis and treatment and also discusses methodological aspects. The analysis is based on routine data of a statutory health insurance company including annual diagnoses and drug prescriptions from 2005 to 2015 of at least 1.34 million children and adolescents between 0 and 19 years of age. Small-area results of ADHD diagnosis rates and methylphenidate prescriptions are presented with a standardized differentiation according to 413 districts pursuant to territorial status from the end of 2008. From 2005 to 2014, ADHD diagnoses were documented for an increasing proportion of 0 to 19-year-olds in Germany. In 2015 the proportion was 4.2%; boys aged 10 were affected most frequently with a proportion of 11.1%. Regional diagnosis rates vary considerably. Two counties showed diagnosis and prescription rates that were more than twice as high as regionally expected for all years in question; other districts showed rates that were continually lower than expected by at least a third. Analyses on the level of administratively defined districts have some advantages but alternative regional structuring would be desirable due to very heterogeneous population figures. Regarding ADHD diagnoses and documented methylphenidate prescriptions on an outpatient basis, significant regional differences in Germany were detected, for which plausible medical justifications do not yet exist. Specialist discussions seem urgently needed.

  20. Combined methodology for estimating dose rates and health effects from exposure to radioactive pollutants

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Leggett, R.W.; Yalcintas, M.G.

    1980-12-01

    The work described in the report is basically a synthesis of two previously existing computer codes: INREM II, developed at the Oak Ridge National Laboratory (ORNL); and CAIRD, developed by the Environmental Protection Agency (EPA). The INREM II code uses contemporary dosimetric methods to estimate doses to specified reference organs due to inhalation or ingestion of a radionuclide. The CAIRD code employs actuarial life tables to account for competing risks in estimating numbers of health effects resulting from exposure of a cohort to some incremental risk. The combined computer code, referred to as RADRISK, estimates numbers of health effects in a hypothetical cohort of 100,000 persons due to continuous lifetime inhalation or ingestion of a radionuclide. Also briefly discussed in this report is a method of estimating numbers of health effects in a hypothetical cohort due to continuous lifetime exposure to external radiation. This method employs the CAIRD methodology together with dose conversion factors generated by the computer code DOSFACTER, developed at ORNL; these dose conversion factors are used to estimate dose rates to persons due to radionuclides in the air or on the ground surface. The combination of the life table and dosimetric methodologies provides a basis for developing guidelines for the release of radioactive pollutants to the atmosphere, as required by the Clean Air Act Amendments of 1977.
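
    A much-simplified sketch of the life-table, competing-risk bookkeeping is shown below; every number is illustrative, and the real codes use age-specific vital statistics and dose-dependent risk models.

```python
# Hedged, much-simplified sketch of the life-table logic behind codes such as
# CAIRD/RADRISK: an added (radiation-attributable) hazard acts on a cohort that
# is simultaneously depleted by all competing causes of death.
def excess_deaths(cohort, baseline_hazard, excess_hazard, years=100):
    """Deaths attributable to `excess_hazard` in a closed cohort followed for `years`."""
    alive, attributable = float(cohort), 0.0
    for _ in range(years):
        attributable += alive * excess_hazard                 # caused by the added risk this year
        alive -= alive * (baseline_hazard + excess_hazard)    # competing risks deplete the cohort
    return attributable

# Illustrative hazards: 0.012/yr all-cause mortality, 1e-5/yr radiation-attributable
print(f"excess deaths per 100,000: {excess_deaths(100_000, 0.012, 1.0e-5):.0f}")
```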

  1. TESTING TESTS ON ACTIVE GALACTIC NUCLEI MICROVARIABILITY

    International Nuclear Information System (INIS)

    De Diego, Jose A.

    2010-01-01

    Literature on optical and infrared microvariability in active galactic nuclei (AGNs) reflects a diversity of statistical tests and strategies to detect tiny variations in the light curves of these sources. Comparison between the results obtained using different methodologies is difficult, and the pros and cons of each statistical method are often badly understood or even ignored. Even worse, improperly tested methodologies are becoming more and more common, and biased results may be misleading with regard to the origin of the AGN microvariability. This paper intends to point future research on AGN microvariability toward the use of powerful and well-tested statistical methodologies, providing a reference for choosing the best strategy to obtain unbiased results. Light-curve monitoring has been simulated for quasars and for reference and comparison stars. Changes for the quasar light curves include both Gaussian fluctuations and linear variations. Simulated light curves have been analyzed using χ² tests, F tests for variances, one-way analyses of variance and C-statistics. Statistical Type I and Type II errors, which indicate the robustness and the power of the tests, have been obtained in each case. One-way analyses of variance and χ² tests prove to be powerful and robust estimators for microvariations, while the C-statistic is not a reliable methodology and its use should be avoided.
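
    The kind of Monte Carlo calibration described can be sketched as below: simulate non-variable light curves and count how often each test raises a false alarm. The observing setup is an assumption, and this is not the paper's exact simulation.

```python
# Hedged sketch: estimate the Type I error of a one-way ANOVA and an F-test for
# variances on simulated non-variable light curves (pure measurement noise).
import numpy as np
from scipy.stats import f_oneway, f as f_dist

rng = np.random.default_rng(5)
n_sim, n_points, n_per_group, sigma = 2000, 60, 3, 0.01   # assumed observing setup
alpha = 0.01

false_anova = false_ftest = 0
for _ in range(n_sim):
    quasar = rng.normal(0.0, sigma, (n_points // n_per_group, n_per_group))
    star = rng.normal(0.0, sigma, n_points)

    # One-way ANOVA across groups of consecutive quasar exposures
    if f_oneway(*quasar).pvalue < alpha:
        false_anova += 1

    # Two-sided F-test comparing quasar and comparison-star variances
    f_stat = quasar.ravel().var(ddof=1) / star.var(ddof=1)
    p = 2 * min(f_dist.sf(f_stat, quasar.size - 1, star.size - 1),
                f_dist.cdf(f_stat, quasar.size - 1, star.size - 1))
    if p < alpha:
        false_ftest += 1

print(f"empirical Type I error: ANOVA {false_anova / n_sim:.3f}, F-test {false_ftest / n_sim:.3f}")
```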

  2. Flammability Assessment Methodology Program Phase I: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    C. A. Loehr; S. M. Djordjevic; K. J. Liekhus; M. J. Connolly

    1997-09-01

    The Flammability Assessment Methodology Program (FAMP) was established to investigate the flammability of gas mixtures found in transuranic (TRU) waste containers. The FAMP results provide a basis for increasing the permissible concentrations of flammable volatile organic compounds (VOCs) in TRU waste containers. The FAMP results will be used to modify the ''Safety Analysis Report for the TRUPACT-II Shipping Package'' (TRUPACT-II SARP) upon acceptance of the methodology by the Nuclear Regulatory Commission. Implementation of the methodology would substantially increase the number of drums that can be shipped to the Waste Isolation Pilot Plant (WIPP) without repackaging or treatment. Central to the program was experimental testing and modeling to predict the gas mixture lower explosive limit (MLEL) of gases observed in TRU waste containers. The experimental data supported selection of an MLEL model that was used in constructing screening limits for flammable VOC and flammable gas concentrations. The MLEL values predicted by the model for individual drums will be utilized to assess flammability for drums that do not meet the screening criteria. Finally, the predicted MLEL values will be used to derive acceptable gas generation rates, decay heat limits, and aspiration time requirements for drums that do not pass the screening limits. The results of the program demonstrate that an increased number of waste containers can be shipped to WIPP within the flammability safety envelope established in the TRUPACT-II SARP.
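
    For orientation, the classic Le Chatelier mixing rule for a lower explosive limit is sketched below; it is a textbook approximation offered for context, not the MLEL model selected in the FAMP work, and the composition is assumed.

```python
# Hedged sketch of the Le Chatelier rule for the lower explosive limit of a
# fuel mixture: LEL_mix = 100 / sum(y_i / LEL_i), with y_i the volume
# percentages of the flammable components normalised to 100%.
def le_chatelier_mlel(components):
    """components: {name: (vol% among flammables, LEL in vol%)} -> mixture LEL in vol%."""
    total = sum(frac for frac, _ in components.values())
    return 100.0 / sum((100.0 * frac / total) / lel for frac, lel in components.values())

# Illustrative flammable fraction of a drum headspace gas (values assumed)
flammables = {
    "hydrogen": (70.0, 4.0),     # vol% among flammables, LEL vol%
    "methane": (20.0, 5.0),
    "toluene": (10.0, 1.1),
}
print(f"mixture LEL is about {le_chatelier_mlel(flammables):.2f} vol%")
```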

  3. Effects of strain rate, test temperature and test environment on tensile properties of vanadium alloys

    Energy Technology Data Exchange (ETDEWEB)

    Gubbi, A.N.; Rowcliffe, A.F.; Eatherly, W.S.; Gibson, L.T. [Oak Ridge National Lab., TN (United States)

    1996-10-01

    Tensile testing was carried out on SS-3 tensile specimens punched from 0.762-mm-thick sheets of the large heat of V-4Cr-4Ti and small heats of V-3Cr-3Ti and V-6Cr-6Ti. The tensile specimens were annealed at 1000°C for 2 h to obtain a fully recrystallized, fine grain microstructure with a grain size in the range of 10-19 μm. Room temperature tests at strain rates ranging from 10⁻³ to 5 x 10⁻¹/s were carried out in air; elevated temperature testing up to 700°C was conducted in a vacuum better than 1 x 10⁻⁵ torr (<10⁻³ Pa). To study the effect of atomic hydrogen on ductility, tensile tests were conducted at room temperature in an ultra high vacuum (UHV) chamber with a hydrogen leak system.

  4. Results of a pilot scale melter test to attain higher production rates

    International Nuclear Information System (INIS)

    Elliott, M.L.; Perez, J.M. Jr.; Chapman, C.C.

    1991-01-01

    A pilot-scale melter test was completed as part of the effort to enhance glass production rates. The experiment was designed to evaluate the effects of bulk glass temperature and feed oxide loading. The maximum glass production rate obtained, 86 kg/hr-m², was over 200% better than the previous record for the melter used.

  5. Combining rigour with relevance: a novel methodology for testing Chinese herbal medicine.

    Science.gov (United States)

    Flower, Andrew; Lewith, George; Little, Paul

    2011-03-24

    There is a need to develop an evidence base for Chinese herbal medicine (CHM) that is both rigorous and reflective of good practice. This paper proposes a novel methodology to test individualised herbal decoctions using a randomised, double blinded, placebo controlled clinical trial. A feasibility study was conducted to explore the role of CHM in the treatment of endometriosis. Herbal formulae were pre-cooked and dispensed as individual doses in sealed plastic sachets. This permitted the development and testing of a plausible placebo decoction. Participants were randomised at a distant pharmacy to receive either an individualised herbal prescription or a placebo. The trial met the predetermined criteria for good practice. Neither the participants nor the practitioner-researcher could reliably identify group allocation. Of the 28 women who completed the trial, in the placebo group (n=15) 3 women (20%) correctly guessed they were on placebo, 8 (53%) thought they were on herbs and 4 (27%) did not know which group they had been allocated to. In the active group (n=13) 2 (15%) thought they were on placebo, 8 (62%) thought they were on herbs and 3 (23%) did not know. Randomisation, double blinding and allocation concealment were successful and the study model appeared to be feasible and effective. It is now possible to subject CHM to rigorous scientific scrutiny without compromising model validity. Improvement in the design of the placebo using food colourings and flavourings instead of dried food will help guarantee the therapeutic inertia of the placebo decoction. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  6. New method of analyzing well tests in fractured wells using sandface pressure and rate data

    Energy Technology Data Exchange (ETDEWEB)

    Osman, M.; Almehaideb, R.; Abou-Kassem, J. [U.A.E. University, Al-Ain (United Arab Emirates)

    1998-05-01

    Analysis of variable flow rate tests has been of special interest recently because in many cases it is impractical to keep a flow rate constant long enough to perform a drawdown test. Further, in many other drawdown and buildup tests, the early data were influenced by wellbore storage effects, and the duration of these effects can be quite long for low-permeability reservoirs. This paper presents a mathematical model which describes drawdown and buildup tests in hydraulically fractured wells. This new method uses a specialized-plot approach to analyze the linear flow data and combines it with superposition of the constant-rate solution for the analysis of pseudoradial flow data. It does not require prior knowledge of the fracture type (uniform-flux or infinite-conductivity); in fact it predicts the fracture type. This method is useful for the analysis of simultaneously measured downhole pressure and sandface rate data. 12 refs., 11 figs., 3 tabs.
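
    As a simple illustration of the specialized-plot idea for the linear (fracture) flow regime, the sketch below fits Δp against the square root of time on synthetic data; the paper's full method additionally uses superposition with the measured sandface rates.

```python
# Hedged illustration of a "specialized plot" for linear (fracture) flow:
# during this regime the pressure drop grows roughly with the square root of
# time, so a straight-line fit of delta-p against sqrt(t) yields the
# linear-flow slope used in fractured-well analysis. Synthetic data only.
import numpy as np

t_hours = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0, 9.0])
delta_p = np.array([21.0, 29.5, 42.1, 59.4, 84.3, 103.0, 126.5])   # psi, synthetic

slope, intercept = np.polyfit(np.sqrt(t_hours), delta_p, 1)
print(f"linear-flow slope m_lf = {slope:.1f} psi/hr^0.5 (intercept {intercept:.1f} psi)")
```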

  7. Nueva metodología para probar el sistema nervioso autónomo en individuos hipertensos (A new methodology for testing autonomic nervous system activity in hypertensive individuals)

    Directory of Open Access Journals (Sweden)

    Daniel A. Botero-Rosas

    2010-12-01

    genesis and the development of systemic arterial hypertension. Because of that, the aim was to study the autonomic nervous system in this pathology through a new methodology based on the heart rate. Methodology: 45 subjects were selected (12 hypertensive and 31 healthy) to record the arterial pressure and heart rate beat by beat (2.5 min at rest and 2.5 min after orthostatism). Spurious values were removed from the time series by interpolation because of the lack of heart rate periodicity. Then, resampling at 10 Hz was performed and a filter that respects the heart rate was applied. Spectral analysis of the time series was carried out, followed by estimation of the median and quartiles. Finally, a hypothesis test using the Wilcoxon rank sum test was performed to check for statistical differences between groups. Results: The medians of the percentage power in the high frequencies, pre- and post-manoeuvre, in healthy individuals were slightly lower, but without statistical significance, when compared with hypertensive individuals (healthy: 42.69 and 32.39; hypertensive: 46.91 and 33.99). On the other hand, the same estimator for the low frequencies was slightly higher in healthy individuals (healthy: 57.30 and 67.60; hypertensive: 53.09 and 66). Additionally, the pre/post difference in the autonomic response was not significant in hypertensive individuals (p>0.01), whereas it was in healthy individuals. Conclusions: The methodology has demonstrated potential to identify autonomic dysfunction in hypertensive individuals. It also confirms a lower sympathetic activation in hypertensive individuals when the orthostatic maneuver is done. Salud UIS 2010; 42: 240-247
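
    The processing chain described (interpolation of the beat-to-beat series, resampling at 10 Hz, spectral analysis, band powers) can be sketched as below on synthetic data; the band limits used here are the conventional HRV ones and may differ from the study's.

```python
# Hedged sketch of a heart-rate-variability spectral analysis: build an
# RR-interval series, interpolate it onto a uniform 10 Hz grid, estimate the
# power spectrum, and integrate the low- and high-frequency bands.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

rng = np.random.default_rng(6)
n_beats = 180
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(n_beats)) + rng.normal(0, 0.01, n_beats)
beat_times = np.cumsum(rr)                      # seconds at which each beat occurs

fs = 10.0                                       # resampling frequency used in the study
t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
rr_uniform = interp1d(beat_times, rr, kind="cubic")(t_uniform)

freqs, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)
lf_band = (freqs >= 0.04) & (freqs < 0.15)      # conventional LF band (assumed)
hf_band = (freqs >= 0.15) & (freqs < 0.40)      # conventional HF band (assumed)
lf = np.trapz(psd[lf_band], freqs[lf_band])
hf = np.trapz(psd[hf_band], freqs[hf_band])
print(f"LF% = {100 * lf / (lf + hf):.1f}, HF% = {100 * hf / (lf + hf):.1f}")
```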

  8. Research Methodology in Recurrent Pregnancy Loss

    DEFF Research Database (Denmark)

    Christiansen, Ole B

    2014-01-01

    The aim of this article is to highlight pitfalls in research methodology that may explain why studies in recurrent pregnancy loss (RPL) often provide very divergent results. It is hoped that insight into this issue may help clinicians decide which published studies are the most valid. It may also help researchers to eliminate methodological flaws in future studies, which may hopefully lead to some kind of agreement about the usefulness of diagnostic tests and treatments in RPL.

  9. Mobile Usability Testing in Healthcare: Methodological Approaches.

    Science.gov (United States)

    Borycki, Elizabeth M; Monkman, Helen; Griffith, Janessa; Kushniruk, Andre W

    2015-01-01

    The use of mobile devices and healthcare applications is increasing exponentially worldwide. This has led to the need for the healthcare industry to develop a better understanding of the impact of the usability of mobile software and hardware upon consumer and health professional adoption and use of these technologies. There are many methodological approaches that can be employed in conducting usability evaluation of mobile technologies. More obtrusive approaches to collecting study data may lead to changes in study participant behaviour, producing study results that are less consistent with how the technologies will be used in the real world. Alternatively, less obtrusive methods used in evaluating the usability of mobile software and hardware in in-situ and laboratory settings can lead to less detailed information being collected about how an individual interacts with both the software and hardware. In this paper we review and discuss several innovative mobile usability evaluation methods on a continuum from least to most obtrusive and their effects on the quality of the usability data collected. The strengths and limitations of these methods are also discussed.

  10. Interventions to Improve Rate of Diabetes Testing Postpartum in Women With Gestational Diabetes Mellitus.

    Science.gov (United States)

    Hamel, Maureen S; Werner, Erika F

    2017-02-01

    Gestational diabetes mellitus (GDM) is one of the most common medical complications of pregnancy. In the USA, four million women are screened annually for GDM in pregnancy in part to improve pregnancy outcomes but also because diagnosis predicts a high risk of future type 2 diabetes mellitus (T2DM). Therefore, among women with GDM, postpartum care should be focused on T2DM prevention. This review describes the current literature aimed to increase postpartum diabetes testing among women with GDM. Data suggest that proactive patient contact via a health educator, a phone call, or even postal mail is associated with higher rates of postpartum diabetes testing. There may also be utility to changing the timing of postpartum diabetes testing. Despite the widespread knowledge regarding the importance of postpartum testing for women with GDM, testing rates remain low. Alternative testing strategies and large randomized trials addressing postpartum testing are warranted.

  11. Effects of Classroom Ventilation Rate and Temperature on Students' Test Scores.

    Directory of Open Access Journals (Sweden)

    Ulla Haverinen-Shaughnessy

    Full Text Available Using a multilevel approach, we estimated the effects of classroom ventilation rate and temperature on academic achievement. The analysis is based on measurement data from a 70 elementary school district (140 fifth grade classrooms) from Southwestern United States, and student level data (N = 3109) on socioeconomic variables and standardized test scores. There was a statistically significant association between ventilation rates and mathematics scores, and it was stronger when the six classrooms with high ventilation rates that were indicated as outliers were filtered (> 7.1 l/s per person). The association remained significant when prior year test scores were included in the model, resulting in less unexplained variability. Students' mean mathematics scores (average 2286 points) were increased by up to eleven points (0.5%) per each liter per second per person increase in ventilation rate within the range of 0.9-7.1 l/s per person (estimated effect size 74 points). There was an additional increase of 12-13 points per each 1°C decrease in temperature within the observed range of 20-25°C (estimated effect size 67 points). Effects of similar magnitude but higher variability were observed for reading and science scores. In conclusion, maintaining adequate ventilation and thermal comfort in classrooms could significantly improve academic achievement of students.
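
    A random-intercept multilevel model of the kind described (students nested within classrooms, scores regressed on ventilation rate and temperature with prior scores and socioeconomic covariates) might be sketched as follows; the file name and column names are hypothetical, and the model is a simplified stand-in for the published analysis.

        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("classroom_iaq.csv")   # hypothetical: one row per student

        model = smf.mixedlm(
            "math_score ~ ventilation_lps_person + temperature_c + prior_score + ses",
            data=df,
            groups=df["classroom_id"],           # students nested within classrooms
        )
        print(model.fit().summary())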

  12. Effects of Classroom Ventilation Rate and Temperature on Students’ Test Scores

    Science.gov (United States)

    2015-01-01

    Using a multilevel approach, we estimated the effects of classroom ventilation rate and temperature on academic achievement. The analysis is based on measurement data from a 70 elementary school district (140 fifth grade classrooms) from Southwestern United States, and student level data (N = 3109) on socioeconomic variables and standardized test scores. There was a statistically significant association between ventilation rates and mathematics scores, and it was stronger when the six classrooms with high ventilation rates that were indicated as outliers were filtered (> 7.1 l/s per person). The association remained significant when prior year test scores were included in the model, resulting in less unexplained variability. Students’ mean mathematics scores (average 2286 points) were increased by up to eleven points (0.5%) per each liter per second per person increase in ventilation rate within the range of 0.9–7.1 l/s per person (estimated effect size 74 points). There was an additional increase of 12–13 points per each 1°C decrease in temperature within the observed range of 20–25°C (estimated effect size 67 points). Effects of similar magnitude but higher variability were observed for reading and science scores. In conclusion, maintaining adequate ventilation and thermal comfort in classrooms could significantly improve academic achievement of students. PMID:26317643

  13. Methods of analysis speech rate: a pilot study.

    Science.gov (United States)

    Costa, Luanna Maria Oliveira; Martins-Reis, Vanessa de Oliveira; Celeste, Letícia Côrrea

    2016-01-01

    To describe the performance of fluent adults in different measures of speech rate. The study included 24 fluent adults, of both genders, speakers of Brazilian Portuguese, who were born and still living in the metropolitan region of Belo Horizonte, state of Minas Gerais, aged between 18 and 59 years. Participants were grouped by age: G1 (18-29 years), G2 (30-39 years), G3 (40-49 years), and G4 (50-59 years). The speech samples were obtained following the methodology of the Speech Fluency Assessment Protocol. In addition to the measures of speech rate proposed by the protocol (speech rate in words and syllables per minute), the speech rate in phonemes per second and the articulation rate with and without disfluencies were calculated. We used the nonparametric Friedman test and the Wilcoxon test for multiple comparisons. Groups were compared using the nonparametric Kruskal-Wallis test. The significance level was 5%. There were significant differences between the measures of speech rate involving syllables. The multiple comparisons showed that all three measures were different. There was no effect of age on the studied measures. These findings corroborate previous studies. The inclusion of temporal acoustic measures such as speech rate in phonemes per second and articulation rates with and without disfluencies can be a complementary approach in the evaluation of speech rate.
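
    The nonparametric tests named above are available in SciPy; the sketch below only illustrates the calls, using random placeholder data rather than the study's measurements.

        import numpy as np
        from scipy.stats import friedmanchisquare, wilcoxon, kruskal

        rng = np.random.default_rng(0)
        # Placeholder data: three repeated speech-rate measures on the same 24 speakers.
        words_per_min = rng.normal(120, 15, 24)
        syllables_per_min = rng.normal(230, 25, 24)
        phonemes_per_sec = rng.normal(10, 1.5, 24)

        print(friedmanchisquare(words_per_min, syllables_per_min, phonemes_per_sec))
        print(wilcoxon(words_per_min, syllables_per_min))            # pairwise follow-up
        groups = np.array_split(rng.permutation(words_per_min), 4)   # mock age groups G1-G4
        print(kruskal(*groups))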

  14. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
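
    The full TORMIS simulation chains many models, but the core Monte Carlo estimation step can be illustrated with a deliberately simplified sketch; every distribution, dimension and threshold below is invented for illustration and is not part of the TORMIS methodology.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Sampled missile landing points relative to a 20 m x 40 m target structure.
        x0 = rng.uniform(-500.0, 500.0, n)        # m, initial x offset of the missile
        y0 = rng.uniform(-500.0, 500.0, n)        # m, initial y offset of the missile
        heading = rng.uniform(0.0, 2.0 * np.pi, n)
        travel = rng.gamma(2.0, 100.0, n)         # m, sampled transport distance

        x = x0 + travel * np.cos(heading)
        y = y0 + travel * np.sin(heading)
        hit = (np.abs(x) <= 10.0) & (np.abs(y) <= 20.0)

        p_hit = hit.mean()
        se = hit.std(ddof=1) / np.sqrt(n)
        print(f"estimated impact probability: {p_hit:.5f} +/- {se:.5f}")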

  15. A generic semi-implicit coupling methodology for use in RELAP5-3D©

    International Nuclear Information System (INIS)

    Aumiller, D.L.; Tomlinson, E.T.; Weaver, W.L.

    2000-01-01

    A generic semi-implicit coupling methodology has been developed and implemented in the RELAP5-3D© computer program. This methodology allows RELAP5-3D© to be used with other computer programs to perform integrated analyses of nuclear power reactor systems and related experimental facilities. The coupling methodology potentially allows different programs to be used to model different portions of the system. The programs are chosen based on their capability to model the phenomena that are important in the simulation in the various portions of the system being considered. The methodology was demonstrated using a test case in which the test geometry was divided into two parts, each of which was solved as a RELAP5-3D© simulation. This test problem exercised all of the semi-implicit coupling features which were installed in RELAP5-3D©. The results of this verification test case show that the semi-implicit coupling methodology produces the same answer as the simulation of the test system as a single process

  16. Development of Thermal-hydraulic Analysis Methodology for Multi-module Breeding Blankets in K-DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo; Lee, Jeong-Hun; Park, Goon-Cherl; Cho, Hyoung-Kyu [Seoul National University, Seoul (Korea, Republic of); Im, Kihak [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, the purpose of the analyses is to extend the capability of MARS-KS to the entire blanket system, which includes a few hundred individual blanket modules. Afterwards, the plan for the whole blanket system analysis using MARS-KS is introduced and the result of the multiple blanket module analysis is summarized. A thermal-hydraulic analysis code for nuclear reactor safety, MARS-KS, was applied to the conceptual design of the K-DEMO breeding blanket thermal analysis. Then, a methodology to simulate multiple blanket modules was proposed, which uses a supervisor program to handle each blanket module individually at first and then distribute the flow rate considering the pressure drop arising in each module. For a feasibility test of the proposed methodology, 10 outboard blankets in a toroidal field sector were simulated, connected with each other through the inlet and outlet common headers. The calculation results for flow rates, pressure drops, and temperatures showed the validity of the calculation, and thanks to the parallelization using MPI, almost linear speed-up could be obtained.
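
    The supervisor's flow-distribution step can be illustrated with a simplified stand-in that assumes each module obeys a quadratic loss law dP_i = k_i * m_i**2 and that the split must give every parallel module the same pressure drop; the coefficients are invented and this is not the MARS-KS implementation.

        import numpy as np

        def split_flow(m_total, k):
            """Per-module flows m_i giving equal pressure drop for dP_i = k_i * m_i**2."""
            k = np.asarray(k, dtype=float)
            w = 1.0 / np.sqrt(k)                  # m_i proportional to 1 / sqrt(k_i)
            m = m_total * w / w.sum()
            return m, (k * m**2)[0]               # the common pressure drop

        flows, dp = split_flow(m_total=50.0, k=[1.2e3, 1.0e3, 1.4e3, 0.9e3])
        print(flows, dp)

    In practice such a supervisor would iterate, since the loss coefficients themselves depend on the thermal-hydraulic solution of each module.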

  17. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurement. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for NRC to fully develop the necessary methodology

  18. 77 FR 8178 - Test Procedures for Central Air Conditioners and Heat Pumps: Public Meeting

    Science.gov (United States)

    2012-02-14

    .... EERE-2010-BT-TP-0038] Test Procedures for Central Air Conditioners and Heat Pumps: Public Meeting... methodologies and gather comments on testing residential central air conditioners and heat pumps designed to use... residential central air conditioners and heat pumps that are single phase with rated cooling capacities less...

  19. Results of testing the Grambow rate law for use in HWVP glass durability correlations

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Bunnell, L.R.

    1996-03-01

    A theory based on Grambow's work on hydration of glass as a linear function of solution composition was evaluated. Use of Grambow's linear rate law for correlation of durability with glass composition is not recommended. The dissolution rate of the glass was determined using the rate of release of sodium, measured with an ion-selective electrode. This method was tested by first applying it to the initial dissolution rate of several glasses at several temperatures with zero initial concentration of silicic acid. HW39-2, HW39-4, and SRL-202 from Savannah River were tested; there was significant scatter in the data, with the dissolution rates of the HW39 glasses and the SRL glass being comparable within this scatter. The dissolution rate of SRL-202 at 80 C and pH 7, for silicic acid concentrations of 0, 25, 50, and 100% of saturation, was found to decrease dramatically at only 25% of the saturated silicic acid concentration, which does not conform to the linear theory
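
    A Grambow-type linear (first-order) rate law is commonly written as below; the symbols are assumed here rather than quoted from the report, with k_f the forward dissolution rate constant and the bracketed terms the silicic acid concentration and its saturation value.

        r \;=\; k_f \left( 1 - \frac{[\mathrm{H_4SiO_4}]}{[\mathrm{H_4SiO_4}]_{\mathrm{sat}}} \right)

    Under this form the rate at 25% of silicic acid saturation would still be 75% of the forward rate, which is why the sharp decrease observed for SRL-202 at 25% saturation is inconsistent with the linear theory.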

  20. Proceedings of International monitoring conference 'Development of rehabilitation methodology of environment of the Semipalatinsk region polluted by nuclear tests'

    International Nuclear Information System (INIS)

    2002-01-01

    The aim of the monitoring conference is to draw the attention of government, national and international agencies, scientific societies, and local administrations to the ecological problems of the Semipalatinsk nuclear test site, and to combine the efforts of scientists to solve problems of soil decontamination and the purification of surface and ground water from radioactive substances and heavy metals. It is expected that the knowledge, experience and methodology accumulated at the monitoring conference might be successfully transferred to solve analogous environmental problems in Kazakhstan

  1. Testing the control of mineral supply rates on chemical erosion in the Klamath Mountains

    Science.gov (United States)

    West, N.; Ferrier, K.

    2017-12-01

    The relationship between rates of chemical erosion and mineral supply is central to many problems in Earth science, including the role of tectonics in the global carbon cycle, nutrient supply to soils and streams via soil production, and lithologic controls on landscape evolution. We aim to test the relationship between mineral supply rates and chemical erosion in the forested uplands of the Klamath Mountains, along a latitudinal transect of granodioritic plutons that spans an expected gradient in mineral supply rates associated with the geodynamic response to the migration of the Mendocino Triple Junction. We present 10Be-derived erosion rates and Zr-derived chemical depletion factors, as well as bulk soil and rock geochemistry, for 10 ridgetops along the transect to test hypotheses about supply-limited and kinetically limited chemical erosion. Previous studies in this area, comparing basin-averaged erosion rates and modeled uplift rates, suggest this region may be adjusted to an approximate steady state. Our preliminary results suggest that chemical erosion at these sites is influenced by both mineral supply rates and dissolution kinetics.
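
    The Zr-based chemical depletion fraction and the partitioning of denudation into chemical and physical components are conventionally written as below; these standard definitions are assumed here, not quoted from the abstract.

        \mathrm{CDF} \;=\; 1 - \frac{[\mathrm{Zr}]_{\mathrm{rock}}}{[\mathrm{Zr}]_{\mathrm{soil}}}, \qquad W \;=\; D \cdot \mathrm{CDF}

    where D is the total denudation rate (here constrained by 10Be) and W the chemical erosion rate.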

  2. Development and validation of a Chinese music quality rating test.

    Science.gov (United States)

    Cai, Yuexin; Zhao, Fei; Zheng, Yiqing

    2013-09-01

    The present study aims to develop and validate a Chinese music quality rating test (MQRT). In Experiment 1, 22 music pieces were initially selected and paired as a 'familiar music piece' and 'unfamiliar music piece' based on familiarities amongst the general public in the categories of classical music (6), Chinese folk music (8), and pop music (8). Following the selection criteria, one pair of music pieces from each music category was selected and used for the MQRT in Experiment 2. In Experiment 2, the MQRT was validated using these music pieces in the categories 'Pleasantness', 'Naturalness', 'Fullness', 'Roughness', and 'Sharpness'. Seventy-two adult participants and 30 normal-hearing listeners were recruited in Experiments 1 and 2, respectively. Significant differences between the familiar and unfamiliar music pieces were found in respect of pleasantness rating for folk and pop music pieces as well as in sharpness rating for pop music pieces. The comparison of music category effect on MQRT found significant differences in pleasantness, fullness, and sharpness ratings. The Chinese MQRT developed in the present study is an effective tool for assessing music quality.

  3. Crash test rating and likelihood of major thoracoabdominal injury in motor vehicle crashes: the new car assessment program side-impact crash test, 1998-2010.

    Science.gov (United States)

    Figler, Bradley D; Mack, Christopher D; Kaufman, Robert; Wessells, Hunter; Bulger, Eileen; Smith, Thomas G; Voelzke, Bryan

    2014-03-01

    The National Highway Traffic Safety Administration's New Car Assessment Program (NCAP) implemented side-impact crash testing on all new vehicles since 1998 to assess the likelihood of major thoracoabdominal injuries during a side-impact crash. Higher crash test rating is intended to indicate a safer car, but the real-world applicability of these ratings is unknown. Our objective was to determine the relationship between a vehicle's NCAP side-impact crash test rating and the risk of major thoracoabdominal injury among the vehicle's occupants in real-world side-impact motor vehicle crashes. The National Automotive Sampling System Crashworthiness Data System contains detailed crash and injury data in a sample of major crashes in the United States. For model years 1998 to 2010 and crash years 1999 to 2010, 68,124 occupants were identified in the Crashworthiness Data System database. Because 47% of cases were missing crash severity (ΔV), multiple imputation was used to estimate the missing values. The primary predictor of interest was the occupant vehicle's NCAP side-impact crash test rating, and the outcome of interest was the presence of major (Abbreviated Injury Scale [AIS] score ≥ 3) thoracoabdominal injury. In multivariate analysis, increasing NCAP crash test rating was associated with lower likelihood of major thoracoabdominal injury at high (odds ratio [OR], 0.8; 95% confidence interval [CI], 0.7-0.9; p NCAP side-impact crash test rating is associated with a lower likelihood of major thoracoabdominal trauma. Epidemiologic study, level III.

  4. Munitions and Explosives of Concern Survey Methodology and In-field Testing for Wind Energy Areas on the Atlantic Outer Continental Shelf

    Science.gov (United States)

    DuVal, C.; Carton, G.; Trembanis, A. C.; Edwards, M.; Miller, J. K.

    2017-12-01

    Munitions and explosives of concern (MEC) are present in U.S. waters as a result of past and ongoing live-fire testing and training, combat operations, and sea disposal. To identify MEC that may pose a risk to human safety during development of offshore wind facilities on the Atlantic Outer Continental Shelf (OCS), the Bureau of Ocean Energy Management (BOEM) is preparing to develop guidance on risk analysis and selection processes for methods and technologies to identify MEC in Wind Energy Areas (WEA). This study developed a process for selecting appropriate technologies and methodologies for MEC detection using a synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. Personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers not privy to the surrogate locations tested and optimized the selected methodology to find and identify the placed targets. This in-field trial, conducted in July 2016, emphasized the use of multiple sensors for MEC detection, and led to further guidance for future MEC detection efforts on the Atlantic OCS. An April 2017 follow-on study determined the fate of the munitions surrogates after the Atlantic storm season had passed. Using regional hydrodynamic models and incorporating the recommendations from the 2016 field trial, the follow-on study examined the fate of the MEC and compared the findings to existing research on munitions mobility, as well as models developed as part of the Office of Naval Research Mine-Burial Program. Focus was given to characterizing the influence of sediment type on surrogate munitions behavior and the influence of morphodynamics and object burial on MEC detection. Supporting Mine-Burial models, ripple bedforms were observed to impede surrogate scour and burial in coarse sediments, while surrogate burial was both predicted and observed in finer sediments. Further, incorporation of

  5. Methodology for determining time-dependent mechanical properties of tuff subjected to near-field repository conditions

    International Nuclear Information System (INIS)

    Blacic, J.D.; Andersen, R.

    1983-01-01

    We have established a methodology to determine the time dependence of strength and transport properties of tuff under conditions appropriate to a nuclear waste repository. Exploratory tests to determine the approximate magnitudes of thermomechanical property changes are nearly complete. In this report we describe the capabilities of an apparatus designed to precisely measure the time-dependent deformation and permeability of tuff at simulated repository conditions. Preliminary tests with this new apparatus indicate that microclastic creep failure of tuff occurs over a narrow strain range with little precursory Tertiary creep behavior. In one test, deformation under conditions of slowly decreasing effective pressure resulted in failure, whereas some strain indicators showed a decreasing rate of strain

  6. Methodology for determining time-dependent mechanical properties of tuff subjected to near-field repository conditions

    Energy Technology Data Exchange (ETDEWEB)

    Blacic, J.D.; Andersen, R.

    1983-01-01

    We have established a methodology to determine the time dependence of strength and transport properties of tuff under conditions appropriate to a nuclear waste repository. Exploratory tests to determine the approximate magnitudes of thermomechanical property changes are nearly complete. In this report we describe the capabilities of an apparatus designed to precisely measure the time-dependent deformation and permeability of tuff at simulated repository conditions. Preliminary tests with this new apparatus indicate that microclastic creep failure of tuff occurs over a narrow strain range with little precursory Tertiary creep behavior. In one test, deformation under conditions of slowly decreasing effective pressure resulted in failure, whereas some strain indicators showed a decreasing rate of strain.

  7. Dependency of Shear Strength on Test Rate in SiC/BSAS Ceramic Matrix Composite at Elevated Temperature

    Science.gov (United States)

    Choi, Sung R.; Bansal, Narottam P.; Gyekenyesi, John P.

    2003-01-01

    Both interlaminar and in-plane shear strengths of a unidirectional Hi-Nicalon(TM) fiber-reinforced barium strontium aluminosilicate (SiC/BSAS) composite were determined at 1100 C in air as a function of test rate using double-notch shear test specimens. The composite exhibited a significant effect of test rate on shear strength, regardless of whether the orientation was interlaminar or in-plane, resulting in an appreciable shear-strength degradation of about 50 percent as the test rate decreased from 3.3×10⁻¹ mm/s to 3.3×10⁻⁵ mm/s. The rate dependency of the composite's shear strength was very similar to that of the ultimate tensile strength at 1100 C observed in a similar composite (2-D SiC/BSAS), in which tensile strength decreased by about 60 percent when the test rate varied from the highest (5 MPa/s) to the lowest (0.005 MPa/s). A phenomenological, power-law slow crack growth formulation was proposed to account for the rate dependency of the shear strength of the composite.
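
    A power-law slow-crack-growth treatment of this kind of rate dependence is commonly written as below; the exact formulation used in the paper may differ, so this is only the generic form, with A and n the slow-crack-growth parameters.

        v \;=\; A\,K_{I}^{\,n} \qquad\Longrightarrow\qquad \sigma_f \;\propto\; \dot{\sigma}^{\,1/(n+1)}

    so that the measured strength increases weakly with applied stressing rate, consistent with the observed strength loss at slower test rates.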

  8. Does Experience Rating Improve Obstetric Practices?

    DEFF Research Database (Denmark)

    Amaral-Garcia, Sofia; Bertoli, Paola; Grembi, Veronica

    We provide an assessment of the introduction of experience rating for medical malpractice insurance using 2002-2009 inpatient discharge records data on deliveries from the Italian Region of Piedmont. Considering experience rating as an increase in medical malpractice pressure, we show that such i...... specification. We show that our results are robust to the different methodologies, and they can be explained in terms of a reduction in the discretion over obstetric decisions rather than a change in the risk profile of the patients....... schedules of non-economic damages to set compensations for personal injuries and 6 do not. We use these ex-ante policy conditions to distinguish treated from control and implement first a difference-in-differences analysis, the robustness of which we test through a basic difference-in-discontinuities

  9. Somatic and gastrointestinal in vivo biotransformation rates of hydrophobic chemicals in fish.

    Science.gov (United States)

    Lo, Justin C; Campbell, David A; Kennedy, Christopher J; Gobas, Frank A P C

    2015-10-01

    To improve current bioaccumulation assessment methods, a methodology is developed, applied, and investigated for measuring in vivo biotransformation rates of hydrophobic organic substances in the body (soma) and gastrointestinal tract of the fish. The method resembles the Organisation for Economic Co-operation and Development (OECD) 305 dietary bioaccumulation test but includes reference chemicals to determine both somatic and gastrointestinal biotransformation rates of test chemicals. Somatic biotransformation rate constants for the test chemicals ranged between 0 d⁻¹ and 0.38 (standard error [SE] 0.03) d⁻¹. Gastrointestinal biotransformation rate constants varied from 0 d⁻¹ to 46 (SE 7) d⁻¹. Gastrointestinal biotransformation contributed more to the overall biotransformation in fish than somatic biotransformation for all test substances but one. Results suggest that biomagnification tests can reveal the full extent of biotransformation in fish. The common presumption that the liver is the main site of biotransformation may not apply to many substances exposed through the diet. The results suggest that the application of quantitative structure-activity relationships (QSARs) for somatic biotransformation rates and hepatic in vitro models to assess the effect of biotransformation on bioaccumulation can underestimate biotransformation rates and overestimate the biomagnification potential of chemicals that are biotransformed in the gastrointestinal tract. With some modifications, the OECD 305 test can generate somatic and gastrointestinal biotransformation data to develop biotransformation QSARs and test in vitro-in vivo biotransformation extrapolation methods. © 2015 SETAC.

  10. Response Surface Methodology: An Extensive Potential to Optimize in vivo Photodynamic Therapy Conditions

    International Nuclear Information System (INIS)

    Tirand, Loraine; Bastogne, Thierry; Bechet, Denise M.Sc.; Linder, Michel; Thomas, Noemie; Frochot, Celine; Guillemin, Francois; Barberi-Heyob, Muriel

    2009-01-01

    Purpose: Photodynamic therapy (PDT) is based on the interaction of a photosensitizing (PS) agent, light, and oxygen. Few new PS agents are being developed to the in vivo stage, partly because of the difficulty in finding the right treatment conditions. Response surface methodology, an empirical modeling approach based on data resulting from a set of designed experiments, was suggested as a rational solution with which to select in vivo PDT conditions by using a new peptide-conjugated PS agent targeting neuropilin-1. Methods and Materials: A Doehlert experimental design was selected to model effects and interactions of the PS dose, fluence, and fluence rate on the growth of U87 human malignant glioma cell xenografts in nude mice, using a fixed drug-light interval. All experimental results were computed by Nemrod-W software and Matlab. Results: Intrinsic diameter growth rate, a tumor growth parameter independent of the initial volume of the tumor, was selected as the response variable and was compared to tumor growth delay and relative tumor volumes. With only 13 experimental conditions tested, an optimal PDT condition was selected (PS agent dose, 2.80 mg/kg; fluence, 120 J/cm²; fluence rate, 85 mW/cm²). Treatment of glioma-bearing mice with the peptide-conjugated PS agent, followed by the optimized PDT condition, showed a statistically significant improvement in delaying tumor growth compared with animals who received the PDT with the nonconjugated PS agent. Conclusions: Response surface methodology appears to be a useful experimental approach for rapid testing of different treatment conditions and determination of optimal values of PDT factors for any PS agent.
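
    The response surface fitted to a Doehlert design is typically a generic second-order polynomial in the coded factors (here PS dose, fluence and fluence rate); the form below is the standard model, not necessarily the exact one used by the authors.

        y \;=\; \beta_0 + \sum_{i=1}^{3}\beta_i x_i + \sum_{i=1}^{3}\beta_{ii} x_i^{2} + \sum_{i<j}\beta_{ij} x_i x_j + \epsilon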

  11. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution.

    Science.gov (United States)

    Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R

    2013-11-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Testing nonlinearities between Brazilian exchange rate and inflation volatilities

    Directory of Open Access Journals (Sweden)

    Christiane R. Albuquerque

    2006-12-01

    Full Text Available There are few studies directly addressing exchange rate and inflation volatilities, and little consensus among them. However, this kind of study is necessary, especially under an inflation-targeting system where the monetary authority must know price behavior well. This article analyses the relation between exchange rate and inflation volatilities using a bivariate GARCH model, therefore modeling conditional volatilities, an approach largely unexplored in the literature. We find a semi-concave relation between those series, and this nonlinearity may explain their apparent disconnection under a floating exchange rate system. The article also shows that traditional tests, with non-conditional volatilities, are not robust.
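
    The study fits a bivariate GARCH model, which is not available off the shelf in most Python libraries; as a much simpler stand-in, the sketch below fits univariate GARCH(1,1) models to each series with the arch package and correlates the resulting conditional volatilities. Series names and data are placeholders.

        import numpy as np
        import pandas as pd
        from arch import arch_model

        rng = np.random.default_rng(1)
        fx_returns = pd.Series(rng.normal(0.0, 1.0, 1000))    # mock exchange-rate returns
        inflation = pd.Series(rng.normal(0.0, 0.5, 1000))     # mock inflation series

        cond_vol = {}
        for name, series in {"fx": fx_returns, "inflation": inflation}.items():
            res = arch_model(series, vol="Garch", p=1, q=1).fit(disp="off")
            cond_vol[name] = res.conditional_volatility

        print(np.corrcoef(cond_vol["fx"], cond_vol["inflation"])[0, 1])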

  13. 47 CFR 65.800 - Rate base.

    Science.gov (United States)

    2010-10-01

    47 CFR Section 65.800 (Title 47, Telecommunication, 2010 edition): Federal Communications Commission, Common Carrier Services, Interstate Rate of Return Prescription Procedures and Methodologies, Rate Base. § 65.800 Rate base. The rate base shall...

  14. Development of a calibration methodology and tests of kerma area product meters

    International Nuclear Information System (INIS)

    Costa, Nathalia Almeida

    2013-01-01

    The quantity kerma-area product (PKA) is important for establishing reference levels in diagnostic radiology exams. This quantity can be obtained using a PKA meter. The use of such meters is essential to evaluate the radiation dose in radiological procedures and is a good indicator to make sure that the dose limit to the patient's skin is not exceeded. These meters are sometimes fixed to the X-ray equipment, which makes their calibration difficult. In this work, a methodology for the calibration of PKA meters was developed. The instrument used for this purpose was the Patient Dose Calibrator (PDC), developed to be used as a reference to check the calibration of PKA and air-kerma meters used for patient dosimetry and to verify the consistency and behavior of automatic exposure control systems. Because it is a new instrument that is not yet used in Brazil as reference equipment for calibration, quality control of this equipment was also performed, with characterization tests, calibration and an evaluation of its energy dependence. After the tests, it was shown that the PDC can be used as a reference instrument and that the calibration must be performed in situ, so that the characteristics of each X-ray unit where the PKA meters are used are taken into account. The calibration was then performed with portable PKA meters and on an interventional radiology unit with a fixed PKA meter. The results were good and demonstrated the need for calibration of these meters and the importance of in situ calibration with a reference meter. (author)

  15. Multi-axial Creep and the LICON Methodology for Accelerated Creep Testing

    International Nuclear Information System (INIS)

    Bowyer, William H.

    2006-05-01

    The copper-iron canister for disposal of nuclear waste in the Swedish Programme has a design life exceeding 100,000 years. Whilst the operating temperature (100 deg C max.) and operating stress (50 MPa max.) are modest, the very long design life does require that the likely creep performance of the canister should be investigated. Many studies have been carried out by SKB but these have all involved very short duration tests at relatively high stresses. The process of predicting canister creep life by extrapolation of data from such tests has been challenged for two main reasons. The first is that the deformation and failure mechanisms in the tests employed are different from the mechanism expected under service conditions and the second is that the extrapolation is extreme. It has been recognised that there is usually scope for some increase in test temperatures and stresses which will accelerate the development of creep damage without compromising the use of extrapolation for life prediction. Cane demonstrated that in steels designed for high temperature and pressure applications, conditions of multi-axial stressing could lead to increases or decreases in the rate of damage accumulation without changing the damage mechanism. This provided a third method for accelerating creep testing which has been implemented as the LICON method. This report aims to explain the background to the LICON method and its application to the case of the copper canister. It seems likely that the method could be used to improve our knowledge of the creep resistance of the copper canister. Multiplication factors that may be achieved by the technique could be increased by attention to specimen design but an extensive and targeted programme of data collection on creep of copper would still be needed to implement the method to best advantage

  16. Multi-axial Creep and the LICON Methodology for Accelerated Creep Testing

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, William H. [Meadow End Farm, Farnham (United Kingdom)

    2006-05-15

    The copper-iron canister for disposal of nuclear waste in the Swedish Programme has a design life exceeding 100,000 years. Whilst the operating temperature (100 deg C max.) and operating stress (50 MPa max.) are modest, the very long design life does require that the likely creep performance of the canister should be investigated. Many studies have been carried out by SKB but these have all involved very short duration tests at relatively high stresses. The process of predicting canister creep life by extrapolation of data from such tests has been challenged for two main reasons. The first is that the deformation and failure mechanisms in the tests employed are different from the mechanism expected under service conditions and the second is that the extrapolation is extreme. It has been recognised that there is usually scope for some increase in test temperatures and stresses which will accelerate the development of creep damage without compromising the use of extrapolation for life prediction. Cane demonstrated that in steels designed for high temperature and pressure applications, conditions of multi-axial stressing could lead to increases or decreases in the rate of damage accumulation without changing the damage mechanism. This provided a third method for accelerating creep testing which has been implemented as the LICON method. This report aims to explain the background to the LICON method and its application to the case of the copper canister. It seems likely that the method could be used to improve our knowledge of the creep resistance of the copper canister. Multiplication factors that may be achieved by the technique could be increased by attention to specimen design but an extensive and targeted programme of data collection on creep of copper would still be needed to implement the method to best advantage.

  17. Motor operated valve testing and the 'rate of loading' phenomenon

    International Nuclear Information System (INIS)

    Black, B.R.

    1991-01-01

    This paper discusses valve design features which affect the ability to predict motor-operated valve (MOV) performance and reviews factors which should be considered when selecting switch settings to limit stem loads. Considerable attention is given to the rate-of-loading phenomenon, which affects the relationship between valve stem thrust and actuator spring pack deflection. Equations are developed, and testing is discussed, which together permit the construction of an MOV dynamic model. Factors which must be considered to keep switch settings correct throughout the life of the plant are discussed, and switch setting acceptance criteria for use with baseline static and design basis testing are suggested

  18. Effect of Control Mode and Test Rate on the Measured Fracture Toughness of Advanced Ceramics

    Science.gov (United States)

    Hausmann, Bronson D.; Salem, Jonathan A.

    2018-01-01

    The effects of control mode and test rate on the measured fracture toughness of ceramics were evaluated by using chevron-notched flexure specimens in accordance with ASTM C1421. The use of stroke control gave consistent results with about 2% (statistically insignificant) variation in measured fracture toughness for a very wide range of rates (0.005 to 0.5 mm/min). Use of strain or crack mouth opening displacement (CMOD) control gave approx. 5% (statistically significant) variation over a very wide range of rates (1 to 80 µm/m/s), with the measurements being a function of rate. However, the rate effect was eliminated by use of dry nitrogen, implying a stress corrosion effect rather than a stability effect. With the use of a nitrogen environment during strain-controlled tests, fracture toughness values were within about 1% over a wide range of rates (1 to 80 µm/m/s). CMOD or strain control did allow stable crack extension well past maximum force, and thus is preferred for energy calculations. The effort is being used to confirm recommendations in ASTM Test Method C1421 on fracture toughness measurement.

  19. Research methodology in recurrent pregnancy loss.

    Science.gov (United States)

    Christiansen, Ole B

    2014-03-01

    The aim of this article is to highlight pitfalls in research methodology that may explain why studies in recurrent pregnancy loss (RPL) often provide very divergent results. It is hoped that insight into this issue may help clinicians decide which published studies are the most valid. It may help researchers to eliminate methodological flaws in future studies, which may hopefully come to some kind of agreement about the usefulness of diagnostic tests and treatments in RPL. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. [The maximum heart rate in the exercise test: the 220-age formula or Sheffield's table?].

    Science.gov (United States)

    Mesquita, A; Trabulo, M; Mendes, M; Viana, J F; Seabra-Gomes, R

    1996-02-01

    To determine whether the maximum heart rate in exercise testing of apparently healthy individuals is more properly estimated by the 220-age formula (Astrand) or by the Sheffield table. Retrospective analysis of the clinical history and exercise tests of apparently healthy individuals submitted to cardiac check-up. Sequential sampling of 170 healthy individuals submitted to cardiac check-up between April 1988 and September 1992. Comparison of the maximum heart rate of individuals studied with the Bruce and modified Bruce protocols, in exercise tests interrupted by fatigue, with the values estimated by the formulae: 220-age versus the Sheffield table. The maximum heart rate is similar with both protocols. This parameter in normal individuals is better predicted by the 220-age formula. The theoretical maximum heart rate determined by the 220-age formula should be recommended for healthy individuals, and for this reason the Sheffield table has been excluded from our clinical practice.
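
    The age-predicted formula being compared is simple enough to state with a worked example (the Sheffield values, by contrast, are read from a table rather than computed):

        \mathrm{HR}_{\max} \;=\; 220 - \mathrm{age}, \qquad \mathrm{HR}_{\max}(40\ \mathrm{yr}) \;=\; 220 - 40 \;=\; 180\ \mathrm{beats/min}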

  1. Sludge Batch 5 Slurry Fed Melt Rate Furnace Test with Frits 418 and 550

    International Nuclear Information System (INIS)

    Miller, Donald; Pickenheim, Bradley

    2009-01-01

    Based on Melt Rate Furnace (MRF) testing for the Sludge Batch 5 (SB5) projected composition and assessments of the potential frits with reasonable operating windows, the Savannah River National Laboratory (SRNL) recommended Slurry Fed Melt Rate Furnace (SMRF) testing with Frits 418 and 550. DWPF is currently using Frit 418 with SB5 based on SRNL's recommendation due to its ability to accommodate significant sodium variation in the sludge composition. However, experience with high boron containing frits in DWPF indicated a potential advantage for Frit 550 might exist. Therefore, SRNL performed SMRF testing to assess Frit 550's potential advantages. The results of SMRF testing with SB5 simulant indicate that there is no appreciable difference in melt rate between Frit 418 and Frit 550 at a targeted 34 weight % waste loading. Both batches exhibited comparable behavior when delivered through the feed tube by the peristaltic pump. Limited observation of the cold cap during both runs showed no indication of major cold cap mounding. MRF testing, performed after the SMRF runs due to time constraints, with the same two Slurry Mix Evaporator (SME) dried products led to the same conclusion. Although visual observations of the cross-sectioned MRF beakers indicated differences in the appearance of the two systems, the measured melt rates were both ∼0.6 in/hr. Therefore, SRNL does not recommend a change from Frit 418 for the initial SB5 processing in DWPF. Once the actual SB5 composition is known and revised projections of SB5 after the neptunium stream addition and any decants is provided, SRNL will perform an additional compositional window assessment with Frit 418. If requested, SRNL can also include other potential frits in this assessment should processing of SB5 with Frit 418 result in less than desirable melter throughput in DWPF. The frits would then be subjected to melt rate testing at SRNL to determine any potential advantages

  2. Ratings are Overrated!

    Directory of Open Access Journals (Sweden)

    Georgios N. Yannakakis

    2015-07-01

    Full Text Available Are ratings of any use in human-computer interaction and user studies at large? If ratings are of limited use, is there a better alternative for quantitative subjective assessment? Beyond the intrinsic shortcomings of human reporting, there are a number of supplementary limitations and fundamental methodological flaws associated with rating-based questionnaires --- i.e. questionnaires that ask participants to rate their level of agreement with a given statement such as a Likert item. While the effect of these pitfalls has been largely downplayed, recent findings from diverse areas of study question the reliability of using ratings. Rank-based questionnaires --- i.e. questionnaires that ask participants to rank two or more options --- appear as the evident alternative that not only eliminates the core limitations of ratings but also simplifies the use of sound methodologies that yield more reliable models of the underlying reported construct: user emotion, preference, or opinion. This paper solicits recent findings from various disciplines interlinked with psychometrics and offers a quick guide for the use, processing and analysis of rank-based questionnaires for the unique advantages they offer. The paper challenges the traditional state-of-practice in human-computer interaction and psychometrics directly contributing towards a paradigm shift in subjective reporting.

  3. A validity test of movie, television, and video-game ratings.

    Science.gov (United States)

    Walsh, D A; Gentile, D A

    2001-06-01

    Numerous studies have documented the potential effects on young audiences of violent content in media products, including movies, television programs, and computer and video games. Similar studies have evaluated the effects associated with sexual content and messages. Cumulatively, these effects represent a significant public health risk for increased aggressive and violent behavior, spread of sexually transmitted diseases, and pediatric pregnancy. In partial response to these risks and to public and legislative pressure, the movie, television, and gaming industries have implemented ratings systems intended to provide information about the content and appropriate audiences for different films, shows, and games. To test the validity of the current movie-, television-, and video game-rating systems. Panel study. Participants used the KidScore media evaluation tool, which evaluates films, television shows, and video games on 10 aspects, including the appropriateness of the media product for children based on age. When an entertainment industry rates a product as inappropriate for children, parent raters agree that it is inappropriate for children. However, parent raters disagree with industry usage of many of the ratings designating material suitable for children of different ages. Products rated as appropriate for adolescents are of the greatest concern. The level of disagreement varies from industry to industry and even from rating to rating. Analysis indicates that the amount of violent content and portrayals of violence are the primary markers for disagreement between parent raters and industry ratings. As 1 part of a solution to the complex public health problems posed by violent and sexually explicit media products, ratings can have value if used with caution. Parents and caregivers relying on the ratings systems to guide their children's use of media products should continue to monitor content independently. Industry ratings systems should be revised with input

  4. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  5. New tests of the distal speech rate effect: Examining cross-linguistic generalization

    Directory of Open Access Journals (Sweden)

    Laura eDilley

    2013-12-01

    Full Text Available Recent findings [Dilley and Pitt, 2010. Psych. Science. 21, 1664-1670] have shown that manipulating context speech rate in English can cause entire syllables to disappear or appear perceptually. The current studies tested two rate-based explanations of this phenomenon while attempting to replicate and extend these findings to another language, Russian. In Experiment 1, native Russian speakers listened to Russian sentences which had been subjected to rate manipulations and performed a lexical report task. Experiment 2 investigated speech rate effects in cross-language speech perception; non-native speakers of Russian of both high and low proficiency were tested on the same Russian sentences as in Experiment 1. They decided between two lexical interpretations of a critical portion of the sentence, where one choice contained more phonological material than the other (e.g., /stərʌ'na/ 'side' vs. /strʌ'na/ 'country'). In both experiments, with native and non-native speakers of Russian, context speech rate and the relative duration of the critical sentence portion were found to influence the amount of phonological material perceived. The results support the generalized rate normalization hypothesis, according to which the content perceived in a spectrally ambiguous stretch of speech depends on the duration of that content relative to the surrounding speech, while showing that the findings of Dilley and Pitt (2010) extend to a variety of morphosyntactic contexts and a new language, Russian. Findings indicate that relative timing cues across an utterance can be critical to accurate lexical perception by both native and non-native speakers.

  6. The corrosion rate of copper in a bentonite test package measured with electric resistance sensors

    Energy Technology Data Exchange (ETDEWEB)

    Rosborg, Bo [Division of Surface and Corrosion Science, KTH, Stockholm (Sweden); Kosec, Tadeja; Kranjc, Andrej; Kuhar, Viljem; Legat, Andraz [Slovenian National Building and Civil Engineering Institute, Ljubljana (Slovenia)

    2012-12-15

    LOT1 test parcel A2 was exposed for six years in the Aespoe Hard Rock Laboratory, which offers a realistic environment for the conditions that will prevail in a deep repository for high-level radioactive waste disposal in Sweden. The test parcel contained copper electrodes for real-time corrosion monitoring in bentonite ring 36, where the temperature was 24 deg C, and copper coupons in bentonite rings 22 and 30, where the temperature was higher. After retrieval of the test parcel in January 2006, a bentonite test package consisting of bentonite rings 35 - 37 was placed in a container and sealed with a thick layer of paraffin. Later the same year new copper electrodes were installed in the test package. In January 2007 electric resistance (ER) sensors of pure copper with a thickness of 35 μm were also installed in the test package, mainly to facilitate the interpretation of the results from the real-time corrosion monitoring with electrochemical techniques. The ER measurements have shown that the corrosion rate of pure copper exposed in an oxic bentonite/saline groundwater environment at room temperature decreases slowly with time to low but measurable values. The corrosion rates estimated from the regularly performed EIS measurements replicate the ER data. Thus, for this oxic environment, in which copper acquires corrosion potentials of the order of 200 mV (SHE) or higher, electrochemical measurements provide believable data. Comparing the recorded ER data with an estimate of the average corrosion rate based on comparing cross-sections from exposed and protected sensor elements, it is obvious that the former overestimates the actual corrosion rate, which is understandable. It seems as if electrochemical measurements can provide a better estimate of the corrosion rate; however, this is quite dependent on the use of proper measuring frequencies and evaluation methods. In this respect ER measurements are more reliable. It has been shown that real-time corrosion

  7. The corrosion rate of copper in a bentonite test package measured with electric resistance sensors

    International Nuclear Information System (INIS)

    Rosborg, Bo; Kosec, Tadeja; Kranjc, Andrej; Kuhar, Viljem; Legat, Andraz

    2012-12-01

    LOT1 test parcel A2 was exposed for six years in the Aespoe Hard Rock Laboratory, which offers a realistic environment for the conditions that will prevail in a deep repository for high-level radioactive waste disposal in Sweden. The test parcel contained copper electrodes for real-time corrosion monitoring in bentonite ring 36, where the temperature was 24 deg C, and copper coupons in bentonite rings 22 and 30, where the temperature was higher. After retrieval of the test parcel in January 2006, a bentonite test package consisting of bentonite rings 35 - 37 was placed in a container and sealed with a thick layer of paraffin. Later the same year new copper electrodes were installed in the test package. In January 2007 electric resistance (ER) sensors of pure copper with a thickness of 35 μm were also installed in the test package, mainly to facilitate the interpretation of the results from the real-time corrosion monitoring with electrochemical techniques. The ER measurements have shown that the corrosion rate of pure copper exposed in an oxic bentonite/saline groundwater environment at room temperature decreases slowly with time to low but measurable values. The corrosion rates estimated from the regularly performed EIS measurements replicate the ER data. Thus, for this oxic environment, in which copper acquires corrosion potentials of the order of 200 mV (SHE) or higher, electrochemical measurements provide believable data. Comparing the recorded ER data with an estimate of the average corrosion rate based on comparing cross-sections from exposed and protected sensor elements, it is obvious that the former overestimates the actual corrosion rate, which is understandable. It seems as if electrochemical measurements can provide a better estimate of the corrosion rate; however, this is quite dependent on the use of proper measuring frequencies and evaluation methods. In this respect ER measurements are more reliable. It has been shown that real-time corrosion

  8. From Theory-Inspired to Theory-Based Interventions: A Protocol for Developing and Testing a Methodology for Linking Behaviour Change Techniques to Theoretical Mechanisms of Action.

    Science.gov (United States)

    Michie, Susan; Carey, Rachel N; Johnston, Marie; Rothman, Alexander J; de Bruin, Marijn; Kelly, Michael P; Connell, Lauren E

    2018-05-18

    Understanding links between behaviour change techniques (BCTs) and mechanisms of action (the processes through which they affect behaviour) helps inform the systematic development of behaviour change interventions. This research aims to develop and test a methodology for linking BCTs to their mechanisms of action. Study 1 (published explicit links): Hypothesised links between 93 BCTs (from the 93-item BCT taxonomy, BCTTv1) and mechanisms of action will be identified from published interventions and their frequency, explicitness and precision documented. Study 2 (expert-agreed explicit links): Behaviour change experts will identify links between 61 BCTs and 26 mechanisms of action in a formal consensus study. Study 3 (integrated matrix of explicit links): Agreement between studies 1 and 2 will be evaluated and a new group of experts will discuss discrepancies. An integrated matrix of BCT-mechanism of action links, annotated to indicate strength of evidence, will be generated. Study 4 (published implicit links): To determine whether groups of co-occurring BCTs can be linked to theories, we will identify groups of BCTs that are used together from the study 1 literature. A consensus exercise will be used to rate strength of links between groups of BCT and theories. A formal methodology for linking BCTs to their hypothesised mechanisms of action can contribute to the development and evaluation of behaviour change interventions. This research is a step towards developing a behaviour change 'ontology', specifying relations between BCTs, mechanisms of action, modes of delivery, populations, settings and types of behaviour.

  9. Interaction of heat production, strain rate and stress power in a plastically deforming body under tensile test

    Science.gov (United States)

    Paglietti, A.

    1982-01-01

    At high strain rates the heat produced by plastic deformation can give rise to a rate dependent response even if the material has rate independent constitutive equations. This effect has to be evaluated when interpreting a material test, or else it could erroneously be ascribed to viscosity. A general thermodynamic theory of tensile testing of elastic-plastic materials is given in this paper; it is valid for large strain at finite strain rates. It enables discovery of the parameters governing the thermodynamic strain rate effect, provides a method for proper interpretation of the results of the tests of dynamic plasticity, and suggests a way of planning experiments in order to detect the real contribution of viscosity.

  10. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  11. Critical assessment of jet erosion test methodologies for cohesive soil and sediment

    Science.gov (United States)

    Karamigolbaghi, Maliheh; Ghaneeizad, Seyed Mohammad; Atkinson, Joseph F.; Bennett, Sean J.; Wells, Robert R.

    2017-10-01

    The submerged Jet Erosion Test (JET) is a commonly used technique to assess the erodibility of cohesive soil. Employing a linear excess shear stress equation and impinging jet theory, simple numerical methods have been developed to analyze data collected using a JET to determine the critical shear stress and erodibility coefficient of soil. These include the Blaisdell, Iterative, and Scour Depth Methods, and all have been organized into easy to use spreadsheet routines. The analytical framework of the JET and its associated methods, however, are based on many assumptions that may not be satisfied in field and laboratory settings. The main objective of this study is to critically assess this analytical framework and these methodologies. Part of this assessment is to include the effect of flow confinement on the JET. The possible relationship between the derived erodibility coefficient and critical shear stress, a practical tool in soil erosion assessment, is examined, and a review of the deficiencies in the JET methodology also is presented. Using a large database of JET results from the United States and data from literature, it is shown that each method can generate an acceptable curve fit through the scour depth measurements as a function of time. The analysis shows, however, that the Scour Depth and Iterative Methods may result in physically unrealistic values for the erosion parameters. The effect of flow confinement of the impinging jet increases the derived critical shear stress and decreases the erodibility coefficient by a factor of 2.4 relative to unconfined flow assumption. For a given critical shear stress, the length of time over which scour depth data are collected also affects the calculation of erosion parameters. In general, there is a lack of consensus relating the derived soil erodibility coefficient to the derived critical shear stress. Although empirical relationships are statistically significant, the calculated erodibility coefficient for a

  12. Test order in teacher-rated behavior assessments: Is counterbalancing necessary?

    Science.gov (United States)

    Kooken, Janice; Welsh, Megan E; McCoach, D Betsy; Miller, Faith G; Chafouleas, Sandra M; Riley-Tillman, T Chris; Fabiano, Gregory

    2017-01-01

    Counterbalancing treatment order in experimental research design is well established as an option to reduce threats to internal validity, but in educational and psychological research the effect of varying the order of multiple tests presented to a single rater has not been examined, and counterbalancing is rarely adhered to in practice. The current study examines the effect of test order on measures of student behavior by teachers as raters, utilizing data from a behavior measure validation study. Using multilevel modeling to control for students nested within teachers, the effect of rating an earlier measure on the intercept or slope of a later behavior assessment was statistically significant in 22% of predictor main effects for the spring test period. Test order effects had potential for high-stakes consequences, with differences large enough to change risk classification. Results suggest that researchers and practitioners in classroom settings using multiple measures evaluate the potential impact of test order. Where possible, they should counterbalance when the risk of an order effect exists and report justification for the decision not to counterbalance. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. GROUNDED THEORY METHODOLOGY and GROUNDED THEORY RESEARCH in TURKEY

    OpenAIRE

    ARIK, Ferhat; ARIK, Işıl Avşar

    2016-01-01

    This research discusses the historical development of the Grounded Theory Methodology, which is one of the qualitative research methods, its transformation over time, and how it is used as a methodology in Turkey. Grounded Theory, which was founded by Strauss and Glaser, is a qualitative methodology based on inductive logic to discover theories, in contrast with the deductive understanding in sociology, which is based on testing an existing theory. It is possible to examine the Grounded Theory...

  14. SPACE Code Assessment for FLECHT Test

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Hyoung Kyoun; Min, Ji Hong; Park, Chan Eok; Park, Seok Jeong; Kim, Shin Whan [KEPCO E and C, Daejeon (Korea, Republic of)

    2015-10-15

    According to 10 CFR 50 Appendix K, the Emergency Core Cooling System (ECCS) performance evaluation model for a LBLOCA should be based on the data of the FLECHT tests. The heat transfer coefficient (HTC) and Carryout Rate Fraction (CRF) during the reflood period of a LBLOCA should be conservative. To develop a Mass and Energy Release (MER) methodology using the Safety and Performance Analysis CodE (SPACE), FLECHT test results were compared to the results calculated by SPACE. The FLECHT test facility was modeled to compare the reflood HTC and CRF using SPACE. A sensitivity analysis was performed with various options for the HTC correlation. Based on this result, it is concluded that the reflood HTC and CRF calculated with the COBRA-TF correlation during a LBLOCA meet the requirement of 10 CFR 50 Appendix K. In this study, the analysis results using SPACE predict the heat transfer phenomena of the FLECHT tests reasonably and conservatively. Reflood HTCs for test numbers 0690, 3541 and 4225 are conservative in the reference case. In the case of test 6948, the HTC calculated using the COBRA-TF correlation is conservative in the film boiling region. All of the analysis results for CRF have sufficient conservatism. Based on these results, it is possible to apply the COBRA-TF correlation to develop the MER methodology for analyzing a LBLOCA using SPACE.

  15. Helicopter-Ship Qualification Testing

    NARCIS (Netherlands)

    Hoencamp, A.

    2015-01-01

    The goal of this research project is to develop a novel test methodology which can be used for optimizing cost and time efficiency of helicopter-ship qualification testing without reducing safety. For this purpose, the so-called “SHOL-X” test methodology has been established, which includes the

  16. Safety assessment of a borehole type disposal facility using the ISAM methodology

    International Nuclear Information System (INIS)

    Blerk, J.J. van; Yucel, V.; Kozak, M.W.; Moore, B.A.

    2002-01-01

    As part of the IAEA's Co-ordinated Research Project (CRP) on Improving Long-Term Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ISAM), three example cases were developed. The aim was to test the ISAM safety assessment methodology using data that were as realistic as possible. One of the Test Cases, the Borehole Test Case (BTC), related to a proposed future disposal option for disused sealed radioactive sources. This paper uses the various steps of the ISAM safety assessment methodology to describe the work undertaken by ISAM participants in developing the BTC and provides some general conclusions that can be drawn from the findings of their work. (author)

  17. Pulse superimposition calculational methodology for estimating the subcriticality level of nuclear fuel assemblies

    International Nuclear Information System (INIS)

    Talamo, Alberto; Gohar, Y.; Rabiti, C.; Aliberti, G.; Kondev, F.; Smith, D.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Serafimovich, I.

    2009-01-01

    One of the most reliable experimental methods for measuring the subcriticality level of a nuclear fuel assembly is the Sjoestrand method applied to the reaction rate generated from a pulsed neutron source. This study developed a new analytical methodology simulating the Sjoestrand method, which allows comparing the experimental and analytical reaction rates and the obtained subcriticality levels. In this methodology, the reaction rate due to a single neutron pulse is calculated using the MCNP/MCNPX computer code or any other neutron transport code that explicitly simulates the delayed fission neutrons. The calculation simulates a single neutron pulse over a long time period until the delayed neutron contribution to the reaction rate has vanished. The obtained reaction rate is then superimposed on itself, with respect to time, to simulate the repeated pulse operation until the asymptotic level of the reaction rate, set by the delayed neutrons, is achieved. The superimposition of the pulse on itself was calculated by a simple C computer program. A parallel version of the C program is used due to the large amount of data being processed, e.g. using the Message Passing Interface (MPI). The analytical results of this new calculation methodology have shown an excellent agreement with the experimental data available from the YALINA-Booster facility of Belarus. This methodology can be used to calculate the Bell and Glasstone spatial correction factor.
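
    As an illustration of the superimposition step described above, the sketch below (in Python, with a synthetic exponential-decay response standing in for the transport-code result and an assumed pulse period) adds time-shifted copies of a single-pulse reaction-rate history until the cycle-averaged rate reaches its asymptotic, delayed-neutron-set level. It is a minimal sketch of the idea, not the authors' C/MPI implementation.

```python
import numpy as np

def superimpose_pulses(single_pulse_rate, dt, period, tol=1e-6, max_pulses=100000):
    """Superimpose a single-pulse reaction-rate history on itself at a fixed pulse
    period until the cycle-averaged rate stops changing (the asymptotic level set
    by the delayed neutrons)."""
    shift = int(round(period / dt))            # time bins between successive pulses
    n = len(single_pulse_rate)
    total = np.array(single_pulse_rate, dtype=float)
    prev_cycle_mean = 0.0
    for k in range(1, max_pulses):
        start = k * shift
        if start >= n:
            break                              # later pulses fall outside the recorded window
        total[start:] += single_pulse_rate[: n - start]   # add pulse k, delayed by k*period
        cycle_mean = total[start:start + shift].mean()
        if abs(cycle_mean - prev_cycle_mean) < tol * max(cycle_mean, 1e-30):
            break                              # asymptotic level reached
        prev_cycle_mean = cycle_mean
    return total

# Toy single-pulse response: a prompt spike plus a slow delayed-neutron tail
dt, period = 1e-3, 0.05                        # seconds (assumed values)
t = np.arange(0.0, 20.0, dt)
single = 1e5 * np.exp(-t / 1e-3) + 50.0 * np.exp(-t / 8.0)
combined = superimpose_pulses(single, dt, period)
print("asymptotic reaction rate ~", combined[-int(period / dt):].mean())
```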

  18. Pulse superimposition calculational methodology for estimating the subcriticality level of nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, Alberto [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States)], E-mail: atalamo@anl.gov; Gohar, Y. [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Rabiti, C. [Idaho National Laboratory, P.O. Box 2528, Idaho Falls, ID 83403 (United States); Aliberti, G.; Kondev, F.; Smith, D.; Zhong, Z. [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Serafimovich, I. [Joint Institute for Power and Nuclear Research-Sosny, National Academy of Sciences (Belarus)

    2009-07-21

    One of the most reliable experimental methods for measuring the subcriticality level of a nuclear fuel assembly is the Sjoestrand method applied to the reaction rate generated from a pulsed neutron source. This study developed a new analytical methodology simulating the Sjoestrand method, which allows comparing the experimental and analytical reaction rates and the obtained subcriticality levels. In this methodology, the reaction rate due to a single neutron pulse is calculated using the MCNP/MCNPX computer code or any other neutron transport code that explicitly simulates the delayed fission neutrons. The calculation simulates a single neutron pulse over a long time period until the delayed neutron contribution to the reaction rate has vanished. The obtained reaction rate is then superimposed on itself, with respect to time, to simulate the repeated pulse operation until the asymptotic level of the reaction rate, set by the delayed neutrons, is achieved. The superimposition of the pulse on itself was calculated by a simple C computer program. A parallel version of the C program is used due to the large amount of data being processed, e.g. using the Message Passing Interface (MPI). The analytical results of this new calculation methodology have shown an excellent agreement with the experimental data available from the YALINA-Booster facility of Belarus. This methodology can be used to calculate the Bell and Glasstone spatial correction factor.

  19. Efficiency of the pre-heater against flow rate on primary the beta test loop

    International Nuclear Information System (INIS)

    Edy Sumarno; Kiswanta; Bambang Heru; Ainur R; Joko P

    2013-01-01

    The efficiency of the pre-heater has been calculated against the primary-side flow rate of the BETA Test Loop. The BETA Test Loop (UUB) is an experimental facility for studying thermal-hydraulic phenomena, especially post-LOCA (Loss of Coolant Accident) thermal hydraulics. The heat removal sequence of the BETA Test Loop contains a pre-heater that serves to transfer heat from the primary side to the secondary side; the efficiency is determined by comparing the incoming heat energy with the energy taken out by the secondary fluid. The characterization is intended to determine the performance of the pre-heater, which is then used as a tool for analysis and as a reference for designing experiments. The efficiency was calculated by operating the pre-heater with varied fluid flow rates on the primary side. The results show that the efficiency changes with every change of flow rate: it is 71.26% at 163.50 ml/s and 60.65% at 850.90 ml/s. The efficiency could be even greater if the pre-heater tank were wrapped with thermal insulation so that there is no heat leakage. (author)
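
    As a minimal illustration of the efficiency calculation described above, the sketch below compares the heat released by the primary fluid with the heat absorbed by the secondary fluid for one operating point. All flow rates, temperatures and fluid properties in the example are assumed values, not data from the BETA Test Loop.

```python
# Pre-heater efficiency = heat picked up by the secondary side / heat given up by
# the primary side.  Water properties are assumed constant for simplicity.

CP_WATER = 4186.0      # J/(kg K), assumed constant
RHO_WATER = 998.0      # kg/m^3, assumed constant

def preheater_efficiency(q_primary_mls, t_prim_in, t_prim_out,
                         q_secondary_mls, t_sec_in, t_sec_out):
    """Return Q_secondary / Q_primary for one operating point.
    Volumetric flows are given in ml/s, temperatures in deg C."""
    m_prim = q_primary_mls * 1e-6 * RHO_WATER              # kg/s
    m_sec = q_secondary_mls * 1e-6 * RHO_WATER              # kg/s
    q_in = m_prim * CP_WATER * (t_prim_in - t_prim_out)     # heat released by primary, W
    q_out = m_sec * CP_WATER * (t_sec_out - t_sec_in)       # heat absorbed by secondary, W
    return q_out / q_in

# Example operating point (hypothetical values)
eff = preheater_efficiency(163.50, 90.0, 60.0, 120.0, 25.0, 55.0)
print(f"pre-heater efficiency: {eff:.1%}")
```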

  20. Testing of currency substitution effect on exchange rate volatility in Serbia

    Directory of Open Access Journals (Sweden)

    Petrović Predrag

    2016-01-01

    Full Text Available Despite numerous different definitions existing in the literature, currency substitution is generally understood as a phenomenon in which domestic residents prefer to use foreign currency rather than domestic currency. The main reasons for such a phenomenon include high and volatile inflation, strong depreciation of the national currency and a high interest rate differential in favour of foreign currency. Currency substitution, as a monetary phenomenon, is widely spread in Latin American, Eastern European and some Asian countries. This paper is dedicated to the influence of currency substitution on exchange rate volatility in Serbia. The research included testing of three hypotheses: (i) currency substitution positively affects depreciation rate volatility, (ii) depreciation rate volatility has stronger responses to past negative than to past positive depreciation shocks, and (iii) currency substitution positively affects the expected depreciation rate. The analysis was implemented for the period 2002:m1-2015:m12 (2004:m1-2015:m12), applying a modified EGARCH-M model. Based on the obtained results, all three hypotheses have been decisively rejected regardless of the manner of quantification of currency substitution.

  1. Does provider-initiated HIV testing and counselling lead to higher HIV testing rate and HIV case finding in Rwandan clinics?

    NARCIS (Netherlands)

    Kayigamba, Felix R.; van Santen, Daniëla; Bakker, Mirjam I.; Lammers, Judith; Mugisha, Veronicah; Bagiruwigize, Emmanuel; de Naeyer, Ludwig; Asiimwe, Anita; Schim van der Loeff, Maarten F.

    2016-01-01

    Provider-initiated HIV testing and counselling (PITC) is promoted as a means to increase HIV case finding. We assessed the effectiveness of PITC to increase HIV testing rate and HIV case finding among outpatients in Rwandan health facilities (HF). PITC was introduced in six HFs in 2009-2010. HIV

  2. A Robust Semi-Parametric Test for Detecting Trait-Dependent Diversification.

    Science.gov (United States)

    Rabosky, Daniel L; Huang, Huateng

    2016-03-01

    Rates of species diversification vary widely across the tree of life and there is considerable interest in identifying organismal traits that correlate with rates of speciation and extinction. However, it has been challenging to develop methodological frameworks for testing hypotheses about trait-dependent diversification that are robust to phylogenetic pseudoreplication and to directionally biased rates of character change. We describe a semi-parametric test for trait-dependent diversification that explicitly requires replicated associations between character states and diversification rates to detect effects. To use the method, diversification rates are reconstructed across a phylogenetic tree with no consideration of character states. A test statistic is then computed to measure the association between species-level traits and the corresponding diversification rate estimates at the tips of the tree. The empirical value of the test statistic is compared to a null distribution that is generated by structured permutations of evolutionary rates across the phylogeny. The test is applicable to binary discrete characters as well as continuous-valued traits and can accommodate extremely sparse sampling of character states at the tips of the tree. We apply the test to several empirical data sets and demonstrate that the method has acceptable Type I error rates. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
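
    The sketch below illustrates the general shape of such a tip-level permutation test, assuming hypothetical per-species rate estimates and binary states; it uses a simple label shuffle rather than the phylogenetically structured permutations required by the actual method, so it is for illustration only.

```python
import numpy as np

def trait_rate_permutation_test(tip_rates, tip_states, n_perm=10000, seed=0):
    """Toy version of a tip-level association test between character states and
    diversification-rate estimates.

    tip_rates  : per-species diversification-rate estimates (from any upstream method)
    tip_states : binary character state (0/1) for the same species
    Returns the observed difference in mean rate between states and a two-sided
    permutation p-value.  NOTE: the simple label shuffle used here ignores
    phylogenetic pseudoreplication and stands in for the structured permutations
    described in the abstract.
    """
    rng = np.random.default_rng(seed)
    tip_rates = np.asarray(tip_rates, float)
    tip_states = np.asarray(tip_states, int)

    def stat(states):
        return tip_rates[states == 1].mean() - tip_rates[states == 0].mean()

    observed = stat(tip_states)
    null = np.array([stat(rng.permutation(tip_states)) for _ in range(n_perm)])
    p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
    return observed, p_value

# Hypothetical data: 40 species, state-1 lineages diversify slightly faster
rng = np.random.default_rng(1)
states = rng.integers(0, 2, 40)
rates = rng.lognormal(mean=np.where(states == 1, 0.3, 0.0), sigma=0.4)
print(trait_rate_permutation_test(rates, states))
```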

  3. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    Science.gov (United States)

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence-based medicine, research is an essential part of the medical profession, whether clinical or academic. A research methodology workshop intends to help participants who are new to the research field as well as those who are already doing empirical research. The present study was conducted to assess the changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four-day research methodology workshop was planned. The participants were given a structured questionnaire (pre-test) containing 20 multiple choice questions (Q1-Q20) related to the topics to be covered in the research methodology workshop before the commencement of the workshop, and a similar post-test questionnaire after the completion of the workshop. The mean values of pre- and post-test scores were calculated and the results were analyzed and compared. Out of the total 153 delegates, 45 (29%) were males and 108 (71%) were females. 92 (60%) participants consented to fill the pre-test questionnaire and 68 (44%) filled the post-test questionnaire. The mean pre-test and post-test scores at 95% confidence interval were 7.62 (SD ±3.220) and 9.66 (SD ±2.477), respectively. The differences were found to be significant using a paired-sample t-test (P research methodology workshops. Participatory research methodology workshops are good methods of imparting knowledge; the long-term effects need to be evaluated.
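
    A minimal sketch of the pre/post comparison reported above is given below, using a paired-sample t-test on fabricated scores for illustration; the study's own score data are not reproduced here, and only delegates who completed both questionnaires could be paired in this way.

```python
import numpy as np
from scipy import stats

# Hypothetical pre- and post-test scores (out of 20) for the same ten participants
pre = np.array([6, 8, 5, 9, 7, 10, 4, 8, 6, 7], dtype=float)
post = np.array([9, 10, 7, 11, 9, 12, 6, 10, 9, 9], dtype=float)

t_stat, p_value = stats.ttest_rel(post, pre)   # paired-sample t-test
print(f"mean pre = {pre.mean():.2f}, mean post = {post.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```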

  4. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions: o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set? o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture? o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach? o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from using the Target Organ System Approach compare to the benefits available from the current HCN-based approach?

  5. Loading rate and test temperature effects on fracture of in situ niobium silicide-niobium composites

    International Nuclear Information System (INIS)

    Rigney, J.D.; Lewandowski, J.J.

    1996-01-01

    Arc cast, extruded, and heat-treated in situ composites of niobium silicide (Nb5Si3) intermetallic with niobium phases (primary, Nbp, and secondary, Nbs) exhibited high fracture resistance in comparison to monolithic Nb5Si3. In toughness tests conducted at 298 K and slow applied loading rates, the fracture process proceeded by the microcracking of the Nb5Si3 and plastic deformation of the Nbp and Nbs phases, producing resistance-curve behavior and toughnesses of 28 MPa√m with damage zone lengths less than 500 μm. The effects of changes in the Nbp yield strength and fracture behavior on the measured toughnesses were investigated by varying the loading rates during fracture tests at both 77 and 298 K. Quantitative fractography was utilized to completely characterize each fracture surface created at 298 K in order to determine the type of fracture mode (i.e., dimpled, cleavage) exhibited by the Nbp. Specimens tested at either higher loading rates or lower test temperatures consistently exhibited a greater amount of cleavage fracture in the Nbp, while the Nbs always remained ductile. However, the fracture toughness values determined from experiments spanning six orders of magnitude in loading rate at 298 and 77 K exhibited little variation, even under conditions when the majority of Nbp phases failed by cleavage at 77 K. The changes in fracture mode with increasing loading rate and/or decreasing test temperature and their effects on fracture toughness are rationalized by comparison to existing theoretical models

  6. Respirator Filter Efficiency Testing Against Particulate and Biological Aerosols Under Moderate to High Flow Rates

    Science.gov (United States)

    2006-08-01

    flow rate through the test filter. The flow rate was measured using a mass flow meter (Series 4000, TSI, Shoreview, MN). Several modifications were made...operating conditions. This included assessing the effect of non-isokinetic sampling, flow calibrations, and characterization of the challenge...sampling bias on the measured penetrations due to the non-isokinetic sampling downstream. 3.3.2.2 System Characterization. Shakedown tests were

  7. 78 FR 44459 - Rate Regulation Reforms

    Science.gov (United States)

    2013-07-24

    ... the interest rate. A simple multiplication of the nominal rate by the portion of the year covered by... makes technical changes to the full and simplified rate procedures; changes the interest rate that... allocation methodology for cross-over traffic. Part IV sets out the change in the interest rate carriers must...

  8. Combining several thermal indices to generate a unique heat comfort assessment methodology

    Directory of Open Access Journals (Sweden)

    Wissam EL Hachem

    2015-11-01

    Full Text Available Purpose: The proposed methodology hopes to provide a systematic multi-disciplinary approach to assess the thermal environment while minimizing unneeded efforts. Design/methodology/approach: Different factors affect the perception of the human thermal experience: metabolic rate (biology), surrounding temperatures (heat balance and environmental factors) and cognitive treatment (physiology). This paper proposes a combination of different multidisciplinary variables to generate a unique heat comfort assessment methodology. The variables at stake are physiological, biological, and environmental. Our own heat analysis is thoroughly presented and all relevant equations are described. Findings: Most companies are oblivious to the potential dangers of heat stress accidents and thus to methods to monitor and prevent them. This methodology enables the company or the concerned individual to conduct a preliminary assessment with minimal resources and time wasted in unnecessary steps, whilst providing a guideline for a detailed study with minimal error rates if needed. Moreover, thermal comfort is an integral part of sound ergonomics practices, which in turn are decisive for the success of any lean six sigma initiative. Research limitations/implications: This methodology requires several full implementations to finalize its design. Originality/value: Most used heat comfort models are inherently uncertain and tiresome to apply. An extensive literature review confirms the need for a uniform assessment methodology that combines the different thermal comfort models, such as the Fanger comfort model (PMV, PPD) and WBGT, since high error rates coupled with tiresome calculations often hinder the thermal assessment process.
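
    As a small illustration of one of the indices mentioned above, the sketch below computes the wet-bulb globe temperature (WBGT) with its standard weighting coefficients; the readings are hypothetical, and the paper's full methodology also folds in PMV/PPD and metabolic-rate terms.

```python
def wbgt_outdoor(t_natural_wet_bulb, t_globe, t_dry_bulb):
    """Outdoor (solar load) WBGT in deg C, standard 0.7/0.2/0.1 weighting."""
    return 0.7 * t_natural_wet_bulb + 0.2 * t_globe + 0.1 * t_dry_bulb

def wbgt_indoor(t_natural_wet_bulb, t_globe):
    """Indoor / no-solar-load WBGT in deg C, standard 0.7/0.3 weighting."""
    return 0.7 * t_natural_wet_bulb + 0.3 * t_globe

print(wbgt_outdoor(25.0, 40.0, 32.0))   # hypothetical readings -> 28.7 deg C
```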

  9. Plutonium Discharge Rates and Spent Nuclear Fuel Inventory Estimates for Nuclear Reactors Worldwide

    Energy Technology Data Exchange (ETDEWEB)

    Brian K. Castle; Shauna A. Hoiland; Richard A. Rankin; James W. Sterbentz

    2012-09-01

    This report presents a preliminary survey and analysis of the five primary types of commercial nuclear power reactors currently in use around the world. Plutonium mass discharge rates from the reactors’ spent fuel at reload are estimated based on a simple methodology that is able to use limited reactor burnup and operational characteristics collected from a variety of public domain sources. Selected commercial reactor operating and nuclear core characteristics are also given for each reactor type. In addition to the worldwide commercial reactors survey, a materials test reactor survey was conducted to identify reactors of this type with a significant core power rating. Over 100 material or research reactors with a core power rating >1 MW fall into this category. Fuel characteristics and spent fuel inventories for these material test reactors are also provided herein.

  10. The PWR integrated leak rate test, a review of experiences and results

    International Nuclear Information System (INIS)

    Keogh, P.

    1985-01-01

    The paper reviews the Integrated Leak Rate Test (ILRT) as it has been carried out in the USA and as reported in papers in European countries. The test procedures are critically appraised and recommendations are given for modifications to them. The valves used in a PWR are identified as a main source of leaks and possibilities for improvement are discussed. The use of a part-pressure test and its limitations are considered. A part-pressure test cannot give the same assurance as a full-pressure test but may be useful for the identification of gross leaks. Secondary effects such as weather and the use of the van der Waals equation are considered and are found to be not important for concrete containments. (orig.)

  11. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
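
    As an illustration of the kind of trend testing discussed above, the sketch below applies the standard Laplace trend test for a homogeneous Poisson process to a set of hypothetical failure times; it is not the authors' specific STA formulation.

```python
import math

def laplace_trend_statistic(failure_times, observation_period):
    """Laplace test for trend in event times of a Poisson process.
    U > 0 suggests an increasing failure rate, U < 0 a decreasing one.
    Under the no-trend null hypothesis U is approximately standard normal."""
    n = len(failure_times)
    mean_time = sum(failure_times) / n
    return (mean_time - observation_period / 2.0) / (observation_period / math.sqrt(12.0 * n))

# Hypothetical failure times (years) observed over a 10-year window
failures = [1.2, 2.8, 4.1, 6.5, 7.0, 8.3, 9.1, 9.6]
u = laplace_trend_statistic(failures, observation_period=10.0)
print(f"Laplace trend statistic U = {u:.2f}")
```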

  12. Use and Application of the SADRWMS Methodology and SAFRAN Tool on the Thailand Institute of Nuclear Technology (TINT) Radioactive Waste Management Facility. Test Case Results. 05 October 2011

    International Nuclear Information System (INIS)

    2015-01-01

    The purpose of this document is to describe the working procedure of the test case and to provide feedback on the application of the methodology described in DS284 and the SAFRAN tool. This report documents how the test case was performed, describes how the methodology and software tool were applied, and provides feedback on the use and application of the SAFRAN Tool. The aim of this document is to address the key elements of the safety assessment and to demonstrate their principal contents and roles within the overall context of the safety case. This is done with particular emphasis on investigating the role of the SAFRAN Tool in developing a safety case for facilities similar to the TINT Facility. It is intended that this report will be the first of a series of complementary safety reports illustrating the use and application of the methodology prescribed in DS284 and the application of the SAFRAN tool to a range of predisposal radioactive waste management activities.

  13. A Comparison of Validity Rates between Paper-and-Pencil and Computerized Testing with the MMPI-2

    Science.gov (United States)

    Blazek, Nicole L.; Forbey, Johnathan D.

    2011-01-01

    Although the use of computerized testing in psychopathology assessment has increased in recent years, limited research has examined the impact of this format in terms of potential differences in test validity rates. The current study explores potential differences in the rates of valid and invalid Minnesota Multiphasic Personality Inventory--2…

  14. Size effects in fcc crystals during the high rate compression test

    International Nuclear Information System (INIS)

    Yaghoobi, Mohammadreza; Voyiadjis, George Z.

    2016-01-01

    The present work studies the different mechanisms of size effects in fcc metallic samples of confined volumes during high rate compression tests using large scale atomistic simulation. Different mechanisms of size effects, including dislocation starvation, source exhaustion, and the dislocation source length effect, are investigated for pillars with different sizes. The results show that the controlling mechanisms of size effects depend only on the pillar size and not on the value of applied strain. Dislocation starvation is the governing mechanism for very small pillars, i.e. pillars with diameters less than 30 nm. As the pillar size increases, the dislocation exhaustion mechanism becomes active and there are no more source-limited activations. Next, the average dislocation source length is obtained and compared for pillars with different sizes. The results show that in the case of high rate deformations, the source length does not depend on the sample size, and the related size effects mechanisms are no longer active. Also, in the case of high rate deformations, there are no size effects for pristine pillars with diameters larger than 135 nm. In other words, increasing the strain rate decreases the pillar size at which there are no more size effects in the absence of strain gradient. The governing mechanisms of plastic deformation in high strain rate experiments are also different from those of the quasi-static tests. First, the diameter at which dislocation nucleation at the free surface becomes the dominant mechanism changes from around 200 nm to 30 nm. Next, in the case of the pillars with larger diameters, the plastic deformation is governed by cross-slip instead of the operation of truncated dislocation sources, which is dominant at slower rates of deformation. In order to study the effects of the pillar's initial structure on the controlling mechanism of size effects, an initial loading and unloading procedure is conducted on some samples prior to the

  15. Standard Test Method for Determining Thermal Neutron Reaction Rates and Thermal Neutron Fluence Rates by Radioactivation Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 The purpose of this test method is to define a general procedure for determining an unknown thermal-neutron fluence rate by neutron activation techniques. It is not practicable to describe completely a technique applicable to the large number of experimental situations that require the measurement of a thermal-neutron fluence rate. Therefore, this method is presented so that the user may adapt to his particular situation the fundamental procedures of the following techniques. 1.1.1 Radiometric counting technique using pure cobalt, pure gold, pure indium, cobalt-aluminum alloy, gold-aluminum alloy, or indium-aluminum alloy. 1.1.2 Standard comparison technique using pure gold, or gold-aluminum alloy, and 1.1.3 Secondary standard comparison techniques using pure indium, indium-aluminum alloy, pure dysprosium, or dysprosium-aluminum alloy. 1.2 The techniques presented are limited to measurements at room temperatures. However, special problems when making thermal-neutron fluence rate measurements in high-...
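
    The sketch below illustrates the basic activation relation behind the radiometric counting technique listed above: a measured foil activity is corrected for decay and saturation and divided by the number of target atoms times the capture cross section. The foil data and corrections are simplified and the numerical values are illustrative only; the standard itself prescribes the detailed procedure.

```python
import math

N_A = 6.02214076e23   # Avogadro constant, 1/mol

def fluence_rate_from_activity(activity_bq, t_irr_s, t_decay_s,
                               half_life_s, foil_mass_g, atomic_mass, sigma_barn):
    """Return an approximate thermal-neutron fluence rate (n/cm^2/s) from a foil
    activation, ignoring self-shielding, epithermal and counting-efficiency
    corrections."""
    lam = math.log(2.0) / half_life_s
    n_atoms = foil_mass_g / atomic_mass * N_A
    sigma_cm2 = sigma_barn * 1e-24
    # correct the measured activity back to the end of irradiation, then to saturation
    a_end_of_irradiation = activity_bq * math.exp(lam * t_decay_s)
    a_saturation = a_end_of_irradiation / (1.0 - math.exp(-lam * t_irr_s))
    return a_saturation / (n_atoms * sigma_cm2)

# Hypothetical gold foil (Au-197 -> Au-198, T1/2 ~ 2.7 d, sigma ~ 98.7 b)
phi = fluence_rate_from_activity(activity_bq=5.0e4, t_irr_s=3600.0, t_decay_s=7200.0,
                                 half_life_s=2.7 * 86400.0, foil_mass_g=0.010,
                                 atomic_mass=197.0, sigma_barn=98.7)
print(f"thermal fluence rate ~ {phi:.3e} n/cm^2/s")
```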

  16. Methodological Approaches to Experimental Teaching of Mathematics to University Students

    Directory of Open Access Journals (Sweden)

    Nikolay I.

    2018-03-01

    Full Text Available Introduction: the article imparts the authors’ thoughts on a new teaching methodology for mathematical education in universities. The aim of the study is to substantiate the efficiency of the comprehensive usage of mathematical electronic courses, computer tests, original textbooks and methodologies when teaching mathematics to future agrarian engineers. The authors consider this implementation a unified educational process. Materials and Methods: the synthesis of international and domestic pedagogical experience of teaching students in university was used, together with the following methods of empirical research: pedagogical experiment, pedagogical measurements and experimental teaching of mathematics. The authors applied the methodology of revealing interdisciplinary links on the continuum of mathematical problems using the key examples and exercises. Results: the online course “Mathematics” was designed and developed on the platform of the Learning Management System Moodle. The article presents the results of test assignments assessing students’ intellectual abilities and an analysis of solutions of various types of mathematical problems by students. The pedagogical experiment substantiated the integrated selection of textbooks, online course and online tests using the methodology of determination of the key examples and exercises. Discussion and Conclusions: the analysis of the experimental work suggested that the new methodology is able to have a positive effect on the learning process. The learning programme determined the problem points for each student. The findings of this study have a number of important implications for future educational practice.

  17. LWR surveillance dosimetry improvement program: PSF metallurgical blind test results

    International Nuclear Information System (INIS)

    Kam, F.B.K.; Stallmann, F.W.; Guthrie, G.; McElroy, W.N.

    1985-01-01

    The ORR-PSF benchmark experiment was designed to simulate the surveillance capsule-pressure vessel configuration in power reactors and to test the validity of procedures which determine the radiation damage in the vessel from test results in the surveillance capsule. The PSF metallurgical blind test was initiated to give participants an opportunity to test their current embrittlement prediction methodologies. Experimental results were withheld from the participants except for the type of information which is normally contained in surveillance reports. Preliminary analysis of the PSF metallurgical blind test results shows that: (1) current prediction methodologies, as used by the PSF Blind Test participants, are adequate, falling within ±20 °C of the measured values for ΔNDT. None of the different methods is clearly superior; (2) the proposed revision of Reg. Guide 1.99 (Rev. 2) gives a better representation of the fluence and chemistry dependency of ΔNDT than the current version (Rev. 1); and (3) fluence rate effects can be seen but not quantified. Fluence spectral effects are too small to be detectable in this experiment. (orig.)

  18. Using the building energy simulation test (BESTEST) to evaluate CHENATH, the Nationwide House Energy Rating Scheme Simulation Engine

    Energy Technology Data Exchange (ETDEWEB)

    Delsante, A.E. [Commonwealth Scientific and Industrial Research Organisation (CSIRO), Highett, VIC (Australia). Div. of Building Construction and Engineering

    1995-12-31

    The Nationwide House Energy Rating Scheme (NatHERS) uses a simulation program as its reference tool to evaluate the energy demand of buildings. The software developed by the Commonwealth Scientific and Industrial Research Organisation (CSIRO), called CHENATH, is a significantly enhanced version of the CHEETAH simulation program. As part of the NatHERS development process, it was considered important to subject CHENATH to further testing. Two separate evaluation projects were undertaken. This paper describes one of these projects. CHENATH was compared with a reference set of eight internationally recognized simulation programs using the BESTEST methodology. Annual heating and cooling energy requirements were compared for a specified set of variations on a simple double-glazed building. Annual incident and transmitted solar radiation was also compared, for which CHENATH agreed very well with the reference set. It also agreed well for heating energy, but tended to over-predict cooling energy. This is largely because it controls an environmental temperature rather than the required air temperature. For the same reason CHENATH over-predicted heating and cooling demands. No major discrepancies were found that would suggest bugs in the program. (author). 4 tabs., 10 figs., 4 refs.

  19. 78 FR 41129 - Market Test of Experimental Product - International Merchandise Return Service-Non-Published Rates

    Science.gov (United States)

    2013-07-09

    ...--Non-Published Rates AGENCY: U.S. Postal Service\\TM\\. ACTION: Notice. SUMMARY: The Postal Service hereby gives notice of a market test for International Merchandise Return Service--Non-Published Rates in... Return Service (IMRS) Non-published Rate (NPR) experimental product on August 15, 2013. The Postal...

  20. Small Scale Hydrocarbon Fire Test Concept

    Directory of Open Access Journals (Sweden)

    Joachim Søreng Bjørge

    2017-11-01

    Full Text Available In the oil and gas industry, hydrocarbon process equipment was previously often thermally insulated by applying insulation directly to the metal surface. Fire protective insulation was applied outside the thermal insulation. In some cases, severe corrosion attacks were observed due to ingress of humidity and condensation at cold surfaces. Introducing a 25 mm air gap to prevent wet thermal insulation and metal wall contact is expected to solve the corrosion issues. This improved insulation methodology does, however, require more space that may not be available when refurbishing older process plants. Relocating structural elements would introduce much hot work, which should be minimized in live plants. It is also costly. The aim of the present study is therefore to develop a test concept for testing fire resistance of equipment protected with only air-gap and thermal insulation, i.e., without the fire-protective insulation. The present work demonstrates a conceptual methodology for small scale fire testing of mockups resembling a section of a distillation column. The mockups were exposed to a small-scale propane flame in a test configuration where the flow rate and the flame zone were optimized to give heat flux levels in the range 250–350 kW/m2. Results are presented for a mockup resembling a 16 mm thick distillation column steel wall. It is demonstrated that the modern distance insulation in combination with the heat capacity of the column wall indicates 30+ minutes fire resistance. The results show that this methodology has great potentials for low cost fire testing of other configurations, and it may serve as a set-up for product development.

  1. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
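
    The sketch below illustrates the model-acceptance idea in its simplest form (not the authors' exact formulation): a Bayes factor compares the evidence that validation data are centred on the model prediction against a diffuse alternative, assuming a known observation noise sigma and an assumed prior width tau under the alternative.

```python
import numpy as np
from scipy import stats

def bayes_factor_model_acceptance(observations, model_prediction, sigma, tau):
    """B01 = p(data | H0: mean equals the model prediction) /
             p(data | H1: mean ~ N(model_prediction, tau^2)),
    for normally distributed observations with known noise sigma.
    Working with the sample mean is sufficient here: the factors that depend only
    on within-sample scatter cancel in the ratio."""
    obs = np.asarray(observations, dtype=float)
    n, xbar = len(obs), obs.mean()
    evidence_h0 = stats.norm.pdf(xbar, loc=model_prediction, scale=sigma / np.sqrt(n))
    evidence_h1 = stats.norm.pdf(xbar, loc=model_prediction,
                                 scale=np.sqrt(sigma**2 / n + tau**2))
    return evidence_h0 / evidence_h1          # B01 > 1 favours accepting the model

# Hypothetical validation data against a model that predicts a response of 12.0
data = [11.8, 12.3, 11.9, 12.1, 12.4]
print(bayes_factor_model_acceptance(data, model_prediction=12.0, sigma=0.3, tau=2.0))
```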

  2. Mean Abnormal Result Rate: Proof of Concept of a New Metric for Benchmarking Selectivity in Laboratory Test Ordering.

    Science.gov (United States)

    Naugler, Christopher T; Guo, Maggie

    2016-04-01

    There is a need to develop and validate new metrics to assess the appropriateness of laboratory test requests. The mean abnormal result rate (MARR) is a proposed measure of ordering selectivity, the premise being that higher mean abnormal rates represent more selective test ordering. As a validation of this metric, we compared the abnormal rate of lab tests with the number of tests ordered on the same requisition. We hypothesized that requisitions with larger numbers of requested tests represent less selective test ordering and therefore would have a lower overall abnormal rate. We examined 3,864,083 tests ordered on 451,895 requisitions and found that the MARR decreased from about 25% if one test was ordered to about 7% if nine or more tests were ordered, consistent with less selectivity when more tests were ordered. We then examined the MARR for community-based testing for 1,340 family physicians and found both a wide variation in MARR as well as an inverse relationship between the total tests ordered per year per physician and the physician-specific MARR. The proposed metric represents a new utilization metric for benchmarking the relative selectivity of test orders among physicians. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
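
    A minimal sketch of the metric is given below: results are grouped by requisition to relate the abnormal-result rate to the number of tests ordered together, and by ordering physician to obtain a physician-specific MARR. The table layout and column names are assumptions for illustration, not the study's data model.

```python
import pandas as pd

# Toy result-level data: one row per test result, flagged abnormal (1) or normal (0)
results = pd.DataFrame({
    "requisition_id": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
    "physician_id":   ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
    "abnormal":       [1, 0, 0, 1, 0, 0, 0, 1, 0, 1],
})

# Abnormal rate as a function of how many tests were ordered on the same requisition
per_req = results.groupby("requisition_id").agg(
    n_tests=("abnormal", "size"), abnormal_rate=("abnormal", "mean"))
print(per_req.groupby("n_tests")["abnormal_rate"].mean())

# Physician-specific mean abnormal result rate (MARR)
print(results.groupby("physician_id")["abnormal"].mean().rename("MARR"))
```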

  3. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    Science.gov (United States)

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  4. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  5. Safety assessment of a vault-based disposal facility using the ISAM methodology

    International Nuclear Information System (INIS)

    Kelly, E.; Kim, C.-L.; Lietava, P.; Little, R.; Simon, I.

    2002-01-01

    As part of the IAEA's Co-ordinated Research Project (CRP) on Improving Long-Term Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ISAM), three example cases were developed. The aim was to test the ISAM safety assessment methodology using data that were as realistic as possible. One of the Test Cases, the Vault Test Case (VTC), related to the disposal of low level radioactive waste (LLW) to a hypothetical facility comprising a set of above-surface vaults. This paper uses the various steps of the ISAM safety assessment methodology to describe the work undertaken by ISAM participants in developing the VTC and provides some general conclusions that can be drawn from the findings of their work. (author)

  6. Methodological individualism as opposed to methodological holism. History, relevancy and the implications of the (insoluble? debate on the explanatory capacity and scientific status of sociocultural anthropology

    Directory of Open Access Journals (Sweden)

    Nina Kulenović

    2016-02-01

    Full Text Available The paper is part of wider research into the status of explanation in the debate on the scientific status of anthropology – wherein one of the key assumptions is that there is a strong correlation between theoretical and methodological structures which would make them inseparable, and that explanation, or explanatory potential, is the point of convergence which can be used to test for the possibility of separating theoretical and methodological structures in the first place. To test this idea, a line of debate between methodological holism and methodological individualism – one of the longest running and most complex debates in the social sciences and humanities – was considered. The paper highlights the historical background of the debate, as well as its relevance and implications for the controversy about the explanatory capacity and scientific status of sociocultural anthropology.

  7. Use of cesium-137 methodology in the evaluation of superficial erosive processes

    International Nuclear Information System (INIS)

    Andrello, Avacir Casanova; Appoloni, Carlos Roberto; Guimaraes, Maria de Fatima; Nascimento Filho, Virgilio Franco do

    2003-01-01

    Superficial erosion is one of the main soil degradation agents, and erosion rate estimation for different edaphic and climatic conditions with the conventional models, such as USLE and RUSLE, is expensive and time-consuming. The use of the anthropogenic radionuclide cesium-137 is a new methodology that has been much studied, and its application in soil erosion evaluation has grown in countries such as the USA, UK, Australia and others. A brief description of this methodology is presented, together with the development of the equations utilized for the quantification of erosion rates through the cesium-137 measurements. Two watersheds studied in Brazil have shown that the cesium-137 methodology is practicable and coherent with field surveys for applications in erosion studies. (author)

  8. Bit-error-rate testing of fiber optic data links for MMIC-based phased array antennas

    Science.gov (United States)

    Shalkhauser, K. A.; Kunath, R. R.; Daryoush, A. S.

    1990-01-01

    The measured bit-error-rate (BER) performance of a fiber optic data link to be used in satellite communications systems is presented and discussed. In the testing, the link was measured for its ability to carry high burst rate, serial-minimum shift keyed (SMSK) digital data similar to those used in actual space communications systems. The fiber optic data link, as part of a dual-segment injection-locked RF fiber optic link system, offers a means to distribute these signals to the many radiating elements of a phased array antenna. Test procedures, experimental arrangements, and test results are presented.

  9. Bit error rate testing of fiber optic data links for MMIC-based phased array antennas

    Science.gov (United States)

    Shalkhauser, K. A.; Kunath, R. R.; Daryoush, A. S.

    1990-01-01

    The measured bit-error-rate (BER) performance of a fiber optic data link to be used in satellite communications systems is presented and discussed. In the testing, the link was measured for its ability to carry high burst rate, serial-minimum shift keyed (SMSK) digital data similar to those used in actual space communications systems. The fiber optic data link, as part of a dual-segment injection-locked RF fiber optic link system, offers a means to distribute these signals to the many radiating elements of a phased array antenna. Test procedures, experimental arrangements, and test results are presented.

  10. Methods for measuring specific rates of mercury methylation and degradation and their use in determining factors controlling net rates of mercury methylation

    International Nuclear Information System (INIS)

    Ramlal, P.S.; Rudd, J.W.M.; Hecky, R.E.

    1986-01-01

    A method was developed to estimate specific rates of demethylation of methyl mercury in aquatic samples by measuring the volatile 14C end products of 14CH3HgI demethylation. This method was used in conjunction with a 203Hg2+ radiochemical method which determines specific rates of mercury methylation. Together, these methods enabled us to examine some factors controlling the net rate of mercury methylation. The methodologies were field tested, using lake sediment samples from a recently flooded reservoir in the Southern Indian Lake system which had developed a mercury contamination problem in fish. Ratios of the specific rates of methylation/demethylation were calculated. The highest ratios of methylation/demethylation occurred in the flooded shorelines of Southern Indian Lake. These results provide an explanation for the observed increases in the methyl mercury concentrations in fish after flooding

  11. Robust PV Degradation Methodology and Application

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kimball, Greg [SunPower; Anderson, Mike [SunPower

    2017-11-15

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of PV systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this manuscript, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year (YOY) rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.
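
    The sketch below illustrates the year-over-year idea on a synthetic daily performance-index series with an assumed true degradation rate: each point is compared with the point one year earlier and the median of the yearly ratios is taken, which is what makes the estimate robust to seasonality and outliers. It is a toy version, not the authors' full pipeline, which also includes clear-sky normalization and filtering.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2015-01-01", periods=6 * 365, freq="D")
true_rate = -0.008                                         # assumed -0.8 %/yr
perf_index = (1 + true_rate) ** (np.arange(len(days)) / 365.0)
perf_index *= 1 + 0.02 * np.sin(2 * np.pi * np.arange(len(days)) / 365.0)  # seasonality
perf_index += rng.normal(0, 0.01, len(days))               # measurement noise
series = pd.Series(perf_index, index=days)

shifted = series.shift(365)                                # value exactly one year earlier
yoy_rates = (series / shifted - 1.0).dropna()              # fractional change per year
print(f"median YOY degradation rate: {yoy_rates.median() * 100:.2f} %/yr")
```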

  12. Test Review: Autism Spectrum Rating Scales

    Science.gov (United States)

    Simek, Amber N.; Wahlberg, Andrea C.

    2011-01-01

    This article reviews Autism Spectrum Rating Scales (ASRS) which are designed to measure behaviors in children between the ages of 2 and 18 that are associated with disorders on the autism spectrum as rated by parents/caregivers and/or teachers. The rating scales include items related to behaviors associated with Autism, Asperger's Disorder, and…

  13. A generic semi-implicit coupling methodology for use in RELAP5-3D(c)

    International Nuclear Information System (INIS)

    Weaver, W.L.; Tomlinson, E.T.; Aumiller, D.L.

    2002-01-01

    A generic semi-implicit coupling methodology has been developed and implemented in the RELAP5-3D (c) computer program. This methodology allows RELAP5-3D (c) to be used with other computer programs to perform integrated analyses of nuclear power reactor systems and related experimental facilities. The coupling methodology potentially allows different programs to be used to model different portions of the system. The programs are chosen based on their capability to model the phenomena that are important in the various portions of the system being simulated, and they may use different numbers of conservation equations to model fluid flow in their respective solution domains. The methodology was demonstrated using a test case in which the test geometry was divided into two parts, each of which was solved as a RELAP5-3D (c) simulation. This test problem exercised all of the semi-implicit coupling features that were implemented in RELAP5-3D (c). The results of this verification test case show that the semi-implicit coupling methodology produces the same answer as the simulation of the test system as a single process

  14. Methodological Behaviorism from the Standpoint of a Radical Behaviorist.

    Science.gov (United States)

    Moore, J

    2013-01-01

    Methodological behaviorism is the name for a prescriptive orientation to psychological science. Its first and original feature is that the terms and concepts deployed in psychological theories and explanations should be based on observable stimuli and behavior. I argue that the interpretation of the phrase "based on" has changed over the years because of the influence of operationism. Its second feature, which developed after the first and is prominent in contemporary psychology, is that research should emphasize formal testing of a theory that involves mediating theoretical entities from a nonbehavioral dimension according to the hypothetico-deductive method. I argue that for contemporary methodological behaviorism, explanations of the behavior of both participants and scientists appeal to the mediating entities as mental causes, if only indirectly. In contrast to methodological behaviorism is the radical behaviorism of B. F. Skinner. Unlike methodological behaviorism, radical behaviorism conceives of verbal behavior in terms of an operant process that involves antecedent circumstances and reinforcing consequences, rather than in terms of a nonbehavioral process that involves reference and symbolism. In addition, radical behaviorism recognizes private behavioral events and subscribes to research and explanatory practices that do not include testing hypotheses about supposed mediating entities from another dimension. I conclude that methodological behaviorism is actually closer to mentalism than to Skinner's radical behaviorism.

  15. Prescriptive Training Courseware: IS-Design Methodology

    Directory of Open Access Journals (Sweden)

    Elspeth McKay

    2018-03-01

    Full Text Available Information systems (IS) research is found in many diverse communities. This paper explores the human dimension of human-computer interaction (HCI) to present IS-design practice in the light of courseware development. Assumptions are made that online courseware provides the perfect solution for maintaining a knowledgeable, well-skilled workforce. However, empirical investigations into the effectiveness of information technology (IT)-induced training solutions are scarce. Contemporary research concentrates on information communications technology (ICT) training tools without considering their effectiveness. This paper offers a prescriptive IS-design methodology for managing the requirements for efficient and effective courseware development. To develop the methodology, we examined the main instructional design (ID) factors that affect the design of IT-induced training programs. We also examined the tension between maintaining a well-skilled workforce and effective instructional systems design (ISD) practice by probing the current ID models used by courseware developers since 1990. An empirical research project, which utilized this IS-design methodology, investigated the effectiveness of using IT to train government employees in introductory ethics; this was a study that operationalized the interactive effect of cognitive preference and instructional format on training performance outcomes. The data were analysed using Rasch item response theory (IRT), which models the discrimination of people’s performance relative to each other’s performance and the test items’ difficulty relative to each test item on the same logit scale. The findings revealed that IS training solutions developed using this IS-design methodology can be adapted to provide trainees with their preferred instructional mode and facilitate cost-effective eTraining outcomes.

  16. Testing methodology of diamond composite inserts to be used in the drilling of petroleum wells; Metodologia de testes de insertos compositos diamantados a serem usados na perfuracao de pocos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Bobrovnitchii, G.S.; Filgueira, M.; Skury, A.L.D.; Tardim, R.C. [Universidade Estadual do Norte Fluminense (UENF), Campos dos Goytacazes, RJ (Brazil)], e-mail: rtardim@terra.com.br

    2006-07-01

    The useful life of the inserts used in the cutters of drill bits for the perforation of oil wells determines the quality of the perforation as well as the productivity. Therefore, research on insert wear is carried out with the objective of predicting the most important properties of the inserts. Because UENF is developing processes for sintering synthetic-diamond-based composites, it is of interest to define a testing methodology for the resulting inserts. The proposed methodology is based on the evaluation of the wear suffered by the sample. For this purpose, a microprocessor-controlled 'Abrasimeter', model AB800-E, manufactured by the Contenco Company, was used. The instrument capacity is 1.36 kVA; the axial load applied to the cutter is up to 50 kgf; the table rotation speed is 20 rpm; the traverse speed of the tool in the radial direction is 2 m/min; and the dimensions of the granite block are D = 808 mm, d = 484 mm, h = 50 mm. The results obtained show that the proposed methodology can be used for the evaluation of the cutter inserts applied in perforation drills. (author)

  17. Standard Test Method for Measuring Heat Transfer Rate Using a Thin-Skin Calorimeter

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method covers the design and use of a thin metallic calorimeter for measuring heat transfer rate (also called heat flux). Thermocouples are attached to the unexposed surface of the calorimeter. A one-dimensional heat flow analysis is used for calculating the heat transfer rate from the temperature measurements. Applications include aerodynamic heating, laser and radiation power measurements, and fire safety testing. 1.2 Advantages 1.2.1 Simplicity of Construction: The calorimeter may be constructed from a number of materials. The size and shape can often be made to match the actual application. Thermocouples may be attached to the metal by spot, electron beam, or laser welding. 1.2.2 Heat transfer rate distributions may be obtained if metals with low thermal conductivity, such as some stainless steels, are used. 1.2.3 The calorimeters can be fabricated with smooth surfaces, without insulators or plugs and the attendant temperature discontinuities, to provide more realistic flow conditions for ...
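
    For a thermally thin skin, the one-dimensional heat flow analysis mentioned above reduces to an energy balance in which the absorbed heat flux equals the rate of sensible heating of the metal, q = rho * c * delta * dT/dt. The sketch below shows that calculation; the material properties and temperature trace are illustrative assumptions, not data from the standard.

        import numpy as np

        def thin_skin_heat_flux(time_s, temp_K, density, specific_heat, thickness):
            # q = rho * c * delta * dT/dt, using the back-face temperature rise rate.
            dTdt = np.gradient(temp_K, time_s)
            return density * specific_heat * thickness * dTdt

        # Illustrative example: a 0.5 mm stainless-steel skin heating at about 20 K/s.
        t = np.linspace(0.0, 2.0, 21)
        T = 300.0 + 20.0 * t
        q = thin_skin_heat_flux(t, T, density=7900.0, specific_heat=500.0, thickness=0.5e-3)
        print(f"mean heat flux ~ {q.mean() / 1000:.1f} kW/m^2")  # ~ 39.5 kW/m^2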

  18. Calorimeter measures high nuclear heating rates and their gradients across a reactor test hole

    Science.gov (United States)

    Burwell, D.; Coombe, J. R.; Mc Bride, J.

    1970-01-01

    Pedestal-type calorimeter measures gamma-ray heating rates from 0.5 to 7.0 watts per gram of aluminum. Nuclear heating rate is a function of the cylinder temperature change, measured by four chromel-alumel thermocouples attached to the calorimeter, and the known thermal conductivity of the tested material.

  19. Measurement test on creep strain rate of uranium-zirconium solid solutions

    International Nuclear Information System (INIS)

    Ogata, Takanari; Akabori, Mitsuo; Ogawa, Toru

    1996-11-01

    In order to measure the creep strain rate of a small specimen of U-Zr solid solution, the authors proposed an estimation method based on the stress relaxation after compression. It was applied to measurements of the creep strain rate of a U-10wt%Zr specimen in the temperature range of 757 to 911°C. It may be concluded that the proposed method is valid, provided that the strain is within the appropriate range and that a sufficient amount of load decrement is observed. The creep rate obtained for the U-10wt%Zr alloy was significantly smaller than the experimental data for pure U metal and the evaluated data for U-Pu-Zr alloy. However, more careful measurement is desired in the future, since the present data are thought to be influenced by precipitates included in the specimen. (author)

  20. A bench-scale biotreatability methodology to evaluate field bioremediation

    International Nuclear Information System (INIS)

    Saberiyan, A.G.; MacPherson, J.R. Jr.; Moore, R.; Pruess, A.J.; Andrilenas, J.S.

    1995-01-01

    A bench-scale biotreatability methodology was designed to assess field bioremediation of petroleum contaminated soil samples. This methodology was performed successfully on soil samples from more than 40 sites. The methodology is composed of two phases, characterization and experimentation. The first phase is physical, chemical, and biological characterization of the contaminated soil sample. This phase determines soil parameters, contaminant type, presence of indigenous contaminant-degrading bacteria, and bacterial population size. The second phase, experimentation, consists of a respirometry test to measure the growth of microbes indirectly (via generation of CO2) and the consumption of their food source directly (via contaminant loss). Based on a Monod kinetic analysis, the half-life of a contaminant can be calculated. Abiotic losses are accounted for based on a control test. The contaminant molecular structure is used to generate a stoichiometric equation. The stoichiometric equation yields a theoretical ratio for mg of contaminant degraded per mg of CO2 produced. Data collected from the respirometry test are compared to theoretical values to evaluate bioremediation feasibility.
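
    Two of the calculations described above are easy to sketch: the theoretical mass ratio of contaminant degraded per unit of CO2 produced (from the stoichiometry of complete mineralization) and a half-life from a first-order simplification of the Monod model. The fuel, data and numbers below are hypothetical and only illustrate the arithmetic.

        import numpy as np

        # Stoichiometric ratio, assuming complete mineralization of hexadecane (C16H34)
        # to CO2 and water and neglecting carbon assimilated into biomass.
        MW_C16H34 = 16 * 12.011 + 34 * 1.008       # ~226.4 g/mol
        MW_CO2 = 44.01
        ratio = MW_C16H34 / (16 * MW_CO2)          # mg contaminant degraded per mg CO2
        print(f"theoretical ratio: {ratio:.2f} mg contaminant per mg CO2")

        # First-order simplification of Monod kinetics (valid when S << Ks):
        # fit ln(C/C0) = -k t to the contaminant-loss data, then t_half = ln(2)/k.
        t = np.array([0.0, 7.0, 14.0, 28.0])         # days (illustrative)
        C = np.array([1000.0, 700.0, 480.0, 230.0])  # mg/kg (illustrative)
        k = -np.polyfit(t, np.log(C / C[0]), 1)[0]
        print(f"k = {k:.3f} 1/day, half-life = {np.log(2) / k:.1f} days")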

  1. Evaluation of LLTR Series II tests A-1A and A-1B test results

    International Nuclear Information System (INIS)

    Shoopak, B.F.; Amos, J.C.; Norvell, T.J.

    1980-03-01

    The standard methodology, with minor modifications, provides conservative yet realistic predictions of leaksite and other sodium system pressures in the LLTR Series II vessel and piping. The good agreement between predicted and measured pressures indicates that the TRANSWRAP/RELAP modeling developed from the Series I tests is applicable to larger scale units prototypical of the Clinch River steam generator design. Calculated sodium system pressures are sensitive to several modeling parameters including rupture disc modeling, acoustic velocity in the test vessel, and flow rate from the rupture tube. The acoustic velocity which produced the best agreement with leaksite pressures was calculated based on the shroud diameter and shroud wall thickness. The corresponding rupture tube discharge coefficient was that of the standard design methodology developed from Series I testing. As found in Series I testing, the Series II data suggest that the leading edge of the flow in the relief line is two phase for a single, double-ended guillotine tube rupture. The steam generator shroud acts as if it is relatively transparent to the transmission of radial pressures to the vessel wall. Slightly lower sodium system maximum pressures measured during Test A-1B compared to Test A-1A are attributed to premature failure (failure at a lower pressure) of the rupture disc in contact with the sodium for Test A-1B. The delay in failure of the second disc in Test A-1B, which was successfully modeled with TRANSWRAP, is attributed to the limited energy in the nitrogen injection.

  2. Tractor accelerated test on test rig

    Directory of Open Access Journals (Sweden)

    M. Mattetti

    2013-09-01

    Full Text Available The experimental tests performed to validate a tractor prototype before its production need a substantial financial and time commitment. The tests could be reduced using accelerated tests able to reproduce on the structural parts of the tractor, in a reduced time, the same damage produced during real-life use. These tests are usually performed by reproducing a particular harsh condition a defined number of times, for example by driving over a bumpy road on a test track so that the test can be carried out in any weather condition. Using these procedures, the loads applied to the tractor structure differ from those obtained during real use, with the risk of applying loads that are hard to find in reality. Recently it has been demonstrated how, using the methodologies designed for cars, it is possible to also expedite the structural tests for tractors. In particular, automotive proving grounds were recently successfully used with tractors to perform accelerated structural tests able to reproduce the real use of the machine with an acceleration factor higher than that obtained with the traditional methods. However, the acceleration factor obtained with a tractor on proving grounds is in any case limited by the reduced speed of tractors with respect to cars. In this context, the goal of the paper is to show the development of a methodology to perform an accelerated structural test on a medium power tractor using a 4-post test rig. In particular, several proving ground testing conditions were performed to measure the loads on the tractor. The loads obtained were then edited to remove the non-damaging portions of the signals, and finally the edited loads were reproduced on a 4-post test rig. The methodology proposed could be a valid alternative to the use of a proving ground for accelerated structural tests on tractors.

  3. Assessing importance and satisfaction judgments of intermodal work commuters with electronic survey methodology.

    Science.gov (United States)

    2013-09-01

    Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternatives to direct rating scales and have advantages in t...

  4. Selection of skin dose calculation methodologies

    International Nuclear Information System (INIS)

    Farrell, W.E.

    1987-01-01

    This paper reports that good health physics practice dictates that a dose assessment be performed for any significant skin contamination incident. There are, however, several methodologies that could be used, and while there is probably no single methodology that is proper for all cases of skin contamination, some are clearly more appropriate than others. This can be demonstrated by examining two of the more distinctly different options available for estimating skin dose: the calculational methods. The methods compiled by Healy require separate beta and gamma calculations. The beta calculational method is that derived by Loevinger, while the gamma dose is calculated from the equation for dose rate from an infinite plane source with an absorber between the source and the detector. Healy has provided these formulas in graphical form to facilitate rapid dose rate determinations at density thicknesses of 7 and 20 mg/cm2. These density thicknesses equate to the regulatory definition of the sensitive layer of the skin and a more arbitrary value to account for beta absorption in contaminated clothing.

  5. Insights from implementation of a risk management methodology

    International Nuclear Information System (INIS)

    Mahn, J.A.; Germann, R.P.; Jacobs, R.R.

    1992-01-01

    In 1988, GPU Nuclear (GPUN) Corporation embarked on a research effort to identify or develop an appropriate methodology for proactively managing risks. The objective of this effort was to increase its ability to identify potential risks and to aid resource allocation decision making for risk control. Such a methodology was presented at a risk management symposium sponsored by GPUN in September of 1989. A pilot project based on this methodology has been conducted at GPUN to test and validate the elements of the methodology and to compare the results of its application with current corporate methods for guiding risk decision making. The pilot project also led to a follow-up policy-capturing study to elicit information about the various risk decision-making models of GPUN decision makers. The combination of these endeavors provided an opportunity to gain numerous insights with respect to understanding the real value of a risk management process, obtaining acceptance of and commitment to risk management and improving operational aspects of the methodology

  6. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE)--A Systematic Review of Rating Scales.

    Science.gov (United States)

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found as an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students' communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically

  7. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    The current methodology to generate physics data for steamline break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core reactivity does not reach criticality after shutdown. Therefore, the methodology requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative. Therefore, we need a new methodology to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model. But it is well known that the closed channel model estimates the core reactivity as too negative if the core flow rate is low. Therefore, a conservative methodology is presented which utilizes the open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor and maximum quality for DNBR analysis, as well as a pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  8. On the upper ocean turbulent dissipation rate due to microscale breakers and small whitecaps

    Science.gov (United States)

    Banner, Michael L.; Morison, Russel P.

    2018-06-01

    In ocean wave modelling, accurately computing the evolution of the wind-wave spectrum depends on the source terms and the spectral bandwidth used. The wave dissipation rate source term which spectrally quantifies wave breaking and other dissipative processes remains poorly understood, including the spectral bandwidth needed to capture the essential model physics. The observational study of Sutherland and Melville (2015a) investigated the relative dissipation rate contributions of breaking waves, from large-scale whitecaps to microbreakers. They concluded that a large fraction of wave energy was dissipated by microbreakers. However, in strong contrast with their findings, our analysis of their data and other recent data sets shows that for young seas, microbreakers and small whitecaps contribute only a small fraction of the total breaking wave dissipation rate. For older seas, we find microbreakers and small whitecaps contribute a large fraction of the breaking wave dissipation rate, but this is only a small fraction of the total dissipation rate, which is now dominated by non-breaking contributions. Hence, for all the wave age conditions observed, microbreakers make an insignificant contribution to the total wave dissipation rate in the wave boundary layer. We tested the sensitivity of the results to the SM15a whitecap analysis methodology by transforming the SM15a breaking data using our breaking crest processing methodology. This resulted in the small-scale breaking waves making an even smaller contribution to the total wave dissipation rate, and so the result is independent of the breaker processing methodology. Comparison with other near-surface total TKE dissipation rate observations also support this conclusion. These contributions to the spectral dissipation rate in ocean wave models are small and need not be explicitly resolved.

  9. Comparison of constant-rate pumping test and slug interference test results at the Hanford Site B pond multilevel test facility

    International Nuclear Information System (INIS)

    Spane, F.A. Jr.; Thorne, P.D.

    1995-10-01

    Pacific Northwest Laboratory (PNL), as part of the Hanford Site Ground-Water Surveillance Project, is responsible for monitoring the movement and fate of contamination within the unconfined aquifer to ensure that public health and the environment are protected. To support the monitoring and assessment of contamination migration on the Hanford Site, a sitewide 3-dimensional groundwater flow model is being developed. Providing quantitative hydrologic property data is instrumental in development of the 3-dimensional model. Multilevel monitoring facilities have been installed to provide detailed, vertically distributed hydrologic characterization information for the Hanford Site unconfined aquifer. In previous reports, vertically distributed water-level and hydrochemical data obtained over time from these multi-level monitoring facilities have been evaluated and reported. This report describes the B pond facility in Section 2.0. It also provides analysis results for a constant-rate pumping test (Section 3.0) and slug interference test (Section 4.0) that were conducted at a multilevel test facility located near B Pond (see Figure 1.1) in the central part of the Hanford Site. A hydraulic test summary (Section 5.0) that focuses on the comparison of hydraulic property estimates obtained using the two test methods is also presented. Reference materials are listed in Section 6.0.

  10. Taking the Test Taker's Perspective: Response Process and Test Motivation in Multidimensional Forced-Choice Versus Rating Scale Instruments.

    Science.gov (United States)

    Sass, Rachelle; Frick, Susanne; Reips, Ulf-Dietrich; Wetzel, Eunike

    2018-03-01

    The multidimensional forced-choice (MFC) format has been proposed as an alternative to the rating scale (RS) response format. However, it is unclear how changing the response format may affect the response process and test motivation of participants. In Study 1, we investigated the MFC response process using the think-aloud technique. In Study 2, we compared test motivation between the RS format and different versions of the MFC format (presenting 2, 3, 4, and 5 items simultaneously). The response process to MFC item blocks was similar to the RS response process but involved an additional step of weighing the items within a block against each other. The RS and MFC response format groups did not differ in their test motivation. Thus, from the test taker's perspective, the MFC format is somewhat more demanding to respond to, but this does not appear to decrease test motivation.

  11. Medicine, methodology, and values: trade-offs in clinical science and practice.

    Science.gov (United States)

    Ho, Vincent K Y

    2011-01-01

    The current guidelines of evidence-based medicine (EBM) presuppose that clinical research and clinical practice should advance from rigorous scientific tests as they generate reliable, value-free knowledge. Under this presupposition, hypotheses postulated by doctors and patients in the process of their decision making are preferably tested in randomized clinical trials (RCTs), and in systematic reviews and meta-analyses summarizing outcomes from multiple RCTs. Since testing under this scheme is predominantly focused on the criteria of generality and precision achieved through methodological rigor, at the cost of the criterion of realism, translating test results to clinical practice is often problematic. Choices concerning which methodological criteria should have priority are inevitable, however, as clinical trials, and scientific research in general, cannot meet all relevant criteria at the same time. Since these choices may be informed by considerations external to science, we must acknowledge that science cannot be value-free in a strict sense, and this invites a more prominent role for value-laden considerations in evaluating clinical research. The urgency for this becomes even more apparent when we consider the important yet implicit role of scientific theories in EBM, which may also be subjected to methodological evaluation and for which selectiveness in methodological focus is likewise inevitable.

  12. Experimental Methodology for Estimation of Local Heat Fluxes and Burning Rates in Steady Laminar Boundary Layer Diffusion Flames.

    Science.gov (United States)

    Singh, Ajay V; Gollner, Michael J

    2016-06-01

    Modeling the realistic burning behavior of condensed-phase fuels has remained out of reach, in part because of an inability to resolve the complex interactions occurring at the interface between gas-phase flames and condensed-phase fuels. The current research provides a technique to explore the dynamic relationship between a combustible condensed fuel surface and gas-phase flames in laminar boundary layers. Experiments have previously been conducted in both forced and free convective environments over both solid and liquid fuels. A unique methodology, based on the Reynolds Analogy, was used to estimate local mass burning rates and flame heat fluxes for these laminar boundary layer diffusion flames utilizing local temperature gradients at the fuel surface. Local mass burning rates and convective and radiative heat feedback from the flames were measured in both the pyrolysis and plume regions by using temperature gradients mapped near the wall by a two-axis traverse system. These experiments are time-consuming and can be challenging to design as the condensed fuel surface burns steadily for only a limited period of time following ignition. The temperature profiles near the fuel surface need to be mapped during steady burning of a condensed fuel surface at a very high spatial resolution in order to capture reasonable estimates of local temperature gradients. Careful corrections for radiative heat losses from the thermocouples are also essential for accurate measurements. For these reasons, the whole experimental setup needs to be automated with a computer-controlled traverse mechanism, eliminating most errors due to positioning of a micro-thermocouple. An outline of steps to reproducibly capture near-wall temperature gradients and use them to assess local burning rates and heat fluxes is provided.
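
    The step from a mapped temperature profile to a local flame heat flux can be sketched as follows: the convective component follows from the near-wall temperature gradient via Fourier's law, q_conv = k_g (dT/dy) evaluated at the wall. Converting that flux to a local mass burning rate requires the additional Reynolds-analogy relation and fuel-specific constants, which are not reproduced here; the profile and gas conductivity below are assumed, illustrative values rather than measurements from the experiments.

        import numpy as np

        # Illustrative near-wall temperature profile (y measured from the fuel surface).
        y = np.array([0.0, 0.25e-3, 0.5e-3, 0.75e-3, 1.0e-3])   # m
        T = np.array([600.0, 900.0, 1150.0, 1350.0, 1500.0])    # K

        k_gas = 0.06  # W/(m K), assumed effective gas-phase conductivity near the surface
        dTdy_wall = np.polyfit(y[:3], T[:3], 1)[0]  # slope of a linear fit to the first points
        q_conv = k_gas * dTdy_wall                  # heat conducted toward the surface
        print(f"wall gradient ~ {dTdy_wall:.2e} K/m, q_conv ~ {q_conv / 1000:.0f} kW/m^2")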

  13. Methodology for testing and validating knowledge bases

    Science.gov (United States)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.

  14. Testing Long-Run Purchasing Power Parity under Exchange Rate Targeting

    OpenAIRE

    Sophocles N. Brissimis; Dimitris A. Sideris; Fragiska K. Voumvaki

    2004-01-01

    The present paper exploits the idea that empirical estimates of the long-run PPP relationship may compound two distinct influences coming from the behavior of market participants and policy makers when the latter are targeting the exchange rate. This tends to bias tests of long-run PPP against its acceptance. The validity of the theoretical arguments is assessed by drawing on the experience of two European Union countries, Greece and France for the post-Bretton Woods period. Estimation biases...

  15. Rate transient analysis for homogeneous and heterogeneous gas reservoirs using the TDS technique

    International Nuclear Information System (INIS)

    Escobar, Freddy Humberto; Sanchez, Jairo Andres; Cantillo, Jose Humberto

    2008-01-01

    In this study, pressure test analysis of wells flowing under constant wellbore flowing pressure, for homogeneous and naturally fractured gas reservoirs, using the TDS technique is introduced. Although constant rate production is assumed in the development of conventional well test analysis methods, constant pressure production conditions are sometimes used in the oil and gas industry. The constant pressure technique, or rate transient analysis, is more popularly known as decline curve analysis, under which the rate is allowed to decline instead of the wellbore pressure. The TDS technique, used more every day even in the most recognized software packages (although without its trade name), uses the log-log plot to analyze pressure and pressure derivative test data to identify unique features from which exact analytical expressions are derived to easily estimate reservoir and well parameters. For this case, the fingerprint characteristics from the log-log plot of the reciprocal rate and reciprocal rate derivative were employed to obtain the analytical expressions used for the interpretation analysis. Many simulation experiments demonstrate the accuracy of the new method. Synthetic examples are shown to verify the effectiveness of the proposed methodology.
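
    The diagnostic quantities mentioned above are simple to form once a rate-decline record is available: the reciprocal rate 1/q and its logarithmic derivative t d(1/q)/dt, both plotted on log-log axes so that characteristic slopes identify flow regimes. The sketch below uses a synthetic decline curve; in practice the derivative would be smoothed (for example with a Bourdet-style algorithm) before interpretation.

        import numpy as np

        t = np.logspace(-2, 2, 60)            # time, hr
        q = 500.0 / np.sqrt(t + 1.0)          # synthetic rate decline at constant pressure

        recip_q = 1.0 / q                              # reciprocal rate
        recip_q_deriv = t * np.gradient(recip_q, t)    # t * d(1/q)/dt for the log-log plot

        for ti, r, dr in zip(t[::20], recip_q[::20], recip_q_deriv[::20]):
            print(f"t = {ti:8.3f}  1/q = {r:.5f}  t*d(1/q)/dt = {dr:.5f}")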

  16. Initial characterization of the ATR [Advanced Test Reactor] Large Gamma Facility

    International Nuclear Information System (INIS)

    Schnitzler, B.G.; Rogers, J.W.

    1986-05-01

    Radiation fields in the ATR Large Gamma Facility test volume are characterized. The preliminary characterization efforts described in this report include total dose rate measurements in the facility, development of a simple methodology for calculating radiation fields from the ATR fuel element power histories, and a comparison of the measured and calculated values

  17. A new lean change methodology for small & medium sized enterprises

    OpenAIRE

    April, Joris; Powell, Daryl; Bart, Schanssema

    2010-01-01

    SMEs find it difficult to implement productivity improvement tools, particularly those associated with Lean Manufacturing. Larger companies have more success due to greater access to resources. To provide the SMEs with a way to implement Lean sustainably, the European project ERIP develops a new lean change methodology for SMEs. In this paper the methodology is explained and three test cases show the strength of the methodology. The method is a sequence of achieving management and company sup...

  18. Negativization rates of IgE radioimmunoassay and basophil activation test in immediate reactions to penicillins.

    Science.gov (United States)

    Fernández, T D; Torres, M J; Blanca-López, N; Rodríguez-Bada, J L; Gomez, E; Canto, G; Mayorga, C; Blanca, M

    2009-02-01

    Skin test sensitivity in patients with immediate allergy to penicillins tends to decrease over time, but no information is available concerning in vitro tests. We analysed the negativization rates of two in vitro methods that determine specific immunoglobulin E (IgE) antibodies, the basophil activation test using flow cytometry (BAT) and the radioallergosorbent test (RAST), in immediate allergic reactions to penicillins. Forty-one patients with immediate allergic reactions to amoxicillin were followed up over a 4-year period. BAT and RAST were performed at 6-month intervals. Patients were randomized into groups: Group I, skin tests carried out at regular intervals; Group II, skin tests made only at the beginning of the study. Differences were observed between RAST and BAT (P testing influenced the rate of negativization of the RAST assay, contributing to maintenance of in vitro sensitivity. Because of the loss of sensitivity over time, the determination of specific IgE antibodies to penicillins in patients with immediate allergic reactions must be done as soon as possible after the reaction.

  19. Parent Ratings of Impulsivity and Inhibition Predict State Testing Scores

    Directory of Open Access Journals (Sweden)

    Rebecca A. Lundwall

    2018-03-01

    Full Text Available One principle of cognitive development is that earlier intervention for educational difficulties tends to improve outcomes such as future educational and career success. One possible way to help students who struggle is to determine if they process information differently. Such determination might lead to clues for interventions. For example, early information processing requires attention before the information can be identified, encoded, and stored. The aim of the present study was to investigate whether parent ratings of inattention, inhibition, and impulsivity, and whether error rate on a reflexive attention task could be used to predict child scores on state standardized tests. Finding such an association could provide assistance to educators in identifying academically struggling children who might require targeted educational interventions. Children (N = 203) were invited to complete a peripheral cueing task (which measures the automatic reorienting of the brain’s attentional resources from one location to another). While the children completed the task, their parents completed a questionnaire. The questionnaire gathered information on broad indicators of child functioning, including observable behaviors of impulsivity, inattention, and inhibition, as well as state academic scores (which the parent retrieved online from their school). We used sequential regression to analyze contributions of error rate and parent-rated behaviors in predicting six academic scores. In one of the six analyses (for science), we found that the improvement was significant from the simplified model (with only family income, child age, and sex as predictors) to the full model (adding error rate and three parent-rated behaviors). Two additional analyses (reading and social studies) showed near significant improvement from simplified to full models. Parent-rated behaviors were significant predictors in all three of these analyses. In the reading score analysis

  20. Average System Cost Methodology : Administrator's Record of Decision.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1984-06-01

    Significant features of the average system cost (ASC) methodology adopted are: retention of the jurisdictional approach where retail rate orders of regulatory agencies provide primary data for computing the ASC for utilities participating in the residential exchange; inclusion of transmission costs; exclusion of construction work in progress; use of a utility's weighted cost of debt securities; exclusion of income taxes; simplification of separation procedures for subsidized generation and transmission accounts from other accounts; clarification of ASC methodology rules; a more generous review timetable for individual filings; phase-in of the reformed methodology; and each exchanging utility must file under the new methodology within 20 days of implementation by the Federal Energy Regulatory Commission. Of the ten major participating utilities, the revised ASC will substantially affect only three. (PSB)

  1. High Rate User Ka-Band Phased Array Antenna Test Results

    Science.gov (United States)

    Caroglanian, Armen; Perko, Kenneth; Seufert, Steve; Dod, Tom; Warshowsky, Jay; Day, John H. (Technical Monitor)

    2001-01-01

    The High Rate User Phased Array Antenna (HRUPAA) is a Ka-Band planar phased array designed by the Harris Corporation for the NASA Goddard Space Flight Center. The HRUPAA permits a satellite to downlink data either to a ground station or through the Tracking and Data Relay Satellite System (TDRSS). The HRUPAA is scanned electronically by ground station / user satellite command over a 120 degree cone angle. The phased array has the advantage of not imparting attitude disturbances to the user spacecraft. The 288-element transmit-only array has distributed RF amplifiers integrated behind each of the printed patch antenna elements. The array has 33 dBW EIRP and is left-hand circularly polarized. An engineering model of a partially populated array has been developed and delivered to NASA Goddard Space Flight Center. This report deals with the testing of the engineering model at the Goddard Antenna Range near-field and compact range facilities. The antenna specifications are described first, followed by the test plan and test results.

  2. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis on an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit conditional on the observations than the other. Moreover, different methodological investigations of GLUE are conducted in order to test if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it was shown to be quite difficult to get good fits of the whole time series.
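
    The GLUE procedure itself follows a small number of steps: sample many parameter sets, score each against the observations with an informal likelihood, keep the "behavioural" sets above a threshold, and form likelihood-weighted prediction bounds. The sketch below illustrates those steps with a toy linear-reservoir model and a Nash-Sutcliffe likelihood; it is not the MOUSE setup of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def run_model(theta, t):
            # Stand-in model: a simple exponential recession (not MOUSE).
            k, scale = theta
            return scale * np.exp(-t / k)

        t = np.linspace(0.0, 10.0, 50)
        obs = run_model((2.0, 5.0), t) + rng.normal(0.0, 0.2, t.size)  # synthetic observations

        def nse(sim, obs):
            # Nash-Sutcliffe efficiency, used here as the informal likelihood measure.
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # 1. Monte Carlo sampling of parameter sets from broad prior ranges.
        thetas = np.column_stack([rng.uniform(0.5, 5.0, 2000), rng.uniform(1.0, 10.0, 2000)])
        sims = np.array([run_model(th, t) for th in thetas])
        likes = np.array([nse(s, obs) for s in sims])

        # 2. Retain behavioural sets above a threshold and rescale likelihoods to weights.
        keep = likes > 0.7
        weights = likes[keep] / likes[keep].sum()

        # 3. Likelihood-weighted 5-95% prediction bounds at each time step.
        def weighted_quantile(values, q, w):
            idx = np.argsort(values)
            cdf = np.cumsum(w[idx]) / np.sum(w[idx])
            return np.interp(q, cdf, values[idx])

        bounds = np.array([[weighted_quantile(sims[keep, i], q, weights) for q in (0.05, 0.95)]
                           for i in range(t.size)])
        print(f"{keep.sum()} behavioural sets; bounds at t=0: {bounds[0].round(2)}")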

  3. Psychomotor testing predicts rate of skill acquisition for proficiency-based laparoscopic skills training.

    Science.gov (United States)

    Stefanidis, Dimitrios; Korndorffer, James R; Black, F William; Dunne, J Bruce; Sierra, Rafael; Touchard, Cheri L; Rice, David A; Markert, Ronald J; Kastl, Peter R; Scott, Daniel J

    2006-08-01

    Laparoscopic simulator training translates into improved operative performance. Proficiency-based curricula maximize efficiency by tailoring training to meet the needs of each individual; however, because rates of skill acquisition vary widely, such curricula may be difficult to implement. We hypothesized that psychomotor testing would predict baseline performance and training duration in a proficiency-based laparoscopic simulator curriculum. Residents (R1, n = 20) were enrolled in an IRB-approved prospective study at the beginning of the academic year. All completed the following: a background information survey, a battery of 12 innate ability measures (5 motor and 7 visual-spatial), and baseline testing on 3 validated simulators (5 videotrainer [VT] tasks, 12 virtual reality [minimally invasive surgical trainer-virtual reality, MIST-VR] tasks, and 2 laparoscopic camera navigation [LCN] tasks). Participants trained to proficiency, and training duration and number of repetitions were recorded. Baseline test scores were correlated to skill acquisition rate. Cutoff scores for each predictive test were calculated based on a receiver operator curve, and their sensitivity and specificity were determined in identifying slow learners. Only the Cards Rotation test correlated with baseline simulator ability on VT and LCN. Curriculum implementation required 347 man-hours (6-person team) and 795,000 dollars of capital equipment. With an attendance rate of 75%, 19 of 20 residents (95%) completed the curriculum by the end of the academic year. To complete training, a median of 12 hours (range, 5.5-21) and 325 repetitions (range, 171-782) were required. Simulator score improvement was 50%. Training duration and repetitions correlated with prior video game and billiard exposure, grooved pegboard, finger tap, map planning, Rey Figure Immediate Recall score, and baseline performance on VT and LCN. The map planning cutoff score proved most specific in identifying slow learners.
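
    The cutoff-score step described above amounts to choosing a point on a receiver operating characteristic (ROC) curve that best separates slow learners from the rest; one common choice is the threshold maximizing Youden's J (sensitivity + specificity - 1). The sketch below uses entirely synthetic scores and scikit-learn, and is not the study's own analysis.

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(2)

        # 1 = "slow learner" (long training duration), 0 = otherwise; synthetic labels.
        slow = np.array([1] * 5 + [0] * 15)
        # Hypothetical predictive test score (e.g., a visual-spatial test), lower = worse.
        score = np.where(slow == 1, rng.normal(40, 10, slow.size), rng.normal(60, 10, slow.size))

        # Treat low scores as "positive" for being a slow learner by negating them.
        fpr, tpr, thresholds = roc_curve(slow, -score)
        j = tpr - fpr                          # Youden's J at each candidate threshold
        best = int(np.argmax(j))
        print(f"cutoff ~ {-thresholds[best]:.1f}, "
              f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")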

  4. ChargeOut! : discounted cash flow compared with traditional machine-rate analysis

    Science.gov (United States)

    Ted Bilek

    2008-01-01

    ChargeOut!, a discounted cash-flow methodology in spreadsheet format for analyzing machine costs, is compared with traditional machine-rate methodologies. Four machine-rate models are compared and a common data set representative of logging skidders’ costs is used to illustrate the differences between ChargeOut! and the machine-rate methods. The study found that the...

  5. Personal Radiation Detector Field Test and Evaluation Campaign

    International Nuclear Information System (INIS)

    Chris A. Hodge, Ding Yuan, Raymond P. Keegan, Michael A. Krstich

    2007-01-01

    Following the success of the Anole test of portable detection systems, the U.S. Department of Homeland Security (DHS) Domestic Nuclear Detection Office organized a test and evaluation campaign for personal radiation detectors (PRDs), also known as 'Pagers'. This test, 'Bobcat', was conducted from July 17 to August 8, 2006, at the Nevada Test Site. The Bobcat test was designed to evaluate the performance of PRDs under various operational scenarios, such as pedestrian surveying, mobile surveying, cargo container screening, and pedestrian chokepoint monitoring. Under these testing scenarios, many operational characteristics of the PRDs, such as gamma and neutron sensitivities, positive detection and false alarm rates, response delay times, minimum detectable activities, and source localization errors, were analyzed. This paper will present the design, execution, and methodologies used to test this equipment for the DHS.

  6. Comparison of measured and calculated reaction rate distributions in an scwr-like test lattice

    Energy Technology Data Exchange (ETDEWEB)

    Raetz, Dominik, E-mail: dominik.raetz@psi.ch [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Jordan, Kelly A., E-mail: kelly.jordan@psi.ch [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Murphy, Michael F., E-mail: mike.murphy@psi.ch [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Perret, Gregory, E-mail: gregory.perret@psi.ch [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Chawla, Rakesh, E-mail: rakesh.chawla@psi.ch [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Ecole Polytechnique Federale de Lausanne (EPFL), CH-1015 Lausanne, EPFL (Switzerland)

    2011-04-15

    High resolution gamma-ray spectroscopy measurements were performed on 61 rods of an SCWR-like fuel lattice, after irradiation in the central test zone of the PROTEUS zero-power research reactor at the Paul Scherrer Institute in Switzerland. The derived reaction rates are the capture rate in 238U (C8) and the total fission rate (Ftot), and also the reaction rate ratio C8/Ftot. Each of these has been mapped rod-wise on the lattice and compared to calculated results from whole-reactor Monte Carlo simulations with MCNPX. Ratios of calculated to experimental values (C/E's) have been assessed for the C8, Ftot and C8/Ftot distributions across the lattice. These C/E's show excellent agreement between the calculations and the measurements. For the 238U capture rate distribution, the 1σ level in the comparisons corresponds to an uncertainty of ±0.8%, while for the total fission rate the corresponding value is ±0.4%. The uncertainty for C8/Ftot, assessed as a reaction rate ratio characterizing each individual rod position in the test lattice, is significantly higher at ±2.2%. To determine the reproducibility of these results, the measurements were performed twice, once in 2006 and again in 2009. The agreement between these two measurement sets is within the respective statistical uncertainties.

  7. Implementing the cost-optimal methodology in EU countries

    DEFF Research Database (Denmark)

    Atanasiu, Bogdan; Kouloumpi, Ilektra; Thomsen, Kirsten Engelund

    This study presents three cost-optimal calculations. The overall aim is to provide a deeper analysis and to provide additional guidance on how to properly implement the cost-optimality methodology in Member States. Without proper guidance and lessons from exemplary case studies using realistic input data (reflecting the likely future development), there is a risk that the cost-optimal methodology may be implemented at sub-optimal levels. This could lead to a misalignment between the defined cost-optimal levels and the long-term goals, leaving a significant energy saving potential unexploited. Therefore, this study provides more evidence on the implementation of the cost-optimal methodology and highlights the implications of choosing different values for key factors (e.g. discount rates, simulation variants/packages, costs, energy prices) at national levels. The study demonstrates how existing
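
    The sensitivity to the discount rate mentioned above can be made concrete with a simple global-cost comparison: each renovation package is scored as its investment plus discounted annual costs over the calculation period, and the ranking of packages can flip as the discount rate changes. The packages, costs and rates below are invented for illustration and are not taken from the study.

        def global_cost(investment, annual_cost, years=30, discount_rate=0.03):
            # Net present cost: investment plus discounted annual (energy + maintenance) costs.
            discounted_running = sum(annual_cost / (1 + discount_rate) ** y
                                     for y in range(1, years + 1))
            return investment + discounted_running

        deep = dict(investment=250.0, annual_cost=6.0)    # hypothetical deep renovation, EUR/m2
        light = dict(investment=120.0, annual_cost=12.0)  # hypothetical light renovation, EUR/m2

        for r in (0.01, 0.03, 0.06):
            print(f"discount rate {r:.0%}: deep = {global_cost(**deep, discount_rate=r):.0f}, "
                  f"light = {global_cost(**light, discount_rate=r):.0f}")

    With a low discount rate the deep package comes out cheaper over the period; with a high rate the light package does, which is exactly the kind of sensitivity to key factors that the study examines.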

  8. Gas leakage rate through reinforced concrete shear walls: Numerical study

    International Nuclear Information System (INIS)

    Wang Ting; Hutchinson, Tara C.

    2005-01-01

    Unlined reinforced concrete shear walls are often used as 'tertiary boundaries' at United States Department of Energy (DOE) facilities to house dangerous gases. An unanticipated event, such as an earthquake, may cause gases stored inside the walls to disperse into the environment, resulting in excess pollution. To address this concern, a methodology to numerically predict the gas leakage rate through these shear walls under lateral loading conditions is proposed in this paper. This methodology involves finite element and flow rate analysis. Strain distributions are obtained from the finite element analysis and then used to simulate the crack characteristics of the concrete specimen. The flow rate through the damaged concrete specimen is then estimated using flow rate formulas available in the literature. Results from an experimental specimen are used to evaluate the methodology, and particularly its robustness in the flow rate estimation.

  9. Glucose Pump Test can be Used to Measure Blood Flow Rate of ...

    African Journals Online (AJOL)

    2018-02-07

    Feb 7, 2018 ... In 93 chronic hemodialysis patients with native AV fistula, blood flow rates were measured by Doppler US .... Arterial blood pressure from nonvascular access arm was measured by aneroid sphygmomanometer. The patients did not .... to detect differences in treatments across multiple test attempts. P < 0.05 ...

  10. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  11. Hydrogeological testing in the Sellafield area

    International Nuclear Information System (INIS)

    Sutton, J.S.

    1996-01-01

    A summary of the hydrogeological test methodologies employed in the Sellafield geological investigations is provided in order that an objective appraisal of the quality of the data can be formed. A brief presentation of some of these data illustrates the corroborative nature of different test and measurement methodologies and provides a preliminary view of the results obtained. The programme of hydrogeological testing is an evolving one and methodologies are developing as work proceeds and targets become more clearly defined. As the testing is focused on relatively low permeability rocks at depth, the approach to testing differs slightly from conventional hydrogeological well testing and makes extensive use of oilfield technology. (author)

  12. Impact limiters for radioactive materials transport packagings: a methodology for assessment

    International Nuclear Information System (INIS)

    Mourao, Rogerio Pimenta

    2002-01-01

    This work aims at establishing a methodology for design assessment of a cellular material-filled impact limiter to be used as part of a radioactive material transport packaging. This methodology comprises the selection of the cellular material, its structural characterization by mechanical tests, the development of a case study in the nuclear field, preliminary determination of the best cellular material density for the case study, performance of the case and its numerical simulation using the finite element method. Among the several materials used as shock absorbers in packagings, the polyurethane foam was chosen, particularly the foam obtained from the castor oil plant (Ricinus communis), a non-polluting and renewable source. The case study carried out was the 9 m drop test of a package prototype containing radioactive wastes incorporated in a cement matrix, considered one of the most severe tests prescribed by the Brazilian and international transport standards. Prototypes with foam density pre-determined as ideal as well as prototypes using lighter and heavier foams were tested for comparison. The results obtained validate the methodology in that expectations regarding the ideal foam density were confirmed by the drop tests and the numerical simulation. (author)

  13. A robust methodology for kinetic model parameter estimation for biocatalytic reactions

    DEFF Research Database (Denmark)

    Al-Haque, Naweed; Andrade Santacoloma, Paloma de Gracia; Lima Afonso Neto, Watson

    2012-01-01

    ... parameters, which are strongly correlated with each other. State-of-the-art methodologies such as nonlinear regression (using progress curves) or graphical analysis (using initial rate data, for example, the Lineweaver-Burke plot, Hanes plot or Dixon plot) often incorporate errors in the estimates and rarely lead to globally optimized parameter values. In this article, a robust methodology to estimate parameters for biocatalytic reaction kinetic expressions is proposed. The methodology determines the parameters in a systematic manner by exploiting the best features of several of the current approaches ...
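
    As a concrete illustration of the nonlinear-regression route mentioned above, the sketch below fits Michaelis-Menten parameters directly to initial-rate data rather than linearizing them (as a Lineweaver-Burk plot would); the substrate concentrations and rates are synthetic, and the generic one-substrate expression stands in for the more complex multi-parameter biocatalytic models addressed in the article.

        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(s, vmax, km):
            return vmax * s / (km + s)

        # Synthetic initial-rate data (substrate in mM, rate in mM/min) with noise.
        rng = np.random.default_rng(3)
        s = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
        v = michaelis_menten(s, vmax=8.0, km=1.5) + rng.normal(0.0, 0.15, s.size)

        # Direct nonlinear regression avoids the error distortion of linearized plots.
        popt, pcov = curve_fit(michaelis_menten, s, v, p0=[5.0, 1.0])
        perr = np.sqrt(np.diag(pcov))
        print(f"Vmax = {popt[0]:.2f} +/- {perr[0]:.2f}, Km = {popt[1]:.2f} +/- {perr[1]:.2f}")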

  14. HIV testing during pregnancy: use of secondary data to estimate 2006 test coverage and prevalence in Brazil

    Directory of Open Access Journals (Sweden)

    Célia Landmann Szwarcwald

    Full Text Available This paper describes a methodological proposal based on secondary data and the main results of the HIV-Sentinel Study among childbearing women, carried out in Brazil during 2006. A probabilistic sample of childbearing women was selected in two stages. In the first stage, 150 health establishments were selected, stratified by municipality size (<50,000; 50,000-399,999; 400,000+). In the second stage, 100-120 women were selected systematically. Data collection was based on HIV-test results registered in pre-natal cards and in hospital records. The analysis focused on coverage of HIV-testing during pregnancy and HIV prevalence rate. Logistic regression models were used to test inequalities in HIV-testing coverage during pregnancy by macro-region of residence, municipality size, race, educational level and age group. The study included 16,158 women. Results were consistent with previous studies based on primary data collection. Among the women receiving pre-natal care with HIV-test results registered in their pre-natal cards, HIV prevalence was 0.41%. Coverage of HIV-testing during pregnancy was 62.3% in the country as a whole, but ranged from 40.6% in the Northeast to 85.8% in the South. Significant differences according to race, educational level and municipality size were also found. The proposed methodology is low-cost, easy to apply, and permits identification of problems in routine service provision, in addition to monitoring compliance with Ministry of Health recommendations for pre-natal care.

  15. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    Science.gov (United States)

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background: Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective: We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods: We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results: We report a successful implementation of the

  16. Carbon Dioxide Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit

    Science.gov (United States)

    Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses

    2014-01-01

    Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy, and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject, and physiological differences between subjects. Computational Fluid Dynamics (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test was to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III suit across a range of workload and flow rates. Data and trends observed during testing along with refined CFD models will be used to help design an inlet vent configuration for the Z-2 space suit. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit, and the Enhanced Mobility Advanced Crew Escape Suit. Three subjects performed two test sessions each in the Mark-III suit to allow for comparison between tests. Six different helmet inlet vent configurations were evaluated during each test session. Suit pressure was maintained at 4.3 psid. Suited test subjects walked on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute were tested at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the CO2 production measured by an additional gas analyzer at the air outlet from the suit. Real-time metabolic rate measurements were used to adjust the treadmill workload to meet
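
    The record notes that metabolic rate was calculated from the CO2 production measured at the suit outlet. A hedged sketch of one conventional way to make that conversion (a simple mass balance plus the Weir equation with an assumed respiratory quotient; all numbers are placeholders, not test data) is:

```python
# Hedged sketch: estimating metabolic rate from suit-outlet CO2 measurements.
# The respiratory quotient (RQ) and all inputs are assumptions for illustration.
def metabolic_rate_btu_per_hr(flow_acfm: float, co2_out_frac: float,
                              co2_in_frac: float, rq: float = 0.85) -> float:
    """Estimate metabolic rate (BTU/hr) from CO2 washout measurements."""
    CF_TO_L = 28.317                                                  # litres per cubic foot
    vco2_l_min = flow_acfm * CF_TO_L * (co2_out_frac - co2_in_frac)   # L CO2 / min
    vo2_l_min = vco2_l_min / rq                                       # assumed RQ
    # Weir equation: kcal/min = 3.941*VO2 + 1.106*VCO2 (volumes in L/min)
    kcal_min = 3.941 * vo2_l_min + 1.106 * vco2_l_min
    return kcal_min * 60.0 * 3.966                                    # kcal -> BTU

# Example: 6 acfm supply flow, 0.9% CO2 at the outlet, ~0% at the inlet
print(round(metabolic_rate_btu_per_hr(6.0, 0.009, 0.0)))
```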

  17. Chemical migration and risk assessment methodology

    International Nuclear Information System (INIS)

    Onishi, Y.; Brown, S.M.; Olsen, A.R.; Parkhurst, M.A.

    1981-01-01

    To provide a scientific basis for risk assessment and decision making, the Chemical Migration and Risk Assessment (CMRA) Methodology was developed to simulate overland and instream toxic contaminant migration and fate, and to predict the probability of acute and chronic impacts on aquatic biota. The simulation results indicated that the time between the pesticide application and the subsequent runoff-producing event was the most important factor determining the amount of alachlor transported. The study also revealed that sediment transport has important effects on contaminant migration when sediment concentrations in receiving streams are high or contaminants are highly susceptible to adsorption by sediment. Although the capabilities of the CMRA methodology were only partially tested in this study, the results demonstrate that the methodology can be used as a scientific decision-making tool for toxic chemical regulations, as a research tool to evaluate the relative significance of various transport and degradation phenomena, and as a tool to examine the effectiveness of toxic chemical control practices.

  18. Interlaboratory comparison and accreditation in quality control testing of diagnostic X-ray equipment

    International Nuclear Information System (INIS)

    Kepler, K.; Vladimirov, A.; Servomaa, A.

    2005-01-01

    The Univ. of Tartu provides a quality control service to the majority of diagnostic X-ray departments in Estonia. Its methodology has been adopted from the IEC and other relevant standards. Recently the Testing Centre of the Univ. of Tartu was accredited for this methodology according to ISO/IEC 17025. Besides the implementation of the quality management system, participation in an interlaboratory comparison (ILC) was one of the prerequisites for the accreditation. Tests for estimating the reproducibility of tube voltage and dose rate, the accuracy of the voltage and the accuracy of exposure time were carried out on a diagnostic X-ray unit at the Radiation and Nuclear Safety Authority in Helsinki. The measurement performance was judged by calculating the deviation En, normalised with respect to the stated uncertainties. The En values for all tests were less than unity, so by the common ILC criteria the testing performance could be considered acceptable. (authors)
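
    For reference, the normalised deviation mentioned in this record is conventionally computed as below (ISO/IEC 17043 style); the function and the example tube-voltage numbers are illustrative, not values from the comparison.

```python
# Minimal sketch of the normalised deviation En used to judge ILC performance.
from math import sqrt

def e_n(x_lab: float, x_ref: float, u_lab: float, u_ref: float) -> float:
    """En = (x_lab - x_ref) / sqrt(U_lab^2 + U_ref^2), with expanded (k=2) uncertainties."""
    return (x_lab - x_ref) / sqrt(u_lab**2 + u_ref**2)

# Example: tube voltage measured as 81.2 kV by the participant vs 80.5 kV by the
# reference laboratory, each with an expanded uncertainty of 1.5 kV.
score = e_n(81.2, 80.5, 1.5, 1.5)
print(f"En = {score:.2f} -> {'acceptable' if abs(score) <= 1 else 'questionable'}")
```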

  19. Do exchange rates follow random walks? A variance ratio test of the ...

    African Journals Online (AJOL)

    The random-walk hypothesis in foreign-exchange rates market is one of the most researched areas, particularly in developed economies. However, emerging markets in sub-Saharan Africa have received little attention in this regard. This study applies Lo and MacKinlay's (1988) conventional variance ratio test and Wright's ...
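
    A bare-bones sketch of the Lo and MacKinlay variance ratio statistic under the homoskedasticity assumption is shown below; the heteroskedasticity-robust and Wright rank/sign variants used in such studies are not reproduced, and the price series is simulated.

```python
# Sketch: simple Lo-MacKinlay variance ratio test on a simulated log-price series.
import numpy as np

def variance_ratio(log_prices: np.ndarray, q: int):
    r = np.diff(log_prices)                      # 1-period log returns
    T = r.size
    mu = r.mean()
    var_1 = np.sum((r - mu) ** 2) / T
    rq = np.array([log_prices[t + q] - log_prices[t] for t in range(T - q + 1)])
    var_q = np.sum((rq - q * mu) ** 2) / (q * (T - q + 1))
    vr = var_q / var_1
    # Asymptotic std. error of VR(q) - 1 under iid returns (Lo & MacKinlay, 1988)
    se = np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * T))
    return vr, (vr - 1) / se                     # ratio and z-statistic

prices = np.cumsum(np.random.default_rng(0).normal(0, 0.01, 1000))  # simulated random walk
print(variance_ratio(prices, q=5))
```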

  20. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests are p... ... parameters influence the performance of the WEC can also be investigated using this methodology. ... leads to testing campaigns that are not as extensive as desired. Therefore, the performance analysis should be robust enough to allow for not fully complete sea trials and sub-optimal performance data. In other words, this methodology is focused at retrieving the maximum amount of useful information out...

  1. A proposal to order the neutron data set in neutron spectrometry using the RDANN methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R. [UAZ, Av. Ramon Lopez Velarde No. 801, 98000 Zacatecas (Mexico)

    2006-07-01

    A new proposal for ordering a neutron data set in the design process of artificial neural networks in the neutron spectrometry field is presented for the first time. The robust design of artificial neural networks (RDANN) methodology was applied to a set of 187 neutron spectra compiled by the International Atomic Energy Agency. Four cases of grouping the neutron spectra were considered, and around 1000 different neural networks with different topologies were designed, trained and tested for each case. After carrying out the systematic methodology for all the cases, it was determined that the topology producing the best reconstructed neutron spectra was obtained for the case using the full 187-spectra data set: 7 input neurons, 14 neurons in a hidden layer and 31 neurons in the output layer, with a learning rate of 0.1 and a momentum of 0.1. (Author)

  2. A proposal to order the neutron data set in neutron spectrometry using the RDANN methodology

    International Nuclear Information System (INIS)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R.

    2006-01-01

    A new proposal for ordering a neutron data set in the design process of artificial neural networks in the neutron spectrometry field is presented for the first time. The robust design of artificial neural networks (RDANN) methodology was applied to a set of 187 neutron spectra compiled by the International Atomic Energy Agency. Four cases of grouping the neutron spectra were considered, and around 1000 different neural networks with different topologies were designed, trained and tested for each case. After carrying out the systematic methodology for all the cases, it was determined that the topology producing the best reconstructed neutron spectra was obtained for the case using the full 187-spectra data set: 7 input neurons, 14 neurons in a hidden layer and 31 neurons in the output layer, with a learning rate of 0.1 and a momentum of 0.1. (Author)
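
    As an illustration of the reported best topology (7 input neurons, one hidden layer of 14, 31 output neurons, learning rate 0.1, momentum 0.1), the sketch below builds an equivalent network in scikit-learn with placeholder data; it is a stand-in for the final network only, not for the RDANN design procedure itself.

```python
# Illustrative sketch of the reported 7-14-31 network; data are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
X = rng.random((187, 7))      # 7 inputs per spectrum (placeholder data)
Y = rng.random((187, 31))     # 31 output bins of the reconstructed spectrum (placeholder data)

net = MLPRegressor(
    hidden_layer_sizes=(14,),   # one hidden layer with 14 neurons
    solver="sgd",
    learning_rate_init=0.1,
    momentum=0.1,
    max_iter=2000,
)
net.fit(X, Y)
print(net.predict(X[:1]).shape)   # -> (1, 31)
```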

  3. Integral Design Methodology of Photocatalytic Reactors for Air Pollution Remediation

    Directory of Open Access Journals (Sweden)

    Claudio Passalía

    2017-06-01

    Full Text Available An integral reactor design methodology was developed to address the optimal design of photocatalytic wall reactors to be used in air pollution control. For a target pollutant to be eliminated from an air stream, the proposed methodology is initiated with a mechanistically derived reaction rate. The determination of intrinsic kinetic parameters relies on a simple-geometry laboratory-scale reactor operated under kinetic control with a uniform incident radiation flux, which allows computing the local superficial rate of photon absorption. Thus, a simple model can describe the mass balance and a solution may be obtained. The kinetic parameters may be estimated by combining the mathematical model with the experimental results. The validated intrinsic kinetics obtained may be directly used in the scaling-up of any reactor configuration and size. The bench-scale reactor may require the use of complex computational software to obtain the fields of velocity, radiation absorption and species concentration. The complete methodology was successfully applied to the elimination of airborne formaldehyde. The kinetic parameters were determined in a flat plate reactor, whilst a bench-scale corrugated wall reactor was used to illustrate the scaling-up methodology. In addition, an optimal folding angle of the corrugated reactor was found using computational fluid dynamics tools.
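
    A hedged sketch of the "combine the mathematical model with the experimental results" step is given below: least-squares estimation of intrinsic kinetic parameters from laboratory-reactor data. The rate expression, symbols and data points are illustrative assumptions, not the paper's actual kinetics.

```python
# Hedged sketch: fitting intrinsic kinetic parameters by least squares.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical Langmuir-Hinshelwood-type rate, r = k K C / (1 + K C),
# evaluated at a known, uniform rate of photon absorption.
def rate_model(C, k, K):
    return k * K * C / (1.0 + K * C)

# Measured degradation rates (mol m^-2 s^-1) at several inlet concentrations
# (mol m^-3) from a flat-plate laboratory reactor -- placeholder values.
C_exp = np.array([0.5, 1.0, 2.0, 4.0, 8.0]) * 1e-3
r_exp = np.array([0.8, 1.4, 2.2, 3.0, 3.5]) * 1e-7

(k_fit, K_fit), cov = curve_fit(rate_model, C_exp, r_exp, p0=[1e-6, 500.0])
print(f"k = {k_fit:.3e}, K = {K_fit:.1f}")
```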

  4. Trip Energy Estimation Methodology and Model Based on Real-World Driving Data for Green Routing Applications: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Holden, Jacob [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Van Til, Harrison J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gonder, Jeffrey D [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-09

    A data-informed model to predict energy use for a proposed vehicle trip has been developed in this paper. The methodology leverages nearly 1 million miles of real-world driving data to generate the estimation model. Driving is categorized at the sub-trip level by average speed, road gradient, and road network geometry, then aggregated by category. An average energy consumption rate is determined for each category, creating an energy rates look-up table. Proposed vehicle trips are then categorized in the same manner, and estimated energy rates are appended from the look-up table. The methodology is robust and applicable to almost any type of driving data. The model has been trained on vehicle global positioning system data from the Transportation Secure Data Center at the National Renewable Energy Laboratory and validated against on-road fuel consumption data from testing in Phoenix, Arizona. The estimation model has demonstrated an error range of 8.6% to 13.8%. The model results can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations to reduce energy consumption. This work provides a highly extensible framework that allows the model to be tuned to a specific driver or vehicle type.
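
    A simplified sketch of the look-up-table idea described above follows: bin historical driving by average speed and road gradient, store a mean energy rate per bin, and apply those rates to the segments of a proposed trip. The bin edges and column names are illustrative assumptions (the actual model also uses road network geometry).

```python
# Sketch: energy-rate look-up table by driving category, then trip estimation.
import pandas as pd

speed_bins = [0, 30, 50, 70, 90, 130]          # km/h
grade_bins = [-10, -2, 2, 10]                  # percent gradient

def build_energy_rate_table(history: pd.DataFrame) -> pd.Series:
    """history: one row per driven segment with speed_kmh, grade_pct,
    energy_kwh and distance_km columns."""
    h = history.assign(
        speed_cat=pd.cut(history.speed_kmh, speed_bins),
        grade_cat=pd.cut(history.grade_pct, grade_bins),
        rate=history.energy_kwh / history.distance_km,       # kWh per km
    )
    return h.groupby(["speed_cat", "grade_cat"], observed=True)["rate"].mean()

def estimate_trip_energy(trip: pd.DataFrame, rates: pd.Series) -> float:
    """trip: proposed-route segments with speed_kmh, grade_pct, distance_km."""
    t = trip.assign(
        speed_cat=pd.cut(trip.speed_kmh, speed_bins),
        grade_cat=pd.cut(trip.grade_pct, grade_bins),
    )
    t = t.merge(rates.reset_index(), on=["speed_cat", "grade_cat"], how="left")
    return float((t["rate"] * t["distance_km"]).sum())
```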

  5. Ecological and methodological drivers of species’ distribution and phenology responses to climate change

    KAUST Repository

    Brown, Christopher J.

    2015-12-10

    Climate change is shifting species’ distribution and phenology. Ecological traits, such as mobility or reproductive mode, explain variation in observed rates of shift for some taxa. However, estimates of relationships between traits and climate responses could be influenced by how responses are measured. We compiled a global dataset of 651 published marine species’ responses to climate change, from 47 papers on distribution shifts and 32 papers on phenology change. We assessed the relative importance of two classes of predictors of the rate of change, ecological traits of the responding taxa and methodological approaches for quantifying biological responses. Methodological differences explained 22% of the variation in range shifts, more than the 7.8% of the variation explained by ecological traits. For phenology change, methodological approaches accounted for 4% of the variation in measurements, whereas 8% of the variation was explained by ecological traits. Our ability to predict responses from traits was hindered by poor representation of species from the tropics, where temperature isotherms are moving most rapidly. Thus, the mean rate of distribution change may be underestimated by this and other global syntheses. Our analyses indicate that methodological approaches should be explicitly considered when designing, analysing and comparing results among studies. To improve climate impact studies, we recommend that: (1) re-analyses of existing time-series state how the existing datasets may limit the inferences about possible climate responses; (2) qualitative comparisons of species’ responses across different studies be limited to studies with similar methodological approaches; (3) meta-analyses of climate responses include methodological attributes as covariates; and (4) new time series be designed to include detection of early warnings of change or ecologically relevant change. Greater consideration of methodological attributes will improve the

  6. A methodology for quantitatively managing the bug fixing process using Mahalanobis Taguchi system

    Directory of Open Access Journals (Sweden)

    Boby John

    2015-12-01

    Full Text Available Controlling the bug fixing process during the system testing phase of the software development life cycle is very important for fixing all detected bugs within the scheduled time. The presence of open bugs often delays the release of the software or results in releasing the software with compromised functionality. This can lead to customer dissatisfaction, cost overruns and eventually the loss of market share. In this paper, the authors propose a methodology to quantitatively manage the bug fixing process during system testing. The proposed methodology identifies the critical milestones in the system testing phase which differentiate successful projects from unsuccessful ones using the Mahalanobis Taguchi system. A model is then developed to predict whether a project is successful or not, with the bug-fix progress at the critical milestones as control factors. Finally, the model is used to control the bug fixing process. It is found that the performance of the proposed methodology using the Mahalanobis Taguchi system is superior to that of models developed using other multi-dimensional pattern recognition techniques. The proposed methodology also reduces the number of control points, providing managers with more options and flexibility to utilize the bug fixing resources across the system testing phase. Moreover, the methodology allows managers to carry out mid-course corrections to bring the bug fixing process back on track so that all the detected bugs can be fixed on time. The methodology is validated with eight new projects and the results are very encouraging.
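
    A conceptual sketch of the Mahalanobis-Taguchi step is shown below: a reference space is built from successful projects' bug-fix progress at the critical milestones, and a new project is flagged when its scaled Mahalanobis distance from that space is large. The milestones, data and threshold are illustrative assumptions.

```python
# Sketch: Mahalanobis-Taguchi-style flagging of an off-track project.
import numpy as np

rng = np.random.default_rng(1)
# Fraction of detected bugs fixed at three hypothetical critical milestones,
# for 30 historically successful ("normal") projects.
normal = rng.normal(loc=[0.35, 0.72, 0.97], scale=[0.05, 0.04, 0.02], size=(30, 3))

mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))
k = normal.shape[1]

def mahalanobis_d2(x: np.ndarray) -> float:
    """Squared Mahalanobis distance scaled by the number of variables (MTS convention)."""
    d = x - mean
    return float(d @ cov_inv @ d) / k

candidate = np.array([0.20, 0.55, 0.85])      # a project that is falling behind
d2 = mahalanobis_d2(candidate)
print(f"scaled MD^2 = {d2:.1f} -> {'off track' if d2 > 3.0 else 'on track'}")
```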

  7. Reproducibility of subjective appetite ratings and ad libitum test meal energy intake in overweight and obese males.

    Science.gov (United States)

    Horner, Katy M; Byrne, Nuala M; King, Neil A

    2014-10-01

    To determine whether changes in appetite and energy intake (EI) can be detected and play a role in the effectiveness of interventions, it is necessary to identify their variability under normal conditions. We assessed the reproducibility of subjective appetite ratings and ad libitum test meal EI after a standardised pre-load in overweight and obese males. Fifteen overweight and obese males (BMI 30.3 ± 4.9 kg/m², aged 34.9 ± 10.6 years) completed two identical test days, 7 days apart. Participants were provided with a standardised fixed breakfast (1676 kJ) and 5 h later an ad libitum pasta lunch. An electronic appetite rating system was used to assess subjective ratings before and after the fixed breakfast, and periodically during the postprandial period. EI was assessed at the ad libitum lunch meal. Sample size estimates for paired design studies were calculated. Appetite ratings demonstrated a consistent oscillating pattern between test days, and were more reproducible for mean postprandial than fasting ratings. The correlation between ad libitum EI on the two test days was r = 0.78. Reproducibility of subjective appetite ratings and ad libitum test meal EI in overweight and obese males is comparable to previous reports in normal weight adults. Sample size requirements for studies vary depending on the parameter of interest and sensitivity needed. Copyright © 2014 Elsevier Ltd. All rights reserved.
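
    By way of illustration of the sample-size estimates mentioned above, the sketch below shows a standard paired-design calculation; the within-subject SD and detectable difference are placeholders, not values from the study.

```python
# Sketch: sample size for a paired (test-retest) design using normal quantiles.
from math import ceil
from scipy.stats import norm

def paired_n(sd_diff: float, delta: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Subjects needed to detect a mean within-subject change `delta`, given the
    SD of day-to-day differences `sd_diff` (two-sided test)."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(((z_a + z_b) * sd_diff / delta) ** 2)

# e.g. detect a 400 kJ change in ad libitum lunch EI when the SD of
# test-retest differences is 800 kJ
print(paired_n(sd_diff=800, delta=400))   # -> 32 subjects
```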

  8. Load and resistance factor rating (LRFR) in New York State : volume II.

    Science.gov (United States)

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...

  9. Load and resistance factor rating (LRFR) in New York State : volume I.

    Science.gov (United States)

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...

  10. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (class 1 & 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a guideline to help direct the research for replacement technology. The approach to prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
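
    A toy sketch of the QFD-style numerical rating idea follows: each process is scored against weighted concerns and the weighted sum gives a replacement-urgency rating. The concern names, weights and scores are made up for illustration and are not taken from the workbook.

```python
# Sketch: weighted-concern (QFD-style) replacement-urgency rating.
concerns = {                              # concern -> importance weight
    "regulatory deadline": 9,
    "worker exposure": 7,
    "availability of substitutes": 5,
    "process criticality": 3,
}

processes = {                             # process -> score (1-5) against each concern
    "vapor degreasing (CFC-113)": {"regulatory deadline": 5, "worker exposure": 4,
                                   "availability of substitutes": 3, "process criticality": 4},
    "foam blowing (HCFC-141b)":   {"regulatory deadline": 4, "worker exposure": 2,
                                   "availability of substitutes": 2, "process criticality": 5},
}

def urgency(scores: dict) -> int:
    """Weighted sum of concern scores; higher means replace sooner."""
    return sum(concerns[c] * scores[c] for c in concerns)

for name, scores in sorted(processes.items(), key=lambda kv: -urgency(kv[1])):
    print(f"{urgency(scores):4d}  {name}")
```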

  11. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2015-10-15

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. In this study, a methodology is introduced that uses the AHP to weight organizational factors and the SLIM to rate those factors. Safety issues related to nuclear safety culture have been occurring increasingly often, so a quantification tool has to be developed in order to include the organizational factor in Probabilistic Safety Assessments. In this study, the state of the art in organizational evaluation methodologies has been surveyed. This study covers research on organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process and the Success Likelihood Index Methodology. The purpose of this study is to develop a methodology to incorporate the safety culture into PSA in order to obtain more objective risk estimates than before. The organizational factor considered in nuclear safety culture might affect the potential risk of human error and hardware failure. A safety culture impact index for monitoring the plant safety culture can be assessed by applying the developed methodology to a nuclear power plant.

  12. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    International Nuclear Information System (INIS)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung

    2015-01-01

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. In this study, a methodology is introduced that uses the AHP to weight organizational factors and the SLIM to rate those factors. Safety issues related to nuclear safety culture have been occurring increasingly often, so a quantification tool has to be developed in order to include the organizational factor in Probabilistic Safety Assessments. In this study, the state of the art in organizational evaluation methodologies has been surveyed. This study covers research on organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process and the Success Likelihood Index Methodology. The purpose of this study is to develop a methodology to incorporate the safety culture into PSA in order to obtain more objective risk estimates than before. The organizational factor considered in nuclear safety culture might affect the potential risk of human error and hardware failure. A safety culture impact index for monitoring the plant safety culture can be assessed by applying the developed methodology to a nuclear power plant
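
    A hedged sketch of the two quantification steps named in these records is given below: AHP-style weights from a pairwise-comparison matrix (principal eigenvector) combined with SLIM-style factor ratings into a single index. The factors, judgements and ratings are illustrative, not those of the study.

```python
# Sketch: AHP weights (principal eigenvector) combined with SLIM-style ratings.
import numpy as np

factors = ["management commitment", "training", "procedures", "communication"]

# Pairwise comparison matrix (Saaty 1-9 scale), A[i, j] = importance of i over j.
A = np.array([
    [1.0, 3.0, 5.0, 3.0],
    [1/3, 1.0, 3.0, 1.0],
    [1/5, 1/3, 1.0, 1/2],
    [1/3, 1.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                           # AHP weights from the principal eigenvector

ratings = np.array([0.8, 0.6, 0.4, 0.7])  # SLIM-style ratings of each factor, 0-1

safety_culture_index = float(w @ ratings)
for f, wi in zip(factors, w):
    print(f"{f:25s} weight = {wi:.3f}")
print(f"safety culture impact index = {safety_culture_index:.3f}")
```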

  13. Perception-oriented methodology for robust motion estimation design

    NARCIS (Netherlands)

    Heinrich, A.; Vleuten, van der R.J.; Haan, de G.

    2014-01-01

    Optimizing a motion estimator (ME) for picture rate conversion is challenging. This is because there are many types of MEs and, within each type, many parameters, which makes subjective assessment of all the alternatives impractical. To solve this problem, we propose an automatic design methodology

  14. Standard Test Method for Measuring Fast-Neutron Reaction Rates by Radioactivation of Titanium

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method covers procedures for measuring reaction rates by the activation reactions ⁴⁶Ti(n,p)⁴⁶Sc + ⁴⁷Ti(n,np)⁴⁶Sc. Note 1—Since the cross section for the (n,np) reaction is relatively small for energies less than 12 MeV and is not easily distinguished from that of the (n,p) reaction, this test method will refer to the (n,p) reaction only. 1.2 The reaction is useful for measuring neutrons with energies above approximately 4.4 MeV and for irradiation times up to about 250 days (for longer irradiations, see Practice E 261). 1.3 With suitable techniques, fission-neutron fluence rates above 10⁹ cm⁻²·s⁻¹ can be determined. However, in the presence of a high thermal-neutron fluence rate, ⁴⁶Sc depletion should be investigated. 1.4 Detailed procedures for other fast-neutron detectors are referenced in Practice E 261. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all...
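
    For orientation, the sketch below shows the standard activation relation that underlies this type of measurement, with a saturation factor for decay during irradiation and a decay factor for the wait before counting; it is not text from the standard, and the numerical inputs are placeholders.

```python
# Sketch: reaction rate per target atom from a measured 46Sc activity.
from math import exp, log

HALF_LIFE_SC46_S = 83.79 * 86400          # ~83.79 d half-life of 46Sc
LAMBDA = log(2) / HALF_LIFE_SC46_S

def reaction_rate(a_measured_bq: float, n_target_atoms: float,
                  t_irr_s: float, t_wait_s: float) -> float:
    """Reactions per target atom per second for a constant-rate irradiation."""
    saturation = 1.0 - exp(-LAMBDA * t_irr_s)    # 46Sc build-up during irradiation
    decay = exp(-LAMBDA * t_wait_s)              # decay between irradiation and counting
    return a_measured_bq / (n_target_atoms * saturation * decay)

# Example: 30-day irradiation, 2-day wait, 5e4 Bq of 46Sc in a foil with 1e21 46Ti atoms
print(f"{reaction_rate(5e4, 1e21, 30 * 86400, 2 * 86400):.3e} reactions/atom/s")
```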

  15. Brittle materials at high-loading rates: an open area of research

    Science.gov (United States)

    Forquin, Pascal

    2017-01-01

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. In all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (a few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining-pressure sensitivity that justify expending greater research efforts to understand these complex features. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'.

  16. Diuresis renography in children: methodological aspects

    International Nuclear Information System (INIS)

    Bonnin, F.; Le Stanc, E.; Busquet, G.; Saidi, L.; Lyonnet, F.

    1995-01-01

    In paediatrics, diuresis renography is used as a method to guide clinical management of hydronephrosis or hydro-uretero-nephrosis. Various pitfalls in the technique and other errors exist and may lead to a misinterpretation of the test. The methodology for performing and interpreting the diuresis renography is discussed. (authors). 12 refs., 4 figs

  17. Employee Turnover: An Empirical and Methodological Assessment.

    Science.gov (United States)

    Muchinsky, Paul M.; Tuttle, Mark L.

    1979-01-01

    Reviews research on the prediction of employee turnover. Groups predictor variables into five general categories: attitudinal (job satisfaction), biodata, work-related, personal, and test-score predictors. Consistent relationships between common predictor variables and turnover were found for four categories. Eight methodological problems/issues…

  18. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com; Raduan, Farhana, E-mail: farhanaraduan@gmail.com; Sagap, Ismail, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
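
    As a generic illustration of combining univariate score statistics into a single global statistic, the sketch below forms a chi-square statistic from standardized endpoint statistics and an assumed correlation matrix; the numbers are placeholders and this is not the exact estimator used in the paper.

```python
# Sketch: global chi-square statistic from correlated endpoint-specific statistics.
import numpy as np
from scipy.stats import chi2

# Standardized score statistics: 1st, 2nd, 3rd recurrence and death endpoints.
z = np.array([1.9, 1.4, 1.1, 2.1])

# Assumed correlation between the endpoint-specific statistics (endpoints
# measured on the same patients are positively correlated).
R = np.array([
    [1.0, 0.6, 0.5, 0.4],
    [0.6, 1.0, 0.6, 0.4],
    [0.5, 0.6, 1.0, 0.5],
    [0.4, 0.4, 0.5, 1.0],
])

Q = float(z @ np.linalg.solve(R, z))   # global statistic, z' R^-1 z
p = chi2.sf(Q, df=z.size)              # compared against chi-square with k d.f.
print(f"Q = {Q:.2f}, p = {p:.3f}")
```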

  19. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies

    Directory of Open Access Journals (Sweden)

    Thyago Pereira Magnus

    2015-04-01

    Conclusions: Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  20. Real exchange rate misalignments

    OpenAIRE

    Terra, Maria Cristina T.; Valladares, Frederico Estrella Carneiro

    2003-01-01

    This paper characterizes episodes of real appreciations and depreciations for a sample of 85 countries, approximately from 1960 to 1998. First, the equilibrium real exchange rate series are constructed for each country using the Goldfajn and Valdes (1999) methodology (cointegration with fundamentals). Then, departures from the equilibrium real exchange rate (misalignments) are obtained, and a Markov Switching Model is used to characterize the misalignment series as stochastic autor...